A curious and diligent software engineer, and the sole employee of Briar Patch Software, working as an R&D scientist applying machine learning techniques to mobile application development.
As part of the small SRE team at FocusVision, we were essential to improving and maintaining the organization's systems and productivity. We worked side by side with nearly every role in the company, focusing on the development, deployment, maintenance, security, automation, and documentation of our infrastructure, tooling, and operations. In pursuit of 99.9% availability and a self-healing, service-oriented architecture, we worked smart each day, learning perpetually and implementing boring solutions with clever tools to reduce complex problems.
As a full-stack engineer on an agile team, I used iterative and test-driven approaches to develop and maintain features in our production systems. I made substantial contributions to the design, implementation, and testing of our back- and front-end systems, APIs, and the deployment pipeline supporting Decipher's award-winning survey-building and data-collection platform.
We, a team of two, worked closely with the entire organization and its customers to develop the documentation everyone needed to use Decipher's software systems properly. During these few months, I gained a thorough understanding of how to distill business and engineering specifications into concise, actionable, and accessible online documentation. I authored the Decipher Programming Manual, a comprehensive guide to Decipher survey programming.
Developed a system we called FALT (Fresno Audiovisual Lexicon Tool) with Psychology Professor Lorin Lachs, Ph.D., on behalf of the National Science Foundation, to perform lexical analysis on the similarities between phonemic and visemic communication.
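To give a flavor of the phoneme/viseme comparison FALT was built around, here is a minimal sketch. The viseme grouping below is a toy, hypothetical mapping (not FALT's actual data): phonemes that look alike on the lips collapse into one viseme class, so words that are distinct in audio can be visually ambiguous.

```python
# Toy, hypothetical viseme classes; FALT's real lexicon is not shown here.
# Phonemes sharing a viseme are indistinguishable to a lip-reader.
VISEME_OF = {
    "p": "bilabial", "b": "bilabial", "m": "bilabial",
    "f": "labiodental", "v": "labiodental",
    "t": "alveolar", "d": "alveolar", "n": "alveolar",
    "k": "velar", "g": "velar",
}

def visually_confusable(word_a, word_b):
    """Two phoneme sequences are visually confusable when they map to the
    same viseme sequence, even though their phonemes differ."""
    va = [VISEME_OF.get(p, p) for p in word_a]
    vb = [VISEME_OF.get(p, p) for p in word_b]
    return va == vb

# "pat" and "bat" differ in audio but share a viseme sequence:
print(visually_confusable(["p", "a", "t"], ["b", "a", "t"]))  # True
print(visually_confusable(["p", "a", "t"], ["f", "a", "t"]))  # False
```

Comparing every word pair in a lexicon this way yields the kind of phonemic-versus-visemic similarity data the project analyzed.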
As a first responder for Fresno State's students and staff, I provided technical assistance over the phone and through support-ticket submissions.
As the founder (and sole employee) of Briar Patch Software, I'm in the early stages of applying machine learning techniques to mobile application development. Primarily, I aim to deliver great software experiences by solving problems in time series classification, computer vision, and optical character recognition.
Designed for iOS/Android mobile devices, Flynt uses the device's accelerometer and gyroscope to detect and train free-motion gestures, creating an intuitive interface for counting things. To use Flynt, the user places their thumb on the label of the object they wish to count and makes the gesture to update its count. Our early trials show that Flynt is significantly easier, quicker, and more efficient than traditional tally-counting approaches. By default, Flynt ships with a simple flick gesture modeled using a semi-supervised machine learning classification technique; the user can train new gestures using the simple 3-step process in Flynt's Training mode.
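The train-then-classify loop can be sketched in a few lines. This is only an illustration of the idea, not Flynt's actual model: all function names, the two-value feature vector, and the distance threshold are assumptions chosen for brevity.

```python
import math

def features(window):
    """Reduce a window of (x, y, z) accelerometer samples to a tiny
    feature vector: mean magnitude and peak magnitude.
    (Flynt's real feature set is not shown here.)"""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return (sum(mags) / len(mags), max(mags))

def train_centroid(example_windows):
    """Training-mode sketch: average the feature vectors of a few
    user-recorded examples of the gesture."""
    feats = [features(w) for w in example_windows]
    n = len(feats)
    return tuple(sum(f[i] for f in feats) / n for i in range(2))

def is_gesture(window, centroid, threshold=0.5):
    """Classify a live window by its distance to the trained centroid;
    the threshold is an arbitrary illustrative value."""
    return math.dist(features(window), centroid) < threshold

# A "flick" shows a sharp spike above resting gravity (~1 g):
flick_examples = [[(0.0, 0.0, 2.0)] * 4, [(0.0, 0.1, 2.1)] * 4]
centroid = train_centroid(flick_examples)
print(is_gesture([(0.0, 0.0, 2.0)] * 4, centroid))  # True
print(is_gesture([(0.0, 0.0, 1.0)] * 4, centroid))  # False (resting)
```

A nearest-centroid rule like this is deliberately boring: it trains from a handful of examples, runs in real time on-device, and leaves room for the semi-supervised refinement mentioned above.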
Designed for iOS/Android mobile devices, Booksee is a tool to find and catalog books on a bookshelf. Using the stream of images from the device's camera, Booksee automatically stitches the frames together and generates a list of every book it can see on the shelf by analyzing the details of the books' spines. Machine learning, computer vision, and optical character recognition techniques let Booksee relieve that slow, painful, neck-kinked approach to discovering books on a bookshelf. If enabled, Booksee can also help readers find the titles they're looking for, and even connect to Amazon's APIs to show the current sale price.
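One small piece of that pipeline, locating the boundaries between spines, can be sketched simply. This is a hypothetical illustration, not Booksee's implementation: it assumes a grayscale image and treats unusually dark vertical columns as the shadowed gaps between books.

```python
def column_brightness(image):
    """image: list of rows, each a list of grayscale pixels (0-255).
    Returns the mean brightness of each pixel column."""
    height, width = len(image), len(image[0])
    return [sum(image[y][x] for y in range(height)) / height
            for x in range(width)]

def find_seams(image, dark_threshold=60):
    """Columns darker than the threshold are candidate gaps between
    spines; runs of consecutive dark columns merge into one seam."""
    seams, in_seam = [], False
    for x, brightness in enumerate(column_brightness(image)):
        if brightness < dark_threshold and not in_seam:
            seams.append(x)
            in_seam = True
        elif brightness >= dark_threshold:
            in_seam = False
    return seams

# A bright 9-column shelf strip with dark gaps at columns 3 and 6:
shelf = [[200] * 9 for _ in range(4)]
for row in shelf:
    row[3] = row[6] = 10
print(find_seams(shelf))  # [3, 6]
```

Once seams are found, each slice between two seams is a single spine, ready to be rotated and handed to an OCR step to read its title.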