Applied Apple Vision algorithms to track an object from live capture and keep it centered in the frame by articulating the Movi Cinema Robot.
This project started out as a timed code challenge for Freefly Systems. After realizing the value of the feature, I decided to evolve the core idea into a standalone utility app for the Movi Cinema Robot.
Design Considerations
Building this app forced me to dive deep into Apple's Vision framework. My goal was high-performance, real-time object tracking that paired natively with the Movi Cinema Robot.
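As a rough illustration of that pairing, the sketch below uses Vision's VNTrackObjectRequest to follow a user-selected region from frame to frame and convert the object's offset from frame center into gimbal rate commands. This is a minimal sketch under stated assumptions, not the app's actual code: the sendGimbalRates(pan:tilt:) call is a hypothetical stand-in for the Movi control interface, and the proportional gains are illustrative.

```swift
import Vision
import CoreVideo

/// Minimal sketch: Vision object tracking feeding a gimbal rate loop.
/// Assumes a CVPixelBuffer stream from an AVCaptureSession delegate.
final class ObjectTracker {
    private let sequenceHandler = VNSequenceRequestHandler()
    private var trackingRequest: VNTrackObjectRequest?

    /// Begin tracking a user-selected region (normalized, lower-left origin).
    func startTracking(region: CGRect) {
        let observation = VNDetectedObjectObservation(boundingBox: region)
        let request = VNTrackObjectRequest(detectedObjectObservation: observation)
        request.trackingLevel = .accurate
        trackingRequest = request
    }

    /// Feed each captured frame; steer the gimbal toward centering the object.
    func process(pixelBuffer: CVPixelBuffer) {
        guard let request = trackingRequest else { return }
        try? sequenceHandler.perform([request], on: pixelBuffer)
        guard let result = request.results?.first as? VNDetectedObjectObservation else { return }

        // Offset of the tracked object's center from frame center, in [-0.5, 0.5].
        let errorX = result.boundingBox.midX - 0.5
        let errorY = result.boundingBox.midY - 0.5

        // Simple proportional control; the gain of 1.5 is an illustrative assumption.
        // Vision's y-axis points up, so tilt is negated.
        sendGimbalRates(pan: Float(errorX) * 1.5, tilt: Float(-errorY) * 1.5)

        // Seed the next frame's tracking with the latest observation.
        request.inputObservation = result
    }

    private func sendGimbalRates(pan: Float, tilt: Float) {
        // Hypothetical: forward rate commands to the Movi over its control link.
    }
}
```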
Overall, this project stemmed from my fascination with limitless camera control, specifically autonomous camera control. A few years ago, I built a five-axis motion control system [pan, tilt, zoom, focus, track] that could be programmed with a series of movements and execute them accurately to timecode, essentially enabling professional-grade solo recording without a camera operator. While that rig was certainly overkill, my intention here was similar: I wanted to build something that could fill in for a human camera operator when necessary. To achieve this, I implemented voice recognition and a feature set that facilitates professional remote control with a prosumer/consumer product.
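For the voice recognition piece, a minimal sketch using Apple's Speech framework might look like the following. The command vocabulary and callback shape are illustrative assumptions, not the app's actual implementation.

```swift
import Speech
import AVFoundation

/// Minimal sketch: streaming on-device speech recognition mapped to commands.
/// Assumes SFSpeechRecognizer.requestAuthorization and microphone permission
/// have already been granted.
final class VoiceCommandListener {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(onCommand: @escaping (String) -> Void) throws {
        // Stream microphone audio into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString.lowercased() else { return }
            // Match against a small command vocabulary (hypothetical examples).
            for command in ["start tracking", "stop tracking", "recenter"] where text.contains(command) {
                onCommand(command)
            }
        }
    }
}
```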