Abstract: A method for tracking a cinematography target that has been associated with an emitter can comprise receiving an indication to track a particular identifier. The particular identifier can be associated with an object of interest. The method can further comprise identifying, using at least one tracker component, a direction associated with the particular identifier. The method can also include calculating a motor actuation sequence necessary to actuate a control component to track the object of interest with an audiovisual device. The method can further comprise actuating at least one motor to track the object of interest.
Type:
Grant
Filed:
September 30, 2014
Date of Patent:
July 4, 2017
Assignee:
Jigabot, LLC.
Inventors:
Richard F. Stout, Kyle K. Johnson, Donna M. Root, Kevin J. Shelley
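For illustration only, the following minimal sketch (not taken from the patent) shows how an identified direction toward a tracked emitter might be turned into a motor actuation sequence for a pan/tilt rig carrying an audiovisual device; the class names and the steps-per-degree constant are assumptions.

from dataclasses import dataclass

STEPS_PER_DEGREE = 10  # assumed stepper motor resolution

@dataclass
class Bearing:
    azimuth_deg: float    # horizontal direction toward the emitter
    elevation_deg: float  # vertical direction toward the emitter

class PanTiltRig:
    """Holds the current pointing of the audiovisual device."""
    def __init__(self):
        self.pan_deg = 0.0
        self.tilt_deg = 0.0

    def actuation_sequence(self, target: Bearing):
        """Compute the step counts needed to point at the target bearing."""
        pan_steps = round((target.azimuth_deg - self.pan_deg) * STEPS_PER_DEGREE)
        tilt_steps = round((target.elevation_deg - self.tilt_deg) * STEPS_PER_DEGREE)
        return pan_steps, tilt_steps

    def actuate(self, pan_steps: int, tilt_steps: int):
        """Apply the sequence (a real device would drive its motors here)."""
        self.pan_deg += pan_steps / STEPS_PER_DEGREE
        self.tilt_deg += tilt_steps / STEPS_PER_DEGREE

def track(rig: PanTiltRig, detected: Bearing):
    # 1) a direction associated with the tracked identifier has been sensed,
    # 2) calculate the motor actuation sequence, 3) actuate the motors.
    pan_steps, tilt_steps = rig.actuation_sequence(detected)
    rig.actuate(pan_steps, tilt_steps)

if __name__ == "__main__":
    rig = PanTiltRig()
    track(rig, Bearing(azimuth_deg=30.0, elevation_deg=-5.0))
    print(rig.pan_deg, rig.tilt_deg)  # -> 30.0 -5.0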
Abstract: In a video recording environment, a compact, rugged, intelligent tracking apparatus and method enables the automation of labor-intensive operation of cameras, lights, microphones, and other devices. Auto-framing of a tracked object within the viewfinder of a supported camera is possible. The device can sense more than one object at once and includes multiple easy ways to switch from one object to another. The methods show how the auto-framing device can be “predictive” of movements and can intelligently smooth the tilt and swivel motions so that the end effect is a professional-looking picture or video. It is designed to be uniquely small yet rugged and waterproof. It can also accept configuration input from users via a smartphone or extreme-sports camera over Wi-Fi or Bluetooth, including user-programmable scripts that automate the device's functionality in easy-to-use ways.
Type:
Grant
Filed:
October 3, 2013
Date of Patent:
July 4, 2017
Assignee:
Jigabot, LLC.
Inventors:
Richard F. Stout, Kyle K. Johnson, Cameron Engh, Kevin J. Shelley
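For illustration only, the following sketch (not the patent's method) shows one way “predictive” tracking with smoothed tilt and swivel motion could be approximated, using a simple alpha-beta filter per axis; the gains, frame rate, and smoothing factor are assumptions.

class AlphaBetaAxis:
    """Tracks one axis (pan or tilt) with simple prediction and smoothing."""
    def __init__(self, alpha=0.5, beta=0.1, dt=1 / 30):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.angle = 0.0      # estimated target angle in degrees
        self.velocity = 0.0   # estimated angular velocity in deg/s
        self.command = 0.0    # smoothed angle actually sent to the motor

    def update(self, measured_angle: float, smoothing: float = 0.2) -> float:
        # Predict where the target will be at the next frame.
        predicted = self.angle + self.velocity * self.dt
        residual = measured_angle - predicted
        # Correct the prediction with the new measurement.
        self.angle = predicted + self.alpha * residual
        self.velocity += self.beta * residual / self.dt
        # Smooth the motor command so the resulting motion looks professional.
        self.command += smoothing * (self.angle - self.command)
        return self.command

if __name__ == "__main__":
    pan = AlphaBetaAxis()
    for measurement in [0.0, 1.0, 2.1, 3.0, 4.2]:  # noisy target azimuths
        print(round(pan.update(measurement), 2))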
Abstract: In a computerized film (digital camera action) production system, a virtual director module takes responsibility for creating sets and enabling cinematography by creating a synthetic parallax matching of a “set” image to what would have existed before a camera in real motion about the physical location represented. This overcomes failures typical of “green screen” technology in dealing with such conditions as shallow depth of field and low light. Computer-generated, synthetic sets from a 3D animator may be projected behind real actors during or after filming. Synthetic sets are seen through a virtual camera as photo-realistic because they are skinned with textures, and structured geometrically, to match the reality represented by actual still photography of a physical environment, such as buildings, walls, streets, landscapes, props, and the like.
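For illustration only, the sketch below (not the patent's method) shows why depth-structured synthetic set geometry produces the parallax that a flat green-screen plate cannot: points at different depths shift by different amounts when a virtual camera follows the real camera's translation. The focal length and coordinates are assumptions.

FOCAL_LENGTH_PX = 1000.0  # assumed focal length in pixels

def project(point_world, camera_position):
    """Pinhole projection of a 3D set point into the virtual camera's image."""
    x = point_world[0] - camera_position[0]
    y = point_world[1] - camera_position[1]
    z = point_world[2] - camera_position[2]
    return (FOCAL_LENGTH_PX * x / z, FOCAL_LENGTH_PX * y / z)

if __name__ == "__main__":
    near_wall = (1.0, 0.0, 5.0)      # synthetic set point 5 m from the camera
    far_building = (1.0, 0.0, 50.0)  # synthetic set point 50 m from the camera
    for cam_x in (0.0, 0.5):  # the real camera dollies 0.5 m to the right
        print(project(near_wall, (cam_x, 0.0, 0.0)),
              project(far_building, (cam_x, 0.0, 0.0)))
    # The near point shifts ~100 px while the far one shifts ~10 px:
    # depth-dependent parallax that a single flat plate cannot reproduce.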