Patents Assigned to CAST Group of Companies Inc.
-
Patent number: 9350923
Abstract: Systems and methods are provided for tracking at least position and angular orientation. The system comprises a computing device in communication with at least two cameras, wherein each of the cameras is able to capture images of one or more light sources attached to an object. A receiver is in communication with the computing device, wherein the receiver is able to receive at least angular orientation data associated with the object. The computing device determines the object's position by comparing images of the light sources and generates an output comprising the position and angular orientation of the object.
Type: Grant
Filed: July 21, 2014
Date of Patent: May 24, 2016
Assignee: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
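The position-determination step described above can be illustrated with a small, hypothetical sketch: each camera observation of a light source is modeled as a 3D ray (camera position plus viewing direction), the object's position is estimated as the midpoint of closest approach between the two rays, and the separately received angular orientation is merged into the output. All names and the ray-intersection formulation are assumptions for illustration, not taken from the patent.

```python
def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach between rays p1 + t*d1 and p2 + t*d2."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only when the two rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = [p + t1 * u for p, u in zip(p1, d1)]  # closest point on ray 1
    q2 = [p + t2 * u for p, u in zip(p2, d2)]  # closest point on ray 2
    return [(u + v) / 2.0 for u, v in zip(q1, q2)]

def track(p1, d1, p2, d2, orientation_deg):
    """Combine the triangulated position with received angular orientation."""
    return {"position": triangulate(p1, d1, p2, d2),
            "orientation": orientation_deg}

# Two cameras at (0,0,0) and (10,0,0), each with a ray toward the light source.
out = track([0, 0, 0], [1, 2, 3], [10, 0, 0], [-9, 2, 3],
            {"yaw": 45.0, "pitch": 0.0, "roll": 0.0})
```

With exact (noise-free) rays the midpoint recovers the true source position; with real pixel noise the two rays miss each other slightly and the midpoint is the least-squares compromise.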
-
Patent number: 9317959
Abstract: A system and a method are provided for visualizing virtual objects on a mobile device. A computing device is in communication with the mobile device. The computing device generates a 3D virtual world of one or more virtual objects corresponding to one or more physical objects in a real world. The computing device then associates information with the one or more virtual objects and generates one or more static images based on the 3D virtual world. The mobile device receives the one or more static images and the associated information from the computing device, and then displays the one or more static images.
Type: Grant
Filed: February 28, 2014
Date of Patent: April 19, 2016
Assignee: CAST Group of Companies Inc.
Inventor: Gilray Densham
-
Patent number: 9055226
Abstract: Systems and methods are provided for using tracking data to control the functions of an automated fixture. Examples of automated fixtures include light fixtures and camera fixtures. A method includes obtaining a first position of a tracking unit. The tracking unit includes an inertial measurement unit and a visual indicator configured to be tracked by a camera. A first distance is computed between the automated fixture and the first position, and is used to set a function of the automated fixture to a first setting. A second position of the tracking unit is obtained. A second distance between the automated fixture and the second position is computed, and the second distance is used to set the function of the automated fixture to a second setting.
Type: Grant
Filed: January 7, 2013
Date of Patent: June 9, 2015
Assignee: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel, Kwok Wai William Law, Weibo Qin, Florentin Christoph von Frankenberg
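The distance-driven control loop above can be sketched as follows. The choice of fixture function (a 0-255 focus value mapped linearly from distance) and all names are illustrative assumptions; the patent only requires that some function of the fixture be set from the computed distance.

```python
import math

def distance(fixture_pos, tracked_pos):
    """Euclidean distance from the fixture to the tracking unit."""
    return math.dist(fixture_pos, tracked_pos)

def focus_setting(dist_m, near=1.0, far=20.0):
    """Map a distance in metres to a 0-255 focus value (linear, clamped)."""
    frac = (dist_m - near) / (far - near)
    return round(255 * min(max(frac, 0.0), 1.0))

fixture = (0.0, 0.0, 6.0)                    # fixture hung 6 m above the origin
first = distance(fixture, (0.0, 0.0, 0.0))   # tracking unit directly below
second = distance(fixture, (8.0, 0.0, 0.0))  # tracking unit moves 8 m away
settings = (focus_setting(first), focus_setting(second))
```

Each new tracked position yields a new distance and therefore a new setting, which is how the fixture follows a moving performer without manual refocusing.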
-
Publication number: 20150120080
Abstract: A system and method are provided for obtaining a 3D cue path and timing. In one example aspect, this path and timing may be manipulated in software. In another example aspect, one or more conditions may be specified which pertain to the path, timing, state of the path's environment, or state of one or more objects or actors in the path's environment. In another example aspect, these conditions may be accompanied by specifications for one or more actions to be taken if one or more of the conditions are or are not satisfied. In another example aspect, a person or object may be monitored as they follow the path, and prescribed actions may be taken if the specified conditions are or are not found to be satisfied.
Type: Application
Filed: April 24, 2012
Publication date: April 30, 2015
Applicant: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
-
Patent number: 8938431
Abstract: A configurable real-time environment tracking and command module (RTM) is provided to coordinate one or more devices or objects in a physical environment. A virtual environment is created to correlate with various objects and attributes within the physical environment. The RTM is able to receive data about attributes of physical objects and accordingly update the attributes of correlated virtual objects in the virtual environment. The RTM is also able to provide data extracted from the virtual environment to one or more devices, such as robotic cameras, in real-time. An interface to the RTM allows multiple devices to interact with the RTM, thereby coordinating the devices.
Type: Grant
Filed: January 6, 2014
Date of Patent: January 20, 2015
Assignee: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
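A minimal, hypothetical sketch of the RTM concept: a registry of virtual objects mirrors the physical ones, incoming attribute updates are applied to the virtual copies, and devices such as robotic cameras read the virtual state through a single interface. The class and method names are illustrative, not from the patent.

```python
class RTM:
    """Toy real-time environment tracking and command module."""

    def __init__(self):
        self.virtual = {}  # object id -> attribute dict (the virtual world)

    def update(self, obj_id, **attrs):
        """Apply tracked physical attributes to the correlated virtual object."""
        self.virtual.setdefault(obj_id, {}).update(attrs)

    def query(self, obj_id):
        """Interface through which devices read the virtual environment."""
        return dict(self.virtual.get(obj_id, {}))

rtm = RTM()
rtm.update("performer_1", x=2.0, y=3.0, heading=90.0)
rtm.update("performer_1", x=2.5)      # only the changed attribute is resent
state = rtm.query("performer_1")      # e.g. a robotic camera reads the state
```

Because every device talks to the same virtual environment rather than to each other, adding a new device only requires it to speak the RTM interface.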
-
Publication number: 20140320667
Abstract: Systems and methods are provided for tracking at least position and angular orientation. The system comprises a computing device in communication with at least two cameras, wherein each of the cameras is able to capture images of one or more light sources attached to an object. A receiver is in communication with the computing device, wherein the receiver is able to receive at least angular orientation data associated with the object. The computing device determines the object's position by comparing images of the light sources and generates an output comprising the position and angular orientation of the object.
Type: Application
Filed: July 21, 2014
Publication date: October 30, 2014
Applicant: CAST GROUP OF COMPANIES INC.
Inventors: Gilray Densham, Justin Eichel
-
Patent number: 8854594
Abstract: Systems and methods are provided for tracking at least position and angular orientation. The system comprises a computing device in communication with at least two cameras, wherein each of the cameras is able to capture images of one or more light sources attached to an object. A receiver is in communication with the computing device, wherein the receiver is able to receive at least angular orientation data associated with the object. The computing device determines the object's position by comparing images of the light sources and generates an output comprising the position and angular orientation of the object.
Type: Grant
Filed: August 31, 2010
Date of Patent: October 7, 2014
Assignee: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
-
Publication number: 20140176537
Abstract: A system and a method are provided for visualizing virtual objects on a mobile device. A computing device is in communication with the mobile device. The computing device generates a 3D virtual world of one or more virtual objects corresponding to one or more physical objects in a real world. The computing device then associates information with the one or more virtual objects and generates one or more static images based on the 3D virtual world. The mobile device receives the one or more static images and the associated information from the computing device, and then displays the one or more static images.
Type: Application
Filed: February 28, 2014
Publication date: June 26, 2014
Applicant: CAST Group of Companies Inc.
Inventor: Gilray Densham
-
Publication number: 20140119606
Abstract: A configurable real-time environment tracking and command module (RTM) is provided to coordinate one or more devices or objects in a physical environment. A virtual environment is created to correlate with various objects and attributes within the physical environment. The RTM is able to receive data about attributes of physical objects and accordingly update the attributes of correlated virtual objects in the virtual environment. The RTM is also able to provide data extracted from the virtual environment to one or more devices, such as robotic cameras, in real-time. An interface to the RTM allows multiple devices to interact with the RTM, thereby coordinating the devices.
Type: Application
Filed: January 6, 2014
Publication date: May 1, 2014
Applicant: CAST Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
-
Patent number: 8683387
Abstract: A system and a method are provided for visualizing virtual objects on a mobile device. A computing device is in communication with the mobile device. The computing device generates a 3D virtual world of one or more virtual objects corresponding to one or more physical objects in a real world. The computing device then associates information with the one or more virtual objects and generates one or more static images based on the 3D virtual world. The mobile device receives the one or more static images and the associated information from the computing device, and then displays the one or more static images.
Type: Grant
Filed: March 3, 2011
Date of Patent: March 25, 2014
Assignee: CAST Group of Companies Inc.
Inventor: Gilray Densham
-
Patent number: 8639666
Abstract: A configurable real-time environment tracking and command module (RTM) is provided to coordinate one or more devices or objects in a physical environment. A virtual environment is created to correlate with various objects and attributes within the physical environment. The RTM is able to receive data about attributes of physical objects and accordingly update the attributes of correlated virtual objects in the virtual environment. The RTM is also able to provide data extracted from the virtual environment to one or more devices, such as robotic cameras, in real-time. An interface to the RTM allows multiple devices to interact with the RTM, thereby coordinating the devices.
Type: Grant
Filed: April 9, 2009
Date of Patent: January 28, 2014
Assignee: Cast Group of Companies Inc.
Inventors: Gilray Densham, Justin Eichel
-
Publication number: 20130307934
Abstract: Systems and methods are provided for associating position information and sound. The method includes obtaining position information of an object at a given time; obtaining position information of a camera at the given time; determining a relative position of the object relative to the camera's position; and associating sound information with the relative position of the object. In another aspect, the position and orientation of a microphone are also tracked to calibrate the sound produced by an object or person, and the calibrated sound is associated with the object's position relative to the camera.
Type: Application
Filed: January 31, 2012
Publication date: November 21, 2013
Applicant: CAST GROUP OF COMPANIES INC.
Inventors: Gilray Densham, Justin Eichel
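The association step described above can be sketched as follows: given object and camera positions at the same instant, compute the object's position relative to the camera and attach sound information to it. The particular sound attribute shown (a -1..+1 stereo pan derived from the horizontal angle) and all names are assumptions for illustration; the publication does not specify the sound representation.

```python
import math

def relative_position(obj_pos, cam_pos):
    """Object position expressed in the camera's frame (translation only)."""
    return tuple(o - c for o, c in zip(obj_pos, cam_pos))

def pan_from_relative(rel):
    """Map the horizontal angle off the camera axis to a -1 (left)..+1 (right) pan."""
    azimuth = math.atan2(rel[0], rel[1])  # x is across the frame, y toward the scene
    return max(-1.0, min(1.0, azimuth / (math.pi / 2)))

# Object at (3, 3, 0), camera at the origin, both sampled at the same time.
rel = relative_position((3.0, 3.0, 0.0), (0.0, 0.0, 0.0))
record = {"relative_position": rel, "pan": pan_from_relative(rel)}
```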
-
Publication number: 20130135303
Abstract: Systems and methods are provided to allow a user to visualize a 3D model of a venue and to customize the 3D model of the venue according to their own needs. A data abstraction of the 3D venue model is created and sent to the venue operator. This data abstraction can be used to reconstruct the 3D venue model in 3D virtual environment software. The customized 3D venue model is generated by: displaying on a web browser a 3D venue model; displaying one or more virtual objects available in an objects library; customizing the 3D venue model by receiving an input to place a selected virtual object in the 3D venue model; receiving an input to save the customized 3D venue model; and generating a text file comprising a name of the 3D venue model and data describing one or more characteristics of the selected virtual object.
Type: Application
Filed: November 28, 2012
Publication date: May 30, 2013
Applicant: CAST Group of Companies Inc.
Inventor: CAST Group of Companies Inc.
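The final save step above can be sketched as a small serializer: the customized venue model is reduced to a text abstraction naming the venue model and describing each placed virtual object. The line-based format shown is purely an assumption; the publication only specifies "a text file" containing the model name and object characteristics.

```python
def venue_abstraction(model_name, placed_objects):
    """Serialize a customized venue model to a small text abstraction."""
    lines = [f"venue_model: {model_name}"]
    for obj in placed_objects:
        attrs = ", ".join(f"{k}={v}" for k, v in sorted(obj.items()))
        lines.append(f"object: {attrs}")
    return "\n".join(lines)

# Two hypothetical virtual objects placed into a hypothetical "Main Hall" model.
text = venue_abstraction(
    "Main Hall",
    [{"type": "stage", "x": 0, "y": 5, "width": 12},
     {"type": "truss", "x": 0, "y": 5, "height": 8}],
)
```

Because the abstraction is plain text rather than full geometry, it stays small enough to email to the venue operator, whose 3D software rebuilds the scene from it.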
-
Publication number: 20130128054
Abstract: Systems and methods are provided for using tracking data to control the functions of an automated fixture. Examples of automated fixtures include light fixtures and camera fixtures. A method includes obtaining a first position of a tracking unit. The tracking unit includes an inertial measurement unit and a visual indicator configured to be tracked by a camera. A first distance is computed between the automated fixture and the first position, and is used to set a function of the automated fixture to a first setting. A second position of the tracking unit is obtained. A second distance between the automated fixture and the second position is computed, and the second distance is used to set the function of the automated fixture to a second setting.
Type: Application
Filed: January 7, 2013
Publication date: May 23, 2013
Applicant: CAST Group of Companies Inc.
Inventor: CAST Group of Companies Inc.