Patents Assigned to GestureTek, Inc.
-
Publication number: 20080208517
Abstract: Enhanced single-sensor position detection, in which a position of an object is determined. In some implementations, a first signal is emitted from a first emitter, and a second signal is emitted from a second emitter. A plane is monitored using a sensor, and the first signal and the second signal are received at the sensor after each of the first signal and the second signal reflects off of the object. A response signal is generated based on the first and second signals, and the response signal is processed to determine the position of the object in the plane.
Type: Application
Filed: February 22, 2008
Publication date: August 28, 2008
Applicant: GESTURETEK, INC.
Inventor: Atid Shamaie
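The geometry behind this abstract can be sketched as follows: each measured path length (emitter → object → sensor) constrains the object to an ellipse whose foci are that emitter and the sensor, and the object lies where the two ellipses intersect. The brute-force grid search below is purely illustrative, not the patented processing; the emitter/sensor coordinates, measured path lengths, and search bounds are all assumptions.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def locate(emitter1, emitter2, sensor, r1, r2, bounds, n=200):
    """Estimate the object's (x, y) position in the monitored plane.

    r1 and r2 are the measured path lengths emitter -> object -> sensor.
    Each measurement puts the object on an ellipse with foci at the
    emitter and the sensor; we grid-search for the point that best
    satisfies both constraints.
    """
    xmin, xmax, ymin, ymax = bounds
    best, best_err = None, float("inf")
    for i in range(n + 1):
        x = xmin + (xmax - xmin) * i / n
        for j in range(n + 1):
            y = ymin + (ymax - ymin) * j / n
            p = (x, y)
            err = (abs(_dist(emitter1, p) + _dist(p, sensor) - r1) +
                   abs(_dist(emitter2, p) + _dist(p, sensor) - r2))
            if err < best_err:
                best, best_err = p, err
    return best
```

For example, with emitters at (0, 0) and (2, 0), the sensor at (1, 0), and an object at (1, 1), both path lengths equal √2 + 1, and the search recovers a point close to (1, 1).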
-
Publication number: 20080199071
Abstract: According to a general aspect, processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
Type: Application
Filed: April 23, 2008
Publication date: August 21, 2008
Applicant: GESTURETEK, INC.
Inventor: Jin Gu
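Once a pixel correspondence between the two camera views is established, depth follows from standard rectified-stereo triangulation. This is a textbook sketch of that step, not necessarily the reconstruction in the filing; the focal length and baseline values are illustrative.

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Recover depth for one matched pixel pair from a rectified stereo rig.

    disparity = x_left - x_right (in pixels); depth Z = f * B / disparity,
    where f is the focal length in pixels and B the camera baseline.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("a matched point must have positive disparity")
    return focal_px * baseline_m / disparity
```

A correspondence with a 40-pixel disparity, an 800-pixel focal length, and a 0.1 m baseline yields a depth of 2 metres; repeating this over all matched pixels gives the per-pixel depths behind the three-dimensional construction.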
-
Publication number: 20080187178
Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
Type: Application
Filed: April 7, 2008
Publication date: August 7, 2008
Applicant: GestureTek, Inc.
Inventor: Atid Shamaie
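The match test the abstract describes can be illustrated with the simplest case of such a model: a linear subspace (mean plus orthonormal basis) fitted to the training points, with the image point's distance taken as the norm of its component outside the subspace. The model form and threshold are assumptions for illustration.

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def distance_to_model(point, mean, basis):
    """Distance from an image point to a linear model in a
    multi-dimensional space (e.g. an eigenspace).

    mean is the model's centroid; basis is a list of orthonormal vectors
    spanning the model. The distance is the norm of the part of
    (point - mean) that the model cannot represent.
    """
    centered = [p - m for p, m in zip(point, mean)]
    residual = list(centered)
    for b in basis:
        c = _dot(centered, b)
        residual = [r - c * bi for r, bi in zip(residual, b)]
    return math.sqrt(_dot(residual, residual))

def matches(point, mean, basis, threshold):
    # The object matches the training set if its image point lies
    # close enough to the model.
    return distance_to_model(point, mean, basis) <= threshold
```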
-
Publication number: 20080166022
Abstract: The detection of motion of a user via a camera and the generation of a dynamic virtual representation of a user on a display, where the user's detected motion causes the dynamic virtual representation to interact with virtual objects on the display. The magnitude and direction of the user's detected motion are calculated to determine the magnitude and direction of a force applied by the dynamic virtual representation to the virtual object. Further arrangements include water or smoke fluid simulations, in order to enhance the user experience.
Type: Application
Filed: December 27, 2007
Publication date: July 10, 2008
Applicant: GESTURETEK, INC.
Inventor: Evan Hildreth
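A minimal sketch of that motion-to-force mapping: differencing the tracked position across frames gives a velocity whose magnitude and direction are scaled into the force applied to the virtual object. The frame-differencing and the proportional gain constant are assumptions, not details from the filing.

```python
def applied_force(prev_pos, curr_pos, dt, gain=1.0):
    """Force the user's virtual representation applies to a virtual object,
    taken proportional to the detected motion.

    prev_pos and curr_pos are (x, y) tracked positions in consecutive
    frames; dt is the frame interval in seconds. Returns (fx, fy).
    """
    vx = (curr_pos[0] - prev_pos[0]) / dt
    vy = (curr_pos[1] - prev_pos[1]) / dt
    # Magnitude and direction of the force follow those of the velocity.
    return (gain * vx, gain * vy)
```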
-
Patent number: 7389591
Abstract: The selection and output of a signal, such as an alphanumeric character, is provided depending upon the orientation of a device, such as a mobile telephone. In particular, a neutral position of a device is determined in relation to at least a first axis, the device including at least a first control associated with a first plurality of output signals, and an angular displacement of the device is measured about at least the first axis. A selection of the first control is also received, and one of the first plurality of output signals is output based at least upon the selection and the angular displacement.
Type: Grant
Filed: May 17, 2006
Date of Patent: June 24, 2008
Assignee: GestureTek, Inc.
Inventors: Riten Jaiswal, Francis MacDougall
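The selection logic can be sketched by dividing the tilt range about one axis into equal bands, one per output signal of the pressed control: for instance, tilting a phone while pressing the "2" key could pick among "a", "b", and "c". The ±45° usable range and equal-band partition are assumptions for illustration.

```python
def select_output(neutral_deg, current_deg, outputs):
    """Choose one of a control's output signals from angular displacement.

    The displacement from the neutral position (assumed usable over
    -45..+45 degrees about the first axis) is divided into equal bands,
    one band per output signal.
    """
    displacement = current_deg - neutral_deg
    # Clamp to the assumed usable tilt range.
    displacement = max(-45.0, min(45.0, displacement))
    band = (displacement + 45.0) / 90.0          # normalise to [0, 1]
    index = min(int(band * len(outputs)), len(outputs) - 1)
    return outputs[index]
```

With outputs `["a", "b", "c"]`, a level device selects the middle letter, tilting back selects "a", and tilting forward selects "c".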
-
Patent number: 7379563
Abstract: Hands may be tracked before, during, and after occlusion, and a gesture may be recognized. Movement of two occluded hands may be tracked as a unit during an occlusion period. A type of synchronization characterizing the two occluded hands during the occlusion period may be determined based on the tracked movement of the occluded hands. Based on the determined type of synchronization, it may be determined whether directions of travel for each of the two occluded hands change during the occlusion period. Implementations may determine that a first hand and a second hand are occluded during an occlusion period, the first hand having come from a first direction and the second hand having come from a second direction. The first hand may be distinguished from the second hand after the occlusion period based on a determined type of synchronization characterizing the two hands, and a behavior of the two hands.
Type: Grant
Filed: April 15, 2005
Date of Patent: May 27, 2008
Assignee: GestureTek, Inc.
Inventor: Atid Shamaie
-
Patent number: 7379566
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Grant
Filed: January 6, 2006
Date of Patent: May 27, 2008
Assignee: GestureTek, Inc.
Inventor: Evan Hildreth
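The core idea reduces to a simple observation: stationary scene features appear to move opposite to the camera, so the mean optical-flow vector of tracked stationary features gives (up to sign) the camera's image-plane motion, which can then drive a user input such as scrolling. This averaging sketch assumes features have already been matched between frames; it is not the patented analysis itself.

```python
def camera_motion(features_prev, features_curr):
    """Estimate mobile-camera motion from stationary scene features.

    Each argument is a list of (x, y) image positions of the same
    features in two consecutive frames. Returns the camera's apparent
    image-plane motion, the negative of the mean optical flow.
    """
    n = len(features_prev)
    flow_x = sum(c[0] - p[0] for p, c in zip(features_prev, features_curr)) / n
    flow_y = sum(c[1] - p[1] for p, c in zip(features_prev, features_curr)) / n
    # Stationary features drift opposite to the camera's own motion.
    return (-flow_x, -flow_y)
```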
-
Publication number: 20080056536
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
Type: Application
Filed: October 31, 2007
Publication date: March 6, 2008
Applicant: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
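The per-camera steps (background data set, difference map, relative position) can be sketched with plain background subtraction: pixels differing from the background by more than a threshold form the difference map, and the object's relative position is taken as their centroid. The threshold value and the centroid choice are assumptions; combining the per-camera relative positions into an absolute position would follow as a separate triangulation step.

```python
def object_position(background, frame, threshold=30):
    """Find an object's relative position in one camera's view.

    background and frame are 2-D lists of grayscale pixel values.
    Pixels whose difference from the background exceeds the threshold
    form the difference map; the relative position is the centroid of
    those pixels, or None if no object is detected.
    """
    xs, ys = [], []
    for y, (brow, frow) in enumerate(zip(background, frame)):
        for x, (b, f) in enumerate(zip(brow, frow)):
            if abs(f - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```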
-
Publication number: 20080018595
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Application
Filed: August 17, 2007
Publication date: January 24, 2008
Applicant: GESTURETEK, INC.
Inventors: Evan Hildreth, Francis MacDougall
-
Patent number: 7227526
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Grant
Filed: July 23, 2001
Date of Patent: June 5, 2007
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20060281453
Abstract: The selection and output of a signal, such as an alphanumeric character, is provided depending upon the orientation of a device, such as a mobile telephone. In particular, a neutral position of a device is determined in relation to at least a first axis, the device including at least a first control associated with a first plurality of output signals, and an angular displacement of the device is measured about at least the first axis. A selection of the first control is also received, and one of the first plurality of output signals is output based at least upon the selection and the angular displacement.
Type: Application
Filed: May 17, 2006
Publication date: December 14, 2006
Applicant: GESTURETEK, INC.
Inventors: Riten Jaiswal, Francis MacDougall
-
Patent number: 7058204
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
Type: Grant
Filed: September 26, 2001
Date of Patent: June 6, 2006
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall