Patents Assigned to GestureTek, Inc.
  • Publication number: 20080208517
    Abstract: Enhanced single-sensor position detection, in which a position of an object is determined. In some implementations, a first signal is emitted from a first emitter, and a second signal is emitted from a second emitter. A plane is monitored using a sensor, and the first signal and the second signal are received at the sensor after each of the first signal and the second signal reflects off the object. A response signal is generated based on the first and second signals, and the response signal is processed to determine the position of the object in the plane.
    Type: Application
    Filed: February 22, 2008
    Publication date: August 28, 2008
    Applicant: GESTURETEK, INC.
    Inventor: Atid Shamaie
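The abstract leaves the signal processing unspecified. One hedged way to picture the geometry: if each emitter-to-object-to-sensor path length can be recovered from the response signal (a time-of-flight assumption, not stated in the abstract), the object lies near the intersection of two ellipses whose foci are an emitter and the sensor. A brute-force illustrative sketch (the `locate` function, its grid, and its bounds are inventions for illustration, not from the patent):

```python
import math

def locate(e1, e2, sensor, d1, d2, step=0.05, bound=5.0):
    """Grid-search the planar position whose two emitter->object->sensor path
    lengths best match the measured values d1 and d2. Each measured path length
    constrains the object to an ellipse with the emitter and sensor as foci."""
    def path(p, emitter):
        return math.dist(emitter, p) + math.dist(p, sensor)

    n = int(bound / step)
    best, best_err = None, math.inf
    for iy in range(1, n + 1):          # search the half-plane being monitored
        y = iy * step
        for ix in range(-n, n + 1):
            x = ix * step
            err = abs(path((x, y), e1) - d1) + abs(path((x, y), e2) - d2)
            if err < best_err:
                best, best_err = (x, y), err
    return best
```

A real implementation would solve the ellipse intersection in closed form or refine iteratively; the grid search just makes the constraint visible.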
  • Publication number: 20080199071
    Abstract: According to a general aspect, processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Application
    Filed: April 23, 2008
    Publication date: August 21, 2008
    Applicant: GESTURETEK, INC.
    Inventor: Jin Gu
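Establishing the first-pair correspondence is, at its core, a matching problem between pixels of the two infra-red images. A minimal single-scanline sketch, assuming rectified cameras and sum-of-absolute-differences matching (the patent's actual matching method is not specified in the abstract):

```python
import numpy as np

def best_match(left_row, right_row, x, patch=3, max_disp=10):
    """Return the column in right_row whose patch best matches the patch
    centered at column x in left_row (SAD along one rectified scanline).
    The projected pattern gives otherwise-featureless surfaces texture to match."""
    half = patch // 2
    target = left_row[x - half : x + half + 1].astype(int)
    best_x, best_cost = None, np.inf
    for d in range(max_disp + 1):       # candidate disparities
        xr = x - d
        if xr - half < 0:
            break
        cand = right_row[xr - half : xr + half + 1].astype(int)
        cost = int(np.abs(target - cand).sum())
        if cost < best_cost:
            best_x, best_cost = xr, cost
    return best_x
```

Repeating this per pixel yields the first-pair correspondence; the third (pattern-free) image would then supply the texture for the final two-dimensional depiction.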
  • Publication number: 20080187178
    Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
    Type: Application
    Filed: April 7, 2008
    Publication date: August 7, 2008
    Applicant: GestureTek, Inc.
    Inventor: Atid Shamaie
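The multi-dimensional space named in the abstract is an eigenspace, so the pipeline can be pictured as PCA followed by a distance test. A hedged sketch, with the "model" simplified to the set of projected training points themselves (the patent describes fitting an equation to those points, which this sketch omits):

```python
import numpy as np

def build_eigenspace(training, k=2):
    """Build a k-dimensional eigenspace from flattened training images
    and project the training set into it."""
    mean = training.mean(axis=0)
    centered = training - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]                      # top-k principal directions
    points = centered @ basis.T         # training points in the eigenspace
    return mean, basis, points

def distance_to_model(image, mean, basis, points):
    """Project a new image and return its distance to the nearest training
    point; a small distance suggests the object matches the training set."""
    coords = (image - mean) @ basis.T   # coordinates of the image point
    return float(np.min(np.linalg.norm(points - coords, axis=1)))
```

Thresholding the returned distance gives the match/no-match decision described in the abstract.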
  • Publication number: 20080166022
    Abstract: Motion of a user is detected via a camera, and a dynamic virtual representation of the user is generated on a display, where the user's detected motion causes the dynamic virtual representation to interact with virtual objects on the display. The magnitude and direction of the user's detected motion are calculated to determine the magnitude and direction of a force applied by the dynamic virtual representation to the virtual object. Further arrangements include water or smoke fluid simulations, in order to enhance the user experience.
    Type: Application
    Filed: December 27, 2007
    Publication date: July 10, 2008
    Applicant: GESTURETEK, INC.
    Inventor: Evan Hildreth
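The magnitude-and-direction calculation can be sketched directly from the abstract. A minimal illustration, assuming per-frame displacement is already available from the camera tracking (the `gain` factor mapping speed to force is a hypothetical parameter):

```python
import math

def motion_force(dx, dy, dt, gain=1.0):
    """Force the virtual representation applies to a virtual object,
    derived from the user's detected motion over one frame interval.

    dx, dy: detected displacement (e.g. pixels); dt: frame interval (seconds).
    Returns (magnitude, direction-in-radians)."""
    speed = math.hypot(dx, dy) / dt     # magnitude of the user's motion
    direction = math.atan2(dy, dx)      # direction of the user's motion
    return gain * speed, direction
```

The force would then be handed to the physics (or fluid) simulation driving the virtual objects.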
  • Patent number: 7389591
    Abstract: The selection and output of a signal, such as an alphanumeric character, is provided depending upon the orientation of a device, such as a mobile telephone. In particular, a neutral position of a device is determined in relation to at least a first axis, the device including at least a first control associated with a first plurality of output signals, and an angular displacement of the device is measured about at least the first axis. A selection of the first control is also received, and one of the first plurality of output signals is output based at least upon the selection and the angular displacement.
    Type: Grant
    Filed: May 17, 2006
    Date of Patent: June 24, 2008
    Assignee: GestureTek, Inc.
    Inventors: Riten Jaiswal, Francis MacDougall
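The selection logic maps a pressed control plus a measured angular displacement to one of several output signals. A toy sketch for a single axis, with the character layout and the 15-degree band width chosen purely for illustration (the patent does not specify either):

```python
def select_output(key_chars, neutral_deg, current_deg, band_deg=15.0):
    """Choose one output character from the pressed control's set based on
    the device's angular displacement from its neutral position.

    key_chars: characters assigned to the pressed control, e.g. "abc".
    The displacement is quantized into one band per character, with the
    middle character selected at the neutral orientation."""
    displacement = current_deg - neutral_deg
    n = len(key_chars)
    idx = round(displacement / band_deg) + n // 2
    idx = max(0, min(n - 1, idx))       # clamp to the control's character set
    return key_chars[idx]
```

Tilting the phone left or right of its neutral position before pressing the key would thus pick a different letter from the same control.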
  • Patent number: 7379563
    Abstract: Hands may be tracked before, during, and after occlusion, and a gesture may be recognized. Movement of two occluded hands may be tracked as a unit during an occlusion period. A type of synchronization characterizing the two occluded hands during the occlusion period may be determined based on the tracked movement of the occluded hands. Based on the determined type of synchronization, it may be determined whether directions of travel for each of the two occluded hands change during the occlusion period. Implementations may determine that a first hand and a second hand are occluded during an occlusion period, the first hand having come from a first direction and the second hand having come from a second direction. The first hand may be distinguished from the second hand after the occlusion period based on a determined type of synchronization characterizing the two hands, and a behavior of the two hands.
    Type: Grant
    Filed: April 15, 2005
    Date of Patent: May 27, 2008
    Assignee: GestureTek, Inc.
    Inventor: Atid Shamaie
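The post-occlusion disambiguation can be reduced, for illustration only, to one dimension: classify the synchronization of the occluded pair, then assign identities by exit side. This toy sketch collapses the patent's tracking and behavior analysis into two trivial rules:

```python
def synchronization_type(unit_dir_before, unit_dir_after):
    """Classify the occluded pair's motion along one axis: 'pass' if the
    tracked unit kept its direction of travel through the occlusion,
    'bounce' if it reversed while occluded."""
    return 'pass' if unit_dir_before * unit_dir_after > 0 else 'bounce'

def hands_after_occlusion(sync):
    """Map exit sides to hand identities for two hands that entered the
    occlusion from opposite sides."""
    if sync == 'pass':
        # The hands crossed: the hand that came from the left exits right.
        return {'left': 'right-hand', 'right': 'left-hand'}
    # The hands reversed direction while occluded: each exits on its own side.
    return {'left': 'left-hand', 'right': 'right-hand'}
```

The real method tracks the occluded hands as a unit in two dimensions and weighs their behavior; the rules above only show why the synchronization type resolves the identity ambiguity.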
  • Patent number: 7379566
    Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
    Type: Grant
    Filed: January 6, 2006
    Date of Patent: May 27, 2008
    Assignee: GestureTek, Inc.
    Inventor: Evan Hildreth
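The core of the optical-flow step is that stationary scene features appear to move opposite to the camera. A minimal sketch, assuming feature correspondences between consecutive frames are already available (the patent's flow computation itself is not reproduced here):

```python
import numpy as np

def estimate_camera_motion(prev_pts, curr_pts):
    """Estimate 2-D camera translation from tracked stationary features.

    prev_pts, curr_pts: (N, 2) arrays of corresponding feature positions in
    consecutive frames. The median of the per-feature flow is robust to a few
    non-stationary outliers; negating it gives the camera's own motion."""
    flow = np.asarray(curr_pts, float) - np.asarray(prev_pts, float)
    return -np.median(flow, axis=0)
```

The resulting motion description would then be interpreted as the user input, e.g. to scroll or pan the application's interface.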
  • Publication number: 20080056536
    Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
    Type: Application
    Filed: October 31, 2007
    Publication date: March 6, 2008
    Applicant: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
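The per-camera steps of the process above (difference map, then relative position) can be sketched compactly; combining the per-camera relative positions into an absolute position is omitted. The threshold value is illustrative:

```python
import numpy as np

def difference_map(frame, background, threshold=25):
    """Per-pixel foreground mask: where the image data set departs from the
    background data set by more than the threshold."""
    return np.abs(frame.astype(int) - background.astype(int)) > threshold

def relative_position(diff):
    """Relative position of the object of interest within one difference map:
    the centroid of the foreground pixels, or None if nothing was detected."""
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

With two calibrated viewpoints, the two relative positions would be combined (e.g. by triangulation) into the absolute position that drives the application's position indicator.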
  • Publication number: 20080018595
    Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
    Type: Application
    Filed: August 17, 2007
    Publication date: January 24, 2008
    Applicant: GESTURETEK, INC.
    Inventors: Evan Hildreth, Francis MacDougall
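For the stereo-vision entries above and below, the position information hinges on disparity between the two views. A standard textbook relation, shown here as a hedged sketch (the patent does not commit to this particular formula):

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of an object from its horizontal disparity in a rectified
    stereo image pair.

    x_left, x_right: the object's column in the left and right images (px);
    focal_px: focal length in pixels; baseline_m: camera separation in metres."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity
```

The recovered depth, together with the image coordinates, gives the position information communicated to the computer application.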
  • Patent number: 7227526
    Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
    Type: Grant
    Filed: July 23, 2001
    Date of Patent: June 5, 2007
    Assignee: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
  • Publication number: 20060281453
    Abstract: The selection and output of a signal, such as an alphanumeric character, is provided depending upon the orientation of a device, such as a mobile telephone. In particular, a neutral position of a device is determined in relation to at least a first axis, the device including at least a first control associated with a first plurality of output signals, and an angular displacement of the device is measured about at least the first axis. A selection of the first control is also received, and one of the first plurality of output signals is output based at least upon the selection and the angular displacement.
    Type: Application
    Filed: May 17, 2006
    Publication date: December 14, 2006
    Applicant: GESTURETEK, INC.
    Inventors: Riten Jaiswal, Francis MacDougall
  • Patent number: 7058204
    Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
    Type: Grant
    Filed: September 26, 2001
    Date of Patent: June 6, 2006
    Assignee: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall