Patents Assigned to GestureTek, Inc.
  • Publication number: 20110262032
    Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Application
    Filed: May 27, 2011
    Publication date: October 27, 2011
    Applicant: GestureTek, Inc.
    Inventor: Jin Gu
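The abstract above centers on establishing a first-pair correspondence between pixels of the two infra-red images. A minimal sketch of that step, simplified to a single scanline and sum-of-squared-differences window matching (the function names, window size, and disparity range are illustrative assumptions, not taken from the patent):

```python
# Hypothetical sketch of the first-pair correspondence step: for each pixel
# window in the first (left) IR image, find the best-matching window on the
# same scanline of the second (right) IR image by sum of squared differences.
def match_scanline(left, right, window=3, max_disp=8):
    """Return, for each left pixel, the right-pixel index with the most
    similar neighborhood (or None near the borders)."""
    half = window // 2
    matches = []
    for i in range(len(left)):
        if i < half or i + half >= len(left):
            matches.append(None)
            continue
        patch = left[i - half:i + half + 1]
        best_j, best_cost = None, float("inf")
        for d in range(0, max_disp + 1):
            j = i - d  # candidate position, shifted by disparity d
            if j < half or j + half >= len(right):
                continue
            cand = right[j - half:j + half + 1]
            cost = sum((a - b) ** 2 for a, b in zip(patch, cand))
            if cost < best_cost:
                best_cost, best_j = cost, j
        matches.append(best_j)
    return matches
```

The projected pattern is what makes such window matching reliable on otherwise textureless surfaces; the third (pattern-free) image would then supply the texture for the final rendered construction.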
  • Patent number: 7953271
    Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Grant
    Filed: October 26, 2010
    Date of Patent: May 31, 2011
    Assignee: GestureTek, Inc.
    Inventor: Jin Gu
  • Publication number: 20110080490
    Abstract: Object tracking technology, in which an illumination source is controlled to illuminate while a camera is capturing an image, thereby defining an intersection region within the image captured by the camera. The image captured by the camera is analyzed to detect an object within the intersection region. User input is determined based on the object detected within the intersection region and an application is controlled based on the determined user input.
    Type: Application
    Filed: October 13, 2009
    Publication date: April 7, 2011
    Applicant: GestureTek, Inc.
    Inventors: Ian Clarkson, Evan Hildreth
  • Publication number: 20110074974
    Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
    Type: Application
    Filed: December 6, 2010
    Publication date: March 31, 2011
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
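The motion-description step in the abstract above can be sketched very simply: if the tracked features are stationary in the world, they shift opposite to the camera, so the negated mean feature displacement approximates the camera's motion in image space. Feature detection and matching are assumed to happen elsewhere; this is only an illustrative reduction of the optical-flow idea:

```python
# Illustrative sketch: infer mobile-camera motion from matched stationary
# features in two consecutive frames.
def estimate_camera_motion(prev_pts, curr_pts):
    """prev_pts/curr_pts: matched (x, y) feature positions in two frames.
    Returns the estimated (dx, dy) camera motion in pixels."""
    if not prev_pts:
        return (0.0, 0.0)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / len(prev_pts)
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / len(prev_pts)
    return (-dx, -dy)  # the camera moves opposite to the apparent feature flow
```

The resulting (dx, dy) would then be interpreted as the user input, e.g. to pan a user interface.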
  • Publication number: 20110050570
    Abstract: Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device.
    Type: Application
    Filed: November 8, 2010
    Publication date: March 3, 2011
    Applicant: GestureTek, Inc.
    Inventors: Riten Jaiswal, Francis MacDougall
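The flow in the abstract above (automatic neutral-position determination, angular displacement about an axis, shake detection) can be sketched for a single axis. The shake threshold, window length, and class shape are assumptions for illustration:

```python
# Hedged single-axis sketch of orientation-sensitive output: a neutral angle
# is learned from resting samples, displacement is measured relative to it,
# and shaking is flagged when recent readings vary beyond a threshold.
class TiltControl:
    def __init__(self, shake_threshold=30.0):
        self.neutral = None
        self.shake_threshold = shake_threshold
        self.history = []

    def calibrate(self, samples):
        """Automatically determine the neutral position from resting samples."""
        self.neutral = sum(samples) / len(samples)

    def update(self, angle):
        """Feed one angle reading; returns (displacement, shaking?)."""
        self.history = (self.history + [angle])[-5:]  # keep a short window
        displacement = angle - self.neutral
        shaking = (max(self.history) - min(self.history)) > self.shake_threshold
        return displacement, shaking
```

An output signal would then be produced from the selected control together with either the displacement or the shake flag, as the abstract describes.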
  • Patent number: 7898522
    Abstract: A method of using stereo vision to interface with computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
    Type: Grant
    Filed: June 1, 2007
    Date of Patent: March 1, 2011
    Assignee: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
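The position-information step in the stereo-interface abstract above reduces, for calibrated parallel cameras, to triangulation from disparity: depth is z = f · baseline / disparity. A minimal sketch with illustrative parameter names (not taken from the patent):

```python
# Minimal sketch: turn one matched stereo observation into a 3-D position,
# assuming calibrated, parallel cameras with known focal length and baseline.
def stereo_position(x_left, x_right, y, focal_px, baseline_m):
    """Triangulate (x, y, z) in camera coordinates from one matched point."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from disparity
    x = x_left * z / focal_px               # back-project to metric x
    y = y * z / focal_px                    # back-project to metric y
    return (x, y, z)
```

The resulting coordinates of the user-controlled object would then be communicated to the computer application as input.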
  • Publication number: 20110038530
    Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Application
    Filed: October 26, 2010
    Publication date: February 17, 2011
    Applicant: GestureTek, Inc.
    Inventor: Jin Gu
  • Patent number: 7853041
    Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the multi-dimensional space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
    Type: Grant
    Filed: January 6, 2006
    Date of Patent: December 14, 2010
    Assignee: GestureTek, Inc.
    Inventor: Atid Shamaie
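The matching step in the abstract above can be sketched as: project the image point into the eigenspace and threshold its distance to the model. Real eigenvectors would come from PCA over the training images; here they are assumed given, and the "model" is simplified to a single point:

```python
# Hypothetical sketch of eigenspace matching: project an image vector into a
# low-dimensional space and compare its distance to a model of the training set.
def project(image_vec, mean_vec, eigenvectors):
    """Project an image vector into eigenspace coordinates."""
    centered = [a - b for a, b in zip(image_vec, mean_vec)]
    return [sum(c * e for c, e in zip(centered, ev)) for ev in eigenvectors]

def matches_model(image_vec, mean_vec, eigenvectors, model_point, threshold):
    """True if the projected point lies within `threshold` of the model."""
    coords = project(image_vec, mean_vec, eigenvectors)
    dist = sum((c - m) ** 2 for c, m in zip(coords, model_point)) ** 0.5
    return dist <= threshold
```

In the patent's terms, the equation describing the model supplies the distance; the thresholded distance decides whether the particular object matches the training objects.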
  • Patent number: 7848542
    Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
    Type: Grant
    Filed: October 31, 2007
    Date of Patent: December 7, 2010
    Assignee: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Patent number: 7827698
    Abstract: Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device.
    Type: Grant
    Filed: March 28, 2008
    Date of Patent: November 9, 2010
    Assignee: GestureTek, Inc.
    Inventors: Riten Jaiswal, Francis MacDougall
  • Patent number: 7822267
    Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Grant
    Filed: June 24, 2008
    Date of Patent: October 26, 2010
    Assignee: GestureTek, Inc.
    Inventor: Jin Gu
  • Publication number: 20100259474
    Abstract: Enhanced handheld screen-sensing pointing, in which a handheld device captures a camera image of one or more fiducials rendered by a display device, and a position or an angle of the one or more fiducials in the captured camera image is determined. A position on the display device that the handheld device is aimed towards is determined based at least on the determined position or angle of the one or more fiducials in the camera image, and an application is controlled based on the determined position on the display device.
    Type: Application
    Filed: April 8, 2010
    Publication date: October 14, 2010
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
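The pointing computation in the abstract above can be sketched with a deliberate simplification: the offset of a known on-screen fiducial within the handheld camera's image indicates where the device is aimed, and a linear scale maps that offset to display coordinates. A real system would use the positions and angles of multiple fiducials (e.g. a homography); everything here is an illustrative assumption:

```python
# Illustrative sketch: estimate the display position a handheld device aims
# at, from one fiducial's position in the device's camera image.
def aim_point(fid_cam_xy, cam_center, fid_screen_xy, scale):
    """fid_cam_xy: fiducial position in the camera image
    cam_center: the camera image center (the direction the device aims)
    fid_screen_xy: the fiducial's known position on the display
    scale: display pixels per camera pixel (assumed constant)"""
    dx = cam_center[0] - fid_cam_xy[0]
    dy = cam_center[1] - fid_cam_xy[1]
    return (fid_screen_xy[0] + dx * scale, fid_screen_xy[1] + dy * scale)
```

The determined display position then drives the controlled application, e.g. as a cursor.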
  • Patent number: 7777899
    Abstract: In emitting and extinguishing periods, emitting and extinguishing projected electromagnetic radiation is alternated, the emitted projected electromagnetic radiation defining a radiation region adjacent to a housing. Sensed electromagnetic radiation within respective fields of view of first and second sensors is detected, the sensed electromagnetic radiation including ambient electromagnetic radiation detected during the emitting and the extinguishing periods, and projected electromagnetic radiation reflected off an object in the radiation region during the emitting period. Indicia of a position of the object with respect to the housing is output.
    Type: Grant
    Filed: June 19, 2008
    Date of Patent: August 17, 2010
    Assignee: GestureTek, Inc.
    Inventor: Evan Hildreth
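The alternating emitting/extinguishing periods in the abstract above enable a classic separation: a frame captured with the emitter off measures ambient radiation alone, so subtracting it from an emitter-on frame isolates light reflected off an object in the radiation region. A sketch over a 1-D frame, with an assumed noise floor:

```python
# Sketch of ambient-light rejection via alternating illumination periods.
def reflected_signal(on_frame, off_frame, noise_floor=5):
    """Per-pixel reflected intensity with ambient light removed."""
    out = []
    for on_px, off_px in zip(on_frame, off_frame):
        diff = on_px - off_px  # emitter-on minus ambient-only
        out.append(diff if diff > noise_floor else 0)
    return out

def object_position(signal):
    """Index of the strongest reflection, or None if nothing is detected."""
    peak = max(signal)
    return signal.index(peak) if peak > 0 else None
```

The returned index stands in for the "indicia of a position of the object with respect to the housing" that the method outputs.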
  • Publication number: 20100091110
    Abstract: A camera tracker, in which an image captured by a camera oriented to capture images across a surface is accessed. A region in which an object detected within the accessed image is positioned is determined from among multiple defined regions within a field of view of the camera. User input is determined based on the determined region and an application is controlled based on the determined user input.
    Type: Application
    Filed: October 13, 2009
    Publication date: April 15, 2010
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
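The region-based determination in the abstract above is straightforward to sketch: the camera's field of view across the surface is divided into defined regions, and the region containing the detected object selects the user input. The region boundaries and input names below are assumptions:

```python
# Minimal sketch of the camera tracker's region-to-input mapping.
REGIONS = [  # (x_min, x_max, user_input) across the camera image
    (0, 100, "left_button"),
    (100, 220, "middle_button"),
    (220, 320, "right_button"),
]

def region_input(object_x):
    """Map a detected object's x position to a user input, if any."""
    for x_min, x_max, name in REGIONS:
        if x_min <= object_x < x_max:
            return name
    return None
```

Because the camera looks across the surface rather than down at it, a single coordinate along the image is often enough to distinguish the defined regions.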
  • Publication number: 20100066667
    Abstract: An element is initially displayed on an interactive touch-screen display device with an initial orientation relative to the interactive touch-screen display device. One or more images of a user of the interactive touch-screen display device are captured. The user is determined to be interacting with the element displayed on the interactive touch-screen display device. In addition, an orientation of the user relative to the interactive touch-screen display device is determined based on at least one captured image of the user of the interactive touch-screen display device. Thereafter, in response to determining that the user is interacting with the displayed element, the initial orientation of the displayed element relative to the interactive touch-screen display device is automatically adjusted based on the determined orientation of the user relative to the interactive touch-screen display device.
    Type: Application
    Filed: September 14, 2009
    Publication date: March 18, 2010
    Applicant: GestureTek, Inc.
    Inventors: Francis MacDougall, Evan Hildreth
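The adjustment step in the abstract above can be sketched for a rectangular tabletop display: the user's angle around the display (derived elsewhere from the captured image) is snapped to the nearest screen edge, and the element the user touches is rotated to face that edge. The 90-degree snapping is an assumption for illustration:

```python
# Hedged sketch: rotate a displayed element to face the interacting user.
def element_rotation(user_angle_deg):
    """Rotation (degrees) for an element facing a user standing at
    `user_angle_deg` around the display (0 = the default bottom edge)."""
    return (round(user_angle_deg / 90.0) * 90) % 360
```

The orientation of the user would come from the captured image (e.g. face or arm direction); only the final snap-and-rotate is shown here.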
  • Publication number: 20100066763
    Abstract: One or more elements are initially displayed on a display component of an electronic device. After the one or more elements have been displayed on the display component of the electronic device, an image of a user of the electronic device is captured, and an orientation of the electronic device relative to the user is determined based on the captured image of the user of the electronic device. Thereafter, an orientation of at least one of the displayed elements is adjusted relative to the display component of the electronic device based on the determined orientation of the electronic device relative to the user.
    Type: Application
    Filed: September 14, 2009
    Publication date: March 18, 2010
    Applicant: GestureTek, Inc.
    Inventors: Francis MacDougall, Evan Hildreth
  • Publication number: 20100050134
    Abstract: The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture.
    Type: Application
    Filed: July 24, 2009
    Publication date: February 25, 2010
    Applicant: GestureTek, Inc.
    Inventor: Ian Clarkson
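The sampling idea in the abstract above can be sketched concretely: define a circle within a motion-data grid (each cell holding the time it last saw motion), sample the grid at points along the circle, and declare a circular gesture when the sampled times increase steadily around the shape. The grid size, radius, and monotonicity test are illustrative assumptions:

```python
import math

# Illustrative sketch of circular-engagement-gesture detection by sampling
# motion data at points aligned with a defined circle.
def sample_circle(motion_times, cx, cy, radius, n=8):
    """Sample the motion-timestamp grid at n points on the defined circle."""
    pts = []
    for k in range(n):
        a = 2 * math.pi * k / n
        x = int(round(cx + radius * math.cos(a)))
        y = int(round(cy + radius * math.sin(a)))
        pts.append(motion_times[y][x])
    return pts

def is_circular_gesture(samples):
    """True if sampled times increase around the circle (one sweep)."""
    start = samples.index(min(samples))
    ordered = samples[start:] + samples[:start]  # rotate to the oldest sample
    return all(b > a for a, b in zip(ordered, ordered[1:]))
```

Rotating the sample sequence to start at the oldest timestamp makes the test independent of where on the circle the user began the gesture.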
  • Publication number: 20100039379
    Abstract: Enhanced multi-touch detection, in which a graphical user interface for an application is projected onto a surface, and electromagnetic radiation is emitted. The electromagnetic radiation is collectively emitted by an array defining a layer aligned parallel with the surface and overlapping at least a region of the surface onto which the graphical user interface is projected. Electromagnetic radiation is detected that reflects off of an object interrupting the defined layer where the defined layer overlaps the region of the surface onto which the graphical user interface is projected, and indicating a position of the object is output.
    Type: Application
    Filed: August 13, 2009
    Publication date: February 18, 2010
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20100040292
    Abstract: The enhanced detection of a waving engagement gesture, in which a shape is defined within motion data, the motion data is sampled at points that are aligned with the defined shape, and, based on the sampled motion data, positions of a moving object along the defined shape are determined over time. It is determined whether the moving object is performing a gesture based on a pattern exhibited by the determined positions, and an application is controlled if determining that the moving object is performing the gesture.
    Type: Application
    Filed: July 24, 2009
    Publication date: February 18, 2010
    Applicant: GestureTek, Inc.
    Inventor: Ian Clarkson
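For the waving gesture above, the positions of the moving object along the defined (line) shape are tracked over time, and the characteristic pattern is repeated direction reversal. A sketch over 1-D positions, with the reversal-count threshold as an assumption:

```python
# Hedged sketch of waving-engagement-gesture detection: count direction
# reversals in the object's positions along the defined shape.
def count_reversals(positions):
    """Number of direction changes in a sequence of 1-D positions."""
    reversals = 0
    last_dir = 0
    for a, b in zip(positions, positions[1:]):
        d = (b > a) - (b < a)  # -1, 0, or +1
        if d != 0 and last_dir != 0 and d != last_dir:
            reversals += 1
        if d != 0:
            last_dir = d
    return reversals

def is_waving(positions, min_reversals=2):
    return count_reversals(positions) >= min_reversals
```

A single pass across the shape (no reversals) is thus distinguished from a wave (back-and-forth motion).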
  • Publication number: 20090315740
    Abstract: Enhanced character input using recognized gestures, in which a user's first and second gestures are recognized, and a control including radially disposed interaction elements is output. At least a portion of the interaction elements are associated with clusters of characters. When an interaction element is selected, the characters associated with the selected interaction element are disposed radially in relation to the selected interaction element. Using the control, the interaction element and a character associated with the selected interaction element are selected based on the user's recognized first and second gestures, respectively, and the selected character is output.
    Type: Application
    Filed: June 23, 2008
    Publication date: December 24, 2009
    Applicant: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
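The radial control above can be sketched as two angle-to-slot selections: the first gesture's direction picks a character cluster from the radially disposed interaction elements, and the second picks a character disposed radially within that cluster. The cluster layout below is an example, not the patent's:

```python
# Illustrative sketch of gesture-driven radial character input.
CLUSTERS = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx"]

def pick_index(angle_deg, n):
    """Map a gesture direction to one of n radially disposed elements."""
    slot = round(angle_deg / (360.0 / n)) % n
    return int(slot)

def select_character(first_angle, second_angle):
    """First gesture selects a cluster; second selects a character in it."""
    cluster = CLUSTERS[pick_index(first_angle, len(CLUSTERS))]
    return cluster[pick_index(second_angle, len(cluster))]
```

Grouping characters into clusters keeps each radial menu small, so coarse gesture directions suffice to reach any character in two steps.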