Patents Assigned to GestureTek, Inc.
-
Publication number: 20110262032
Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
Type: Application
Filed: May 27, 2011
Publication date: October 27, 2011
Applicant: GESTURETEK, INC.
Inventor: Jin Gu
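The first-pair correspondence between pixels in the two camera images can be illustrated with a toy block-matching search along a scanline. This is a minimal sketch of generic stereo matching under a projected texture, not the method claimed in the patent; the function names, patch size, and disparity range are illustrative assumptions.

```python
def sad(patch_a, patch_b):
    # Sum of absolute differences between two equal-sized patches.
    return sum(abs(a - b) for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def match_pixel(left, right, row, col, half=1, max_disp=4):
    """Find the column in `right` whose patch best matches the patch
    around (row, col) in `left`, searching along the same scanline.
    The projected infra-red pattern gives otherwise featureless
    surfaces the texture that makes this matching well-posed."""
    patch = [r[col - half:col + half + 1] for r in left[row - half:row + half + 1]]
    best_col, best_cost = None, float("inf")
    for d in range(max_disp + 1):
        c = col - d
        if c - half < 0:  # candidate patch would fall off the image
            break
        cand = [r[c - half:c + half + 1] for r in right[row - half:row + half + 1]]
        cost = sad(patch, cand)
        if cost < best_cost:
            best_cost, best_col = cost, c
    return best_col
```

A correspondence map built this way yields per-pixel disparities, from which depth follows by triangulation given the camera baseline.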
-
Patent number: 7953271
Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
Type: Grant
Filed: October 26, 2010
Date of Patent: May 31, 2011
Assignee: GestureTek, Inc.
Inventor: Jin Gu
-
Publication number: 20110080490
Abstract: Object tracking technology, in which an illumination source is controlled to illuminate while a camera is capturing an image, to define an intersection region within the image captured by the camera. The image captured by the camera is analyzed to detect an object within the intersection region. User input is determined based on the object detected within the intersection region, and an application is controlled based on the determined user input.
Type: Application
Filed: October 13, 2009
Publication date: April 7, 2011
Applicant: GESTURETEK, INC.
Inventors: Ian Clarkson, Evan Hildreth
-
Publication number: 20110074974
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Application
Filed: December 6, 2010
Publication date: March 31, 2011
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
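The idea of comparing stationary features across frames to recover camera motion can be sketched as a sparse approximation of optical flow: average the displacement of matched features, then negate it, since the camera moves opposite to the apparent scene motion. The function names, the one-to-one feature matching, and the pan threshold are assumptions for illustration, not details from the patent.

```python
def camera_motion(features_prev, features_curr):
    """Estimate global camera translation from matched stationary-feature
    positions in two consecutive frames. Features are (x, y) pixel
    coordinates, matched by index across the two lists."""
    n = len(features_prev)
    dx = sum(c[0] - p[0] for p, c in zip(features_prev, features_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(features_prev, features_curr)) / n
    # Camera motion is the negative of the apparent scene flow.
    return (-dx, -dy)

def as_user_input(motion, threshold=1.0):
    # Map a dominant motion axis to a pan gesture for the application.
    mx, my = motion
    if abs(mx) >= abs(my) and abs(mx) > threshold:
        return "pan_right" if mx > 0 else "pan_left"
    if abs(my) > threshold:
        return "pan_down" if my > 0 else "pan_up"
    return None
```

In practice a robust estimator (e.g. discarding outlier features that belong to moving objects) would replace the plain average.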
-
Publication number: 20110050570
Abstract: Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device.
Type: Application
Filed: November 8, 2010
Publication date: March 3, 2011
Applicant: GESTURETEK, INC.
Inventors: Riten Jaiswal, Francis MacDougall
-
Patent number: 7898522
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Grant
Filed: June 1, 2007
Date of Patent: March 1, 2011
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20110038530
Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
Type: Application
Filed: October 26, 2010
Publication date: February 17, 2011
Applicant: GESTURETEK, INC.
Inventor: Jin Gu
-
Patent number: 7853041
Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the multi-dimensional space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
Type: Grant
Filed: January 6, 2006
Date of Patent: December 14, 2010
Assignee: GestureTek, Inc.
Inventor: Atid Shamaie
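The projection-and-distance step above can be sketched with a hand-supplied orthonormal basis standing in for eigenvectors computed from training images (e.g. by PCA). The basis, the point-model distance as the model equation, and the tolerance are all assumptions for illustration; the patent's actual model equation may take a different form.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(point, basis):
    # Coordinates of `point` in the eigenspace spanned by `basis`,
    # which is assumed to hold orthonormal vectors (eigenvectors of
    # the training-image covariance, in a real pipeline).
    return [dot(point, e) for e in basis]

def distance_to_model(point, basis, model_center):
    """Euclidean distance, inside the eigenspace, between the projected
    image point and a model point summarizing the training points."""
    coords = project(point, basis)
    return sum((c - m) ** 2 for c, m in zip(coords, model_center)) ** 0.5

def matches(point, basis, model_center, tol=0.5):
    # Small distance => the object resembles the training set.
    return distance_to_model(point, basis, model_center) <= tol
```

Projecting into the low-dimensional eigenspace before measuring distance is what makes the comparison cheap: distances are taken between a handful of coordinates rather than full images.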
-
Patent number: 7848542
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Grant
Filed: October 31, 2007
Date of Patent: December 7, 2010
Assignee: GestureTek, Inc.
Inventor: Evan Hildreth
-
Patent number: 7827698
Abstract: Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device.
Type: Grant
Filed: March 28, 2008
Date of Patent: November 9, 2010
Assignee: GestureTek, Inc.
Inventors: Riten Jaiswal, Francis MacDougall
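The three ingredients of the abstract (automatic neutral position, angular displacement about an axis, shake detection) can be sketched from a stream of tilt samples. The averaging window, the reversal-counting shake heuristic, and the thresholds are illustrative assumptions, not values from the patent.

```python
def neutral_position(samples):
    # Automatically estimate the neutral angle about one axis as the
    # average of recent tilt samples (degrees).
    return sum(samples) / len(samples)

def angular_displacement(angle, neutral):
    # Displacement of the device from its neutral position.
    return angle - neutral

def is_shaking(samples, threshold=15.0, min_reversals=3):
    """Detect shaking as repeated large reversals of direction in the
    sample-to-sample angle changes: the angle must swing back and
    forth by more than `threshold` degrees at least `min_reversals`
    times in the window."""
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    reversals = 0
    for d1, d2 in zip(deltas, deltas[1:]):
        if d1 * d2 < 0 and abs(d1) > threshold and abs(d2) > threshold:
            reversals += 1
    return reversals >= min_reversals
```

An output signal would then combine the selected control with either the displacement value or the shake event, as the abstract describes.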
-
Patent number: 7822267
Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
Type: Grant
Filed: June 24, 2008
Date of Patent: October 26, 2010
Assignee: GestureTek, Inc.
Inventor: Jin Gu
-
Publication number: 20100259474
Abstract: Enhanced handheld screen-sensing pointing, in which a handheld device captures a camera image of one or more fiducials rendered by a display device, and a position or an angle of the one or more fiducials in the captured camera image is determined. A position on the display device that the handheld device is aimed towards is determined based at least on the determined position or angle of the one or more fiducials in the camera image, and an application is controlled based on the determined position on the display device.
Type: Application
Filed: April 8, 2010
Publication date: October 14, 2010
Applicant: GESTURETEK, INC.
Inventor: Evan Hildreth
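One simple way to picture the fiducial-to-aim-point mapping: if a single on-screen fiducial appears off-centre in the handheld camera's frame, the device must be aimed to the opposite side of that fiducial's screen position. The linear 1:1 mapping below, and every name and parameter in it, is an illustrative assumption; a real system would calibrate the gain and typically use multiple fiducials to also recover angle and distance.

```python
def aim_point(fiducial_px, frame_size, fiducial_screen, screen_size):
    """Map where one fiducial appears in the handheld camera frame to
    the screen position being aimed at. All coordinates are (x, y);
    sizes are (width, height)."""
    fx, fy = fiducial_px
    fw, fh = frame_size
    sx, sy = fiducial_screen
    sw, sh = screen_size
    # Offset of the fiducial from the frame centre, in normalized units.
    off_x = (fx - fw / 2) / fw
    off_y = (fy - fh / 2) / fh
    # Aim opposite the offset, scaled to screen size (assumed unit gain).
    return (sx - off_x * sw, sy - off_y * sh)
```

With the fiducial dead-centre in the frame, the aim point is the fiducial's own screen position; as the fiducial drifts in the frame, the aim point moves the other way.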
-
Patent number: 7777899
Abstract: Projected electromagnetic radiation is alternately emitted and extinguished during emitting and extinguishing periods, the emitted radiation defining a radiation region adjacent to a housing. Sensed electromagnetic radiation within respective fields of view of first and second sensors is detected, the sensed electromagnetic radiation including ambient electromagnetic radiation detected during the emitting and the extinguishing periods, and projected electromagnetic radiation reflected off an object in the radiation region during the emitting period. Indicia of a position of the object with respect to the housing are output.
Type: Grant
Filed: June 19, 2008
Date of Patent: August 17, 2010
Assignee: GestureTek, Inc.
Inventor: Evan Hildreth
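The point of alternating emitting and extinguishing periods is that subtracting an emitters-off frame (ambient radiation only) from an emitters-on frame (ambient plus reflection) isolates the radiation reflected off the object. A minimal sketch of that subtraction and a crude centroid position follows; the frame representation, threshold, and function names are assumptions for illustration.

```python
def reflection_only(frame_on, frame_off):
    """Subtract a frame captured with the emitters extinguished from a
    frame captured while emitting, cancelling ambient radiation and
    leaving only the projected radiation reflected off the object.
    Frames are 2-D lists of pixel intensities."""
    return [[max(on - off, 0) for on, off in zip(r_on, r_off)]
            for r_on, r_off in zip(frame_on, frame_off)]

def object_position(diff, min_brightness=10):
    # Centroid of bright pixels as a simple indication of object
    # position within the sensor's field of view.
    pts = [(x, y) for y, row in enumerate(diff)
           for x, v in enumerate(row) if v >= min_brightness]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

With two sensors, the two per-sensor positions can be combined to locate the object relative to the housing.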
-
Publication number: 20100091110
Abstract: A camera tracker, in which an image captured by a camera oriented to capture images across a surface is accessed. A region in which an object detected within the accessed image is positioned is determined from among multiple defined regions within a field of view of the camera. User input is determined based on the determined region, and an application is controlled based on the determined user input.
Type: Application
Filed: October 13, 2009
Publication date: April 15, 2010
Applicant: GESTURETEK, INC.
Inventor: Evan Hildreth
-
Publication number: 20100066763
Abstract: One or more elements are initially displayed on a display component of an electronic device. After the one or more elements have been displayed on the display component of the electronic device, an image of a user of the electronic device is captured, and an orientation of the electronic device relative to the user is determined based on the captured image of the user of the electronic device. Thereafter, an orientation of at least one of the displayed elements is adjusted relative to the display component of the electronic device based on the determined orientation of the electronic device relative to the user.
Type: Application
Filed: September 14, 2009
Publication date: March 18, 2010
Applicant: GestureTek, Inc.
Inventors: Francis MacDougall, Evan Hildreth
-
Publication number: 20100066667
Abstract: An element is initially displayed on an interactive touch-screen display device with an initial orientation relative to the interactive touch-screen display device. One or more images of a user of the interactive touch-screen display device are captured. The user is determined to be interacting with the element displayed on the interactive touch-screen display device. In addition, an orientation of the user relative to the interactive touch-screen display device is determined based on at least one captured image of the user of the interactive touch-screen display device. Thereafter, in response to determining that the user is interacting with the displayed element, the initial orientation of the displayed element relative to the interactive touch-screen display device is automatically adjusted based on the determined orientation of the user relative to the interactive touch-screen display device.
Type: Application
Filed: September 14, 2009
Publication date: March 18, 2010
Applicant: GESTURETEK, INC.
Inventors: Francis MacDougall, Evan Hildreth
-
Publication number: 20100050134
Abstract: The enhanced detection of a circular engagement gesture, in which a shape is defined within motion data, and the motion data is sampled at points that are aligned with the defined shape. It is determined whether a moving object is performing a gesture correlating to the defined shape based on a pattern exhibited by the sampled motion data. An application is controlled if determining that the moving object is performing the gesture.
Type: Application
Filed: July 24, 2009
Publication date: February 25, 2010
Applicant: GESTURETEK, INC.
Inventor: Ian Clarkson
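To give a concrete feel for circular-gesture detection, here is a simplified stand-in heuristic: a tracked path counts as circular if its points stay at a near-constant radius from their centroid and the swept angle covers most of a full turn. This is not the claimed shape-aligned sampling method, and every threshold below is an illustrative assumption.

```python
import math

def is_circular_gesture(path, roundness_tol=0.2, min_sweep=2 * math.pi * 0.9):
    """Return True if the (x, y) path looks like one full circular
    stroke: near-constant radius about the centroid, and a total
    swept angle of at least `min_sweep` radians."""
    n = len(path)
    cx = sum(p[0] for p in path) / n
    cy = sum(p[1] for p in path) / n
    radii = [math.hypot(x - cx, y - cy) for x, y in path]
    mean_r = sum(radii) / n
    # Reject paths that are not round enough about their centroid.
    if mean_r == 0 or max(abs(r - mean_r) for r in radii) > roundness_tol * mean_r:
        return False
    # Accumulate the angle swept around the centroid, unwrapping each
    # step to the smallest equivalent change.
    angles = [math.atan2(y - cy, x - cx) for x, y in path]
    sweep = 0.0
    for a1, a2 in zip(angles, angles[1:]):
        d = a2 - a1
        while d > math.pi:
            d -= 2 * math.pi
        while d < -math.pi:
            d += 2 * math.pi
        sweep += d
    return abs(sweep) >= min_sweep
```

A straight swipe fails the roundness test; a back-and-forth wag fails the sweep test, so only a genuine loop triggers the gesture.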
-
Publication number: 20100039379
Abstract: Enhanced multi-touch detection, in which a graphical user interface for an application is projected onto a surface, and electromagnetic radiation is emitted. The electromagnetic radiation is collectively emitted by an array defining a layer aligned parallel with the surface and overlapping at least a region of the surface onto which the graphical user interface is projected. Electromagnetic radiation is detected that reflects off of an object interrupting the defined layer where the defined layer overlaps the region of the surface onto which the graphical user interface is projected, and an indication of a position of the object is output.
Type: Application
Filed: August 13, 2009
Publication date: February 18, 2010
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
-
Publication number: 20100040292
Abstract: The enhanced detection of a waving engagement gesture, in which a shape is defined within motion data, the motion data is sampled at points that are aligned with the defined shape, and, based on the sampled motion data, positions of a moving object along the defined shape are determined over time. It is determined whether the moving object is performing a gesture based on a pattern exhibited by the determined positions, and an application is controlled if determining that the moving object is performing the gesture.
Type: Application
Filed: July 24, 2009
Publication date: February 18, 2010
Applicant: GESTURETEK, INC.
Inventor: Ian Clarkson
-
Publication number: 20090315740
Abstract: Enhanced character input using recognized gestures, in which a user's first and second gestures are recognized, and a control including radially disposed interaction elements is output. At least a portion of the interaction elements are associated with clusters of characters. When an interaction element is selected, the characters associated with the selected interaction element are disposed radially in relation to the selected interaction element. Using the control, the interaction element and a character associated with the selected interaction element are selected based on the user's recognized first and second gestures, respectively, and the selected character is output.
Type: Application
Filed: June 23, 2008
Publication date: December 24, 2009
Applicant: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall