Patents Assigned to GestureTek, Inc.
  • Publication number: 20090295756
    Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
    Type: Application
    Filed: August 10, 2009
    Publication date: December 3, 2009
    Applicant: GestureTek, Inc.
    Inventor: Atid Shamaie
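The eigenspace-matching approach described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the patented method: the function names, the choice of Euclidean distance, and the use of the training centroid as the "model" are assumptions made for the example.

```python
import numpy as np

def build_eigenspace(training_images, k=3):
    """Flatten training images, derive a k-dimensional eigenspace via PCA,
    and project the training set into it to obtain training points."""
    X = np.array([img.ravel() for img in training_images], dtype=float)
    mean = X.mean(axis=0)
    # Principal components of the centered training set form the eigenspace.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = vt[:k]                                # k x d projection matrix
    training_points = (X - mean) @ basis.T
    return mean, basis, training_points

def matches(image, mean, basis, training_points, threshold):
    """Project a new image point into the eigenspace and compare its
    distance to the model (here, crudely, the training-point centroid)."""
    point = (image.ravel() - mean) @ basis.T
    model = training_points.mean(axis=0)
    return np.linalg.norm(point - model) <= threshold
```

A real implementation would fit an analytic model (e.g. a curve or surface equation) to the training points rather than using their centroid.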
  • Publication number: 20090217211
    Abstract: Enhanced input using recognized gestures, in which a user's gesture is recognized from first and second images, and a representation of the user is displayed in a central region of a control that further includes interaction elements disposed radially in relation to the central region. The enhanced input also includes interacting with the control based on the recognized user's gesture, and controlling an application based on interacting with the control.
    Type: Application
    Filed: February 27, 2008
    Publication date: August 27, 2009
    Applicant: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
  • Patent number: 7574020
    Abstract: According to one disclosed method, coordinates in a multi-dimensional space are determined for an image point characterizing a particular object. An equation describing a model in the space is provided. The model is characteristic of a set of training images of one or more other objects. The coordinates are applied to the equation to determine a distance between the image point and the model. Based on the determined distance, a determination is made as to whether the particular object matches the one or more other objects. A set of training images may be received. A multi-dimensional space (e.g., eigenspace) may be determined based on the set of training images. A set of training points may be generated by projecting the set of training images into the multi-dimensional space. An equation describing a model in the multi-dimensional space that is characteristic of the set of training points may be determined.
    Type: Grant
    Filed: April 7, 2008
    Date of Patent: August 11, 2009
    Assignee: GestureTek, Inc.
    Inventor: Atid Shamaie
  • Patent number: 7570805
    Abstract: According to a general aspect, processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Grant
    Filed: April 23, 2008
    Date of Patent: August 4, 2009
    Assignee: GestureTek, Inc.
    Inventor: Jin Gu
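The first-pair correspondence between the two patterned images can be illustrated with a simple block-matching sketch. This is a generic stereo-matching illustration under assumed parameters (patch size, disparity search range, sum-of-absolute-differences cost), not the correspondence method claimed in the patent.

```python
import numpy as np

def first_pair_correspondence(left, right, patch=3, max_disp=8):
    """For each pixel in the left (IR-patterned) image, find the best match
    in the right image along the same row by SAD block matching.
    Returns a disparity map as a stand-in for the pixel correspondence."""
    h, w = left.shape
    r = patch // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(r, h - r):
        for x in range(r, w - r):
            block = left[y - r:y + r + 1, x - r:x + r + 1].astype(int)
            best, best_d = None, 0
            for d in range(min(max_disp, x - r) + 1):
                cand = right[y - r:y + r + 1,
                             x - d - r:x - d + r + 1].astype(int)
                cost = np.abs(block - cand).sum()
                if best is None or cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

The projected infra-red pattern matters here: it gives texture to otherwise featureless surfaces so that block matching finds unambiguous correspondences.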
  • Patent number: 7555142
    Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest, and are operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
    Type: Grant
    Filed: October 31, 2007
    Date of Patent: June 30, 2009
    Assignee: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
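The pipeline described in this abstract — background data set, per-frame difference maps, relative positions, and an absolute position combined from two views — can be sketched as follows. The thresholds, the centroid-based position estimate, and the pinhole/disparity depth formula are illustrative assumptions, not the patented process.

```python
import numpy as np

def difference_map(frame, background, threshold=25):
    """Compare an image data set to the background data set to produce a
    binary difference map of changed pixels."""
    return np.abs(frame.astype(int) - background.astype(int)) > threshold

def relative_position(diff):
    """Relative position of the object of interest within one camera's
    difference map (centroid of foreground pixels)."""
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def absolute_position(rel_a, rel_b, baseline=1.0, focal=1.0):
    """Combine the two cameras' relative positions into a rough absolute
    position using a simplified pinhole/disparity model (parameters assumed)."""
    disparity = rel_a[0] - rel_b[0]
    depth = focal * baseline / disparity if disparity else float('inf')
    return rel_a[0], rel_a[1], depth
```

In the patented system the absolute position is then mapped to a position indicator (e.g. a cursor) in the application program.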
  • Publication number: 20090138805
    Abstract: An electronic media device may be controlled based on personalized media preferences of users experiencing content using the electronic media device. Users experiencing content using the electronic media device may be automatically identified and the electronic media device may be automatically controlled based on media preferences associated with the identified users.
    Type: Application
    Filed: November 21, 2008
    Publication date: May 28, 2009
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20090133051
    Abstract: Access to an electronic media device may be controlled based on media settings of users experiencing content using the electronic media device. Users or attributes of the users experiencing content using the electronic media device may be automatically identified and access to the electronic media device may be automatically controlled based on media settings associated with the identified users and/or attributes.
    Type: Application
    Filed: November 21, 2008
    Publication date: May 21, 2009
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20090079813
    Abstract: An enhanced interface for voice and video communications, in which a gesture of a user is recognized from a sequence of camera images, and a user interface is provided that includes a control and a representation of the user. The process also includes causing the representation to interact with the control based on the recognized gesture, and controlling a telecommunication session based on the interaction.
    Type: Application
    Filed: September 23, 2008
    Publication date: March 26, 2009
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20090052785
    Abstract: Enhanced rejection of out-of-vocabulary words, in which, based on applying an input gesture to hidden Markov models collectively modeling a vocabulary of training gestures, a likelihood that the input gesture matches each training gesture, and a quantity of states of the input gesture that match corresponding states of a modeled training gesture determined to have a highest likelihood are determined. The input gesture is rejected if the determined quantity does not satisfy a threshold.
    Type: Application
    Filed: August 20, 2008
    Publication date: February 26, 2009
    Applicant: GestureTek, Inc.
    Inventor: Atid Shamaie
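The rejection rule in this abstract can be illustrated with a small sketch, assuming the per-model likelihoods and matched-state counts have already been computed by an HMM decoder (which is the substantial part of the system and is not shown here; the function and argument names are hypothetical).

```python
def classify_gesture(likelihoods, matched_states, min_states):
    """likelihoods: model name -> likelihood of the input gesture under that
    training-gesture HMM. matched_states: model name -> number of input-gesture
    states matching the corresponding states of that model.
    Returns the best-matching gesture, or None if the input is rejected as
    out-of-vocabulary."""
    best = max(likelihoods, key=likelihoods.get)
    # Even the highest-likelihood model is rejected when too few of the
    # input's states align with its states.
    if matched_states[best] < min_states:
        return None
    return best
```

The key idea is that likelihood alone is unreliable for out-of-vocabulary input — some model always scores highest — so the state-alignment count acts as the rejection threshold.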
  • Publication number: 20090051648
    Abstract: Gesture-based mobile interaction, in which motion of a device is sensed using image data, and a gesture corresponding to the sensed motion of the device is recognized. Functionality of the device corresponding to the recognized gesture is determined and invoked.
    Type: Application
    Filed: August 20, 2008
    Publication date: February 26, 2009
    Applicant: GestureTek, Inc.
    Inventors: Atid Shamaie, Francis MacDougall
  • Publication number: 20090031240
    Abstract: An enhanced control, in which a guide line is defined relative to an object in a user interface, and items aligned with the guide line are displayed without obscuring the object. A selected item is output based on receiving a selection of one of the displayed items.
    Type: Application
    Filed: April 14, 2008
    Publication date: January 29, 2009
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20090027337
    Abstract: Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object.
    Type: Application
    Filed: May 21, 2008
    Publication date: January 29, 2009
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20090003686
    Abstract: Processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Application
    Filed: June 24, 2008
    Publication date: January 1, 2009
    Applicant: GestureTek, Inc.
    Inventor: Jin Gu
  • Publication number: 20080273755
    Abstract: A camera is used to detect a position and/or orientation of an object such as a user's finger as an approach for providing user input, for example to scroll through data, control a cursor position, and provide input to control a video game based on a position of a user's finger. Input may be provided to a handheld device, including, for example, cell phones, video game systems, portable music (MP3) players, portable video players, personal data assistants (PDAs), audio/video equipment remote controls, and consumer digital cameras, or other types of devices.
    Type: Application
    Filed: May 2, 2008
    Publication date: November 6, 2008
    Applicant: GestureTek, Inc.
    Inventor: Evan Hildreth
  • Publication number: 20080267447
    Abstract: Mobile video-based therapy, using a portable therapy device that includes a camera, a therapy application database, a processor, and a display. The camera is configured to generate images of a user, and the therapy application database is configured to store therapy applications. The processor is configured to select, from the therapy application database, a therapy application appropriate for assisting in physical or cognitive rehabilitation or therapy of the user, to invoke the therapy application, to recognize a gesture of the user from the generated images, and to control the invoked therapy application based on the recognized gesture. The display is configured to display an output of the controlled therapy application.
    Type: Application
    Filed: April 30, 2008
    Publication date: October 30, 2008
    Applicant: GestureTek, Inc.
    Inventors: Ronald L. Kelusky, Scott Robinson
  • Publication number: 20080235965
    Abstract: Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of a control is received, and an output signal is output based at least upon the selection and the angular displacement, or based upon detecting the shaking of the device.
    Type: Application
    Filed: March 28, 2008
    Publication date: October 2, 2008
    Applicant: GestureTek, Inc.
    Inventors: Riten Jaiswal, Francis MacDougall
  • Patent number: 7430312
    Abstract: According to a general aspect, processing images includes projecting an infra-red pattern onto a three-dimensional object and producing a first image, a second image, and a third image of the three-dimensional object while the pattern is projected on the three-dimensional object. The first image and the second image include the three-dimensional object and the pattern. The first image and the second image are produced by capturing at a first camera and a second camera, respectively, light filtered through an infra-red filter. The third image includes the three-dimensional object but not the pattern. Processing the images also includes establishing a first-pair correspondence between a portion of pixels in the first image and a portion of pixels in the second image. Processing the images further includes constructing, based on the first-pair correspondence and the third image, a two-dimensional image that depicts a three-dimensional construction of the three-dimensional object.
    Type: Grant
    Filed: January 9, 2006
    Date of Patent: September 30, 2008
    Assignee: GestureTek, Inc.
    Inventor: Jin Gu
  • Publication number: 20080219502
    Abstract: Hands may be tracked before, during, and after occlusion, and a gesture may be recognized. Movement of two occluded hands may be tracked as a unit during an occlusion period. A type of synchronization characterizing the two occluded hands during the occlusion period may be determined based on the tracked movement of the occluded hands. Based on the determined type of synchronization, it may be determined whether directions of travel for each of the two occluded hands change during the occlusion period. Implementations may determine that a first hand and a second hand are occluded during an occlusion period, the first hand having come from a first direction and the second hand having come from a second direction. The first hand may be distinguished from the second hand after the occlusion period based on a determined type of synchronization characterizing the two hands, and a behavior of the two hands.
    Type: Application
    Filed: October 31, 2007
    Publication date: September 11, 2008
    Applicant: GestureTek, Inc.
    Inventor: Atid Shamaie
  • Patent number: 7421093
    Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest, and are operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, and produce an absolute position of the object of interest from the relative positions of the object of interest and map the absolute position to a position indicator associated with the application program.
    Type: Grant
    Filed: December 19, 2005
    Date of Patent: September 2, 2008
    Assignee: GestureTek, Inc.
    Inventors: Evan Hildreth, Francis MacDougall
  • Publication number: 20080205701
    Abstract: Enhanced input using flashing electromagnetic radiation, in which first and second images, captured on a first side of a screen, of an object and an ambient electromagnetic radiation emitter disposed on a second side of the screen, are accessed. The first image is captured while the object is illuminated with projected electromagnetic radiation, and the second image is captured while the projected electromagnetic radiation is extinguished. A position of the object relative to the screen is determined based on comparing the first and second images. An application is controlled based on the determined position.
    Type: Application
    Filed: February 15, 2008
    Publication date: August 28, 2008
    Applicant: GestureTek, Inc.
    Inventors: Atid Shamaie, Francis MacDougall
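The flashing-illumination technique in this last abstract amounts to a frame difference: an ambient emitter appears in both the lit and unlit frames and cancels out, while the object illuminated by the projector survives the subtraction. The sketch below is a minimal illustration of that comparison; the threshold and the centroid-based position output are assumptions made for the example.

```python
import numpy as np

def object_position(lit_frame, unlit_frame, threshold=30):
    """Subtract the frame captured with the projected radiation extinguished
    from the frame captured with it on; ambient light sources cancel, and
    only the projector-lit object remains above the threshold."""
    diff = lit_frame.astype(int) - unlit_frame.astype(int)
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None                     # no object detected this frame pair
    return xs.mean(), ys.mean()         # centroid of the illuminated object
```

Synchronizing the camera to the flashing projector is what lets the system ignore arbitrary ambient emitters on the far side of the screen.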