Patents by Inventor Evan Hildreth
Evan Hildreth has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20090031240
Abstract: An enhanced control in which a guide line is defined relative to an object in a user interface, and items aligned with the guide line are displayed without obscuring the object. A selected item is output based on receiving a selection of one of the displayed items.
Type: Application
Filed: April 14, 2008
Publication date: January 29, 2009
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
-
Publication number: 20090027337
Abstract: Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object.
Type: Application
Filed: May 21, 2008
Publication date: January 29, 2009
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
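The abstract above describes mapping a detected object position within a detection region to a control such as a virtual keyboard key. A minimal sketch of that mapping step, assuming a normalized one-dimensional detection region and a hypothetical one-row key layout (neither of which comes from the patent itself):

```python
# Map a detected object position inside a detection region to a virtual key.
# The region bounds and the one-row key layout are illustrative assumptions.

KEYS = ["A", "B", "C", "D", "E"]  # hypothetical one-row virtual keyboard

def key_for_position(x, region_left, region_right, keys=KEYS):
    """Return the key under horizontal position x, or None if x is outside."""
    if not (region_left <= x < region_right):
        return None
    width = region_right - region_left
    index = int((x - region_left) / width * len(keys))
    return keys[index]

# A hand detected 70% of the way across the region selects the fourth key.
print(key_for_position(0.7, 0.0, 1.0))  # -> D
```

A real system would track the hand in two dimensions and debounce selections over several frames; this only shows the position-to-control mapping.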
-
Publication number: 20080273755
Abstract: A camera is used to detect a position and/or orientation of an object, such as a user's finger, as an approach for providing user input, for example to scroll through data, control a cursor position, and provide input to control a video game based on a position of a user's finger. Input may be provided to a handheld device, including, for example, cell phones, video game systems, portable music (MP3) players, portable video players, personal digital assistants (PDAs), audio/video equipment remote controls, consumer digital cameras, and other types of devices.
Type: Application
Filed: May 2, 2008
Publication date: November 6, 2008
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
-
Patent number: 7421093
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, produce an absolute position of the object of interest from the relative positions, and map the absolute position to a position indicator associated with the application program.
Type: Grant
Filed: December 19, 2005
Date of Patent: September 2, 2008
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
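The per-camera steps in this abstract (background data set, difference map, relative position) can be sketched as background subtraction followed by a centroid of the changed pixels. The threshold value and the naive averaging used to fuse the two views into an "absolute" position are simplifying assumptions, not the patented method:

```python
import numpy as np

# Sketch of the per-camera pipeline: compare each frame to a background
# data set, threshold the difference map, and take the centroid of the
# changed pixels as the object's relative position in that view.

def relative_position(frame, background, threshold=30):
    """Centroid (row, col) of pixels differing from the background, or None."""
    diff = np.abs(frame.astype(int) - background.astype(int))  # difference map
    ys, xs = np.nonzero(diff > threshold)
    if len(xs) == 0:
        return None
    return ys.mean(), xs.mean()

# Two cameras: simulate the object as a bright patch on a dark background,
# seen at slightly different positions in the two views.
bg = np.zeros((8, 8), dtype=np.uint8)
cam1 = bg.copy(); cam1[2:4, 2:4] = 255
cam2 = bg.copy(); cam2[2:4, 4:6] = 255

p1 = relative_position(cam1, bg)
p2 = relative_position(cam2, bg)
# Naively fuse the two relative positions into a single absolute position.
absolute = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
print(absolute)  # -> (2.5, 3.5)
```

A production tracker would calibrate the cameras and triangulate in world coordinates rather than averaging pixel centroids.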
-
Publication number: 20080166022
Abstract: The detection of motion of a user via a camera, and the generation of a dynamic virtual representation of the user on a display, where the user's detected motion causes the dynamic virtual representation to interact with virtual objects on the display. The magnitude and direction of the user's detected motion are calculated to determine the magnitude and direction of a force applied by the dynamic virtual representation to the virtual object. Further arrangements include water or smoke fluid simulations to enhance the user experience.
Type: Application
Filed: December 27, 2007
Publication date: July 10, 2008
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
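The core calculation this abstract names, turning the magnitude and direction of detected motion into the magnitude and direction of an applied force, can be sketched as a velocity vector scaled by a gain. The gain constant relating pixel velocity to force is an assumption for illustration:

```python
import math

# Sketch: derive the force a virtual representation applies to a virtual
# object from the user's detected motion between two frames. The gain
# constant relating pixel velocity to force magnitude is an assumption.

def motion_to_force(p_prev, p_curr, dt, gain=1.0):
    """Return (magnitude, direction_in_radians) of the applied force."""
    vx = (p_curr[0] - p_prev[0]) / dt  # horizontal velocity component
    vy = (p_curr[1] - p_prev[1]) / dt  # vertical velocity component
    magnitude = gain * math.hypot(vx, vy)
    direction = math.atan2(vy, vx)
    return magnitude, direction

# A hand moving 3 units right and 4 units up in one frame interval:
mag, ang = motion_to_force((0.0, 0.0), (3.0, 4.0), dt=1.0)
print(mag)  # -> 5.0
```

The resulting vector would then feed the display's physics (or fluid) simulation as an external force on the touched object.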
-
Publication number: 20080137913
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Application
Filed: October 31, 2007
Publication date: June 12, 2008
Applicant: GestureTek, Inc.
Inventor: Evan Hildreth
-
Patent number: 7379566
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Grant
Filed: January 6, 2006
Date of Patent: May 27, 2008
Assignee: GestureTek, Inc.
Inventor: Evan Hildreth
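The idea of inferring the camera's own motion from how stationary features shift between frames can be sketched with a brute-force global-translation search. A real system would use sparse feature matching or dense optical flow rather than this exhaustive sum-of-absolute-differences search, which is a simplification:

```python
import numpy as np

# Sketch: estimate a mobile camera's translation from two consecutive
# frames by finding the (dy, dx) shift that best re-aligns the second
# frame with the first (brute-force SAD search, a stand-in for optical flow).

def estimate_shift(prev, curr, max_shift=3):
    """Return the (dy, dx) minimizing the sum of absolute differences."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            err = np.abs(shifted.astype(int) - prev.astype(int)).sum()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Simulate a static scene and a camera that moved by (2, 1) pixels.
rng = np.random.default_rng(0)
scene = rng.integers(0, 255, size=(16, 16), dtype=np.uint8)
moved = np.roll(np.roll(scene, 2, axis=0), 1, axis=1)
print(estimate_shift(scene, moved))  # -> (2, 1)
```

The recovered shift (negated) describes the camera's apparent motion, which could then be mapped to a scroll or cursor input for the application.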
-
Publication number: 20080056536
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, produce an absolute position of the object of interest from the relative positions, and map the absolute position to a position indicator associated with the application program.
Type: Application
Filed: October 31, 2007
Publication date: March 6, 2008
Applicant: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20080030460
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Application
Filed: June 1, 2007
Publication date: February 7, 2008
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20080018595
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Application
Filed: August 17, 2007
Publication date: January 24, 2008
Applicant: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
-
Patent number: 7227526
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Grant
Filed: July 23, 2001
Date of Patent: June 5, 2007
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
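The "position information" step in a stereo setup like this one typically follows the standard pinhole relation Z = f · B / d, where d is the object's horizontal disparity between the two views. A minimal sketch, with the focal length and baseline values chosen purely for illustration:

```python
# Sketch: recover an object's depth from its pixel position in the left and
# right views of a stereo pair via Z = f * B / disparity. The focal length
# (in pixels) and camera baseline (in meters) below are assumed values.

def stereo_depth(x_left, x_right, focal_px=500.0, baseline_m=0.1):
    """Return depth in meters from the object's x-coordinates in each view."""
    disparity = x_left - x_right  # pixels; larger disparity = closer object
    if disparity <= 0:
        raise ValueError("object must have positive disparity")
    return focal_px * baseline_m / disparity

# An object seen at x=320 in the left view and x=300 in the right view:
print(stereo_depth(320.0, 300.0))  # -> 2.5 (meters)
```

Combining this depth with the object's image coordinates yields the 3-D position that would be communicated to the application.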
-
Publication number: 20060192782
Abstract: In one implementation, a first captured image is accessed. The first captured image includes (1) a first display produced at a first point in time, and (2) a user interacting with the first display and not part of the first display. A second captured image is accessed. The second captured image includes (1) a second display produced at a second point in time, and (2) the user interacting with the second display and not part of the second display. The first captured image and the second captured image are compared. The motion of the user is determined based on a result of the comparing of the first captured image and the second captured image. The determined motion of the user is related to a portion of one or more of the first and second captured images.
Type: Application
Filed: January 23, 2006
Publication date: August 31, 2006
Inventor: Evan Hildreth
-
Publication number: 20060177103
Abstract: A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
Type: Application
Filed: January 6, 2006
Publication date: August 10, 2006
Inventor: Evan Hildreth
-
Patent number: 7058204
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, produce an absolute position of the object of interest from the relative positions, and map the absolute position to a position indicator associated with the application program.
Type: Grant
Filed: September 26, 2001
Date of Patent: June 6, 2006
Assignee: GestureTek, Inc.
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20060098873
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, produce an absolute position of the object of interest from the relative positions, and map the absolute position to a position indicator associated with the application program.
Type: Application
Filed: December 19, 2005
Publication date: May 11, 2006
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20020064382
Abstract: A multiple camera tracking system for interfacing with an application program running on a computer is provided. The tracking system includes two or more video cameras arranged to provide different viewpoints of a region of interest and operable to produce a series of video images. A processor is operable to receive the series of video images and detect objects appearing in the region of interest. The processor executes a process to generate a background data set from the video images, generate an image data set for each received video image, compare each image data set to the background data set to produce a difference map for each image data set, detect a relative position of an object of interest within each difference map, produce an absolute position of the object of interest from the relative positions, and map the absolute position to a position indicator associated with the application program.
Type: Application
Filed: September 26, 2001
Publication date: May 30, 2002
Inventors: Evan Hildreth, Francis MacDougall
-
Publication number: 20020041327
Abstract: A method of using stereo vision to interface with a computer is provided. The method includes capturing a stereo image, and processing the stereo image to determine position information of an object in the stereo image. The object is controlled by a user. The method also includes communicating the position information to the computer to allow the user to interact with a computer application.
Type: Application
Filed: July 23, 2001
Publication date: April 11, 2002
Inventors: Evan Hildreth, Francis MacDougall