Patents by Inventor Thad Eugene Starner

Thad Eugene Starner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160161240
    Abstract: Methods and systems are described that involve a wearable computing device or an associated device determining the orientation of a person's head relative to their body. To do so, example methods and systems may compare sensor data from the wearable computing device to corresponding sensor data from a tracking device that is expected to move in a manner that follows the wearer's body, such as a mobile phone located in the wearer's pocket.
    Type: Application
    Filed: February 15, 2016
    Publication date: June 9, 2016
    Inventors: Thad Eugene Starner, Michael Patrick Johnson
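    As a sketch of the comparison this abstract describes, head orientation relative to the body can be estimated by differencing the yaw (compass heading) reported by the head-worn device against that of the body-tracking device. The function name and degree convention below are illustrative assumptions, not details from the application:

    ```python
    def head_relative_yaw(hmd_yaw_deg, phone_yaw_deg):
        """Estimate head-relative-to-body yaw by differencing the headings
        of a head-worn device and a body-following device (e.g., a phone
        in the wearer's pocket). Result is wrapped to (-180, 180] degrees."""
        diff = (hmd_yaw_deg - phone_yaw_deg) % 360.0
        if diff > 180.0:
            diff -= 360.0
        return diff
    ```

    Wrapping the difference avoids a discontinuity when the two headings straddle north (e.g., 350° vs. 10° yields -20°, not 340°).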
  • Patent number: 9354445
    Abstract: Example methods and systems determine a position of a portion of a human eye based on electromagnetic radiation reflected from the surface of the human eye. A sensor associated with a computing device can be calibrated in response to an event. The computing device can receive data indicative of electromagnetic radiation reflected from a human eye. The computing device can determine a position of a portion of the human eye based on the received data indicative of electromagnetic radiation. The computing device can generate an indication including the position of the portion of the human eye. The computing device can transmit the indication from the computing device. In some embodiments, the data indicative of electromagnetic radiation can be provided by electromagnetic emitter/sensors mounted on a wearable computing device directed toward a human eye of a wearer of the wearable computing device.
    Type: Grant
    Filed: August 31, 2012
    Date of Patent: May 31, 2016
    Assignee: Google Inc.
    Inventors: Joshua Weaver, Thad Eugene Starner, Cliff L. Biffle, Edward Allen Keyes
  • Publication number: 20160110883
    Abstract: Exemplary embodiments may involve analyzing reflections from an eye to help determine where the respective sources of the reflections are located. An exemplary method involves: (a) analyzing eye-image data to determine observed movement of a reflected feature on an eye surface; (b) determining an expected movement of the reflected feature on the eye surface given a value of a z-distance parameter; (c) determining a difference between the observed movement of the reflected feature on the eye surface and the expected movement of the reflected feature on the eye surface; (d) if the difference is less than a threshold, then associating the value of the z-distance parameter with a source of the reflected feature; and (e) if the difference is greater than the threshold, then: (i) making a predetermined adjustment to the value of the z-distance parameter; and (ii) repeating (a) to (d) with the adjusted value of the z-distance parameter.
    Type: Application
    Filed: December 29, 2015
    Publication date: April 21, 2016
    Inventor: Thad Eugene Starner
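    Steps (a) through (e) of this abstract describe a simple iterative search over the z-distance parameter. A minimal sketch, in which the step size, threshold, and iteration cap are illustrative assumptions rather than values from the application:

    ```python
    def estimate_z_distance(observed_motion, expected_motion_fn,
                            z_init=1.0, z_step=0.1, threshold=0.01,
                            max_iters=100):
        """Adjust the z-distance parameter until the expected movement of a
        reflected feature matches the observed movement within a threshold.
        expected_motion_fn maps a candidate z-distance to an expected motion."""
        z = z_init
        for _ in range(max_iters):
            diff = abs(observed_motion - expected_motion_fn(z))
            if diff < threshold:
                return z       # step (d): associate this z with the source
            z += z_step        # step (e)(i): predetermined adjustment
        return None            # no value matched within the iteration budget
    ```

    With a toy model in which expected motion falls off as 1/z, an observed motion of 0.5 converges to a z-distance of about 2.0.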
  • Patent number: 9298256
    Abstract: Methods and devices for initiating, updating, and displaying the results of a search of an object-model database are disclosed. In one embodiment, a method is disclosed that includes receiving video data recorded by a camera on a wearable computing device and, based on the video data, detecting a movement corresponding to a selection of an object. The method further includes, before the movement is complete, initiating a search on the object of an object-model database. The method still further includes, during the movement, periodically updating the search and causing the wearable computing device to overlay the object with object-models from the database corresponding to results of the search.
    Type: Grant
    Filed: January 22, 2015
    Date of Patent: March 29, 2016
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Irfan Essa
  • Publication number: 20160086383
    Abstract: Methods and devices for initiating a search of an object are disclosed. In one embodiment, a method is disclosed that includes receiving sensor data from a sensor on a wearable computing device and, based on the sensor data, detecting a movement that defines an outline of an area in the sensor data. The method further includes identifying an object that is located in the area and initiating a search on the object. In another embodiment, a server is disclosed that includes an interface configured to receive sensor data from a sensor on a wearable computing device, at least one processor, and data storage comprising instructions executable by the at least one processor to detect, based on the sensor data, a movement that defines an outline of an area in the sensor data, identify an object that is located in the area, and initiate a search on the object.
    Type: Application
    Filed: December 4, 2015
    Publication date: March 24, 2016
    Inventors: Thad Eugene Starner, Irfan Essa
  • Patent number: 9292082
    Abstract: Example methods and systems for text entry are disclosed. A method may involve displaying a user interface comprising a character line, the character line comprising a plurality of characters in a substantially linear arrangement. The method may then involve receiving a first input via a touch-sensitive input device, wherein the first input comprises a detected movement on a first portion of the touch-sensitive input device. Further, the method may involve, in response to receiving the first input, displaying an indication of a selected character. The method may then involve receiving a second input via the touch-sensitive input device, wherein the second input comprises a detected movement from the first portion to a second portion of the touch-sensitive input device and back to the first portion. The method may then involve displaying the selected character in a text-entry area of the user interface that is separate from the character line.
    Type: Grant
    Filed: November 8, 2011
    Date of Patent: March 22, 2016
    Assignee: Google Inc.
    Inventors: Nirmal Patel, Thad Eugene Starner
  • Patent number: 9277334
    Abstract: A wearable computing device is authenticated using bone conduction. When a user wears the device, a bone conduction speaker and a bone conduction microphone on the device contact the user's head at positions proximate the user's skull. A calibration process is performed by transmitting a signal from the speaker through the skull and receiving a calibration signal at the microphone. An authentication process is subsequently performed by transmitting another signal from the speaker through the skull and an authentication signal is received at the microphone. In the event that frequency response characteristics of the authentication signal match the frequency response characteristics of the calibration signal, the user is authenticated and the device is enabled for user interaction without requiring the user to input any additional data.
    Type: Grant
    Filed: March 21, 2012
    Date of Patent: March 1, 2016
    Assignee: Google Inc.
    Inventors: Adrian Wong, Thad Eugene Starner, Joshua Weaver
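    The frequency-response match between the calibration signal and the authentication signal could be sketched as follows. The naive DFT, the cosine-similarity metric, and the tolerance value are illustrative assumptions, not details from the patent:

    ```python
    import cmath
    import math

    def magnitude_spectrum(signal):
        """Naive DFT magnitude spectrum (O(n^2); fine for a short sketch)."""
        n = len(signal)
        return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n)))
                for k in range(n // 2)]

    def responses_match(calibration, attempt, tolerance=0.95):
        """Authenticate when the two magnitude spectra are sufficiently
        similar, here measured by cosine similarity against a tolerance."""
        a = magnitude_spectrum(calibration)
        b = magnitude_spectrum(attempt)
        dot = sum(x * y for x, y in zip(a, b))
        norm = (math.sqrt(sum(x * x for x in a))
                * math.sqrt(sum(x * x for x in b)))
        return dot / norm >= tolerance
    ```

    Because every skull conducts sound differently, a signal recorded through a different wearer's head concentrates energy in different frequency bins and fails the similarity test.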
  • Patent number: 9274599
    Abstract: The present disclosure provides a computer-implemented method. The method can include receiving eye-scan data from one or more linear sensors while operating in a locked mode, wherein the eye-scan data corresponds to an eye. The method can also include determining that the eye-scan data matches predetermined authorization data. Responsive to determining that the eye-scan data matches the predetermined authorization data, the method can also include causing the computing device to switch from operating in the locked mode to operating in an unlocked mode.
    Type: Grant
    Filed: February 11, 2013
    Date of Patent: March 1, 2016
    Assignee: Google Inc.
    Inventors: Sam D'Amico, Thad Eugene Starner
  • Patent number: 9268024
    Abstract: Exemplary embodiments may involve analyzing reflections from an eye to help determine where the respective sources of the reflections are located.
    Type: Grant
    Filed: September 30, 2012
    Date of Patent: February 23, 2016
    Assignee: Google Inc.
    Inventor: Thad Eugene Starner
  • Patent number: 9268136
    Abstract: Methods and systems are described that involve a head-mountable display (HMD) or an associated device determining the orientation of a person's head relative to their body. To do so, example methods and systems may compare sensor data from the HMD to corresponding sensor data from a tracking device that is expected to move in a manner that follows the wearer's body, such as a mobile phone located in the HMD wearer's pocket.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: February 23, 2016
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Michael Patrick Johnson
  • Patent number: 9265415
    Abstract: Methods and systems that are described herein may help to dynamically utilize multiple eye-tracking techniques to more accurately determine eye position and/or eye movement. An exemplary system may be configured to: (a) perform at least a first and a second eye-tracking process; (b) determine a reliability indication for at least one of the eye-tracking processes; (c) determine a respective weight for each of the eye-tracking processes based at least in part on the reliability indication; (d) determine a combined eye position based on a weighted combination of eye-position data from the two or more eye-tracking processes, wherein the eye-position data from each eye-tracking process is weighted by the respectively determined weight for the eye-tracking process; and (e) carry out functions based on the combined eye position.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: February 23, 2016
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Bo Wu, Yong Zhao
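    Step (d) of this abstract, the weighted combination of eye-position estimates, might look like the following minimal sketch; the data layout (a list of position/weight pairs) is an assumption for illustration:

    ```python
    def combined_eye_position(estimates):
        """Combine eye-position estimates from multiple tracking processes.

        estimates: list of ((x, y), weight) pairs, one per eye-tracking
        process, where each weight reflects that process's reliability.
        Returns the weight-normalized average position."""
        total = sum(w for _, w in estimates)
        x = sum(px * w for (px, _), w in estimates) / total
        y = sum(py * w for (_, py), w in estimates) / total
        return (x, y)
    ```

    A process judged three times as reliable pulls the combined estimate three times as strongly toward its own reading.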
  • Publication number: 20160011724
    Abstract: Methods and devices for providing a user interface are disclosed. In one embodiment, the method comprises receiving data corresponding to a first position of a wearable computing device and responsively causing the wearable computing device to provide a user interface. The user interface comprises a view region and a menu, where the view region substantially fills a field of view of the wearable computing device and the menu is not fully visible in the view region. The method further comprises receiving data indicating a selection of an item present in the view region and causing an indicator to be displayed in the view region, wherein the indicator changes incrementally over a length of time. When the length of time has passed, the method comprises responsively causing the wearable computing device to select the item.
    Type: Application
    Filed: March 2, 2012
    Publication date: January 14, 2016
    Applicant: Google Inc.
    Inventors: Aaron Joseph Wheeler, Sergey Brin, Thad Eugene Starner, Alejandro Kauffmann, Cliff L. Biffle, Liang-Yu (Tom) Chi, Steve Lee, Sebastian Thrun, Luis Ricardo Prada Gomez
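    The incrementally changing indicator described in this abstract is essentially a dwell timer. A minimal sketch, assuming a per-frame update loop and an illustrative 1.5-second dwell (neither is specified by the publication):

    ```python
    class DwellSelector:
        """Fills an on-screen indicator while an item stays selected in the
        view region; the item is committed once the dwell time elapses."""

        def __init__(self, dwell_seconds=1.5):
            self.dwell = dwell_seconds
            self.elapsed = 0.0

        def update(self, item_present, dt):
            """Advance by dt seconds. Returns indicator progress in [0, 1];
            1.0 means the item has been selected."""
            if item_present:
                self.elapsed = min(self.elapsed + dt, self.dwell)
            else:
                self.elapsed = 0.0   # indicator resets if focus is lost
            return self.elapsed / self.dwell
    ```

    Resetting on focus loss prevents accidental selections from brief glances that merely pass over an item.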
  • Publication number: 20160011663
    Abstract: Embodiments may involve a computing device with a mechanical interface, such as a mechanical button or slider. The mechanical interface can be configured to generate, when actuated, vibration and/or acoustic signals having a characteristic pattern. The computing device can detect actuation of the mechanical interface by: receiving acoustic signal data generated by an acoustic sensing unit of the computing device; receiving vibration signal data generated by a vibration sensing unit of the computing device; and determining, based on a comparison of the acoustic and vibration signal data with the characteristic acoustic and vibration patterns, that the mechanical interface has been actuated.
    Type: Application
    Filed: October 23, 2012
    Publication date: January 14, 2016
    Applicant: Google Inc.
    Inventors: Thad Eugene Starner, Michael Patrick Johnson
  • Patent number: 9230171
    Abstract: Methods and devices for initiating a search of an object are disclosed. In one embodiment, a method is disclosed that includes receiving video data from a camera on a wearable computing device and, based on the video data, detecting a movement that defines an outline of an area in the video data. The method further includes identifying an object that is located in the area and initiating a search on the object. In another embodiment, a server is disclosed that includes an interface configured to receive video data from a camera on a wearable computing device, at least one processor, and data storage comprising instructions executable by the at least one processor to detect, based on the video data, a movement that defines an outline of an area in the video data, identify an object that is located in the area, and initiate a search on the object.
    Type: Grant
    Filed: February 20, 2012
    Date of Patent: January 5, 2016
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Irfan Essa
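    One plausible reduction of the outline-detection step is to take the bounding box of the points traced by the detected movement; this is a simplification for illustration, not necessarily the region fitting the patent claims:

    ```python
    def outline_bounding_box(points):
        """Bounding box (x_min, y_min, x_max, y_max) of an outline traced
        in the video frame, usable to crop the object for a search query."""
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        return (min(xs), min(ys), max(xs), max(ys))
    ```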
  • Patent number: 9213185
    Abstract: A wearable computing system may include a head-mounted display (HMD) with a display configured to display images viewable at a viewing location. When aligned with an HMD wearer's line of sight, the entire display area of the display may be within the HMD wearer's field of view. The area within which an HMD wearer's eye can move and still view the entire display area is termed an “eye box.” However, if the HMD slips up or down, the display area may become obscured, such that the wearer can no longer see the entire image. By scaling or subsetting an image area within the display area, the effective eye box dimensions may increase. Further, in response to movements of the HMD with respect to the wearer, the image area can be adjusted to reduce effects such as vibration and slippage.
    Type: Grant
    Filed: May 8, 2012
    Date of Patent: December 15, 2015
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Adrian Wong, Yong Zhao, Chia-Jean Wang, Anurag Gupta, Liang-Yu (Tom) Chi
  • Patent number: 9207760
    Abstract: This disclosure involves proximity sensing of eye gestures using a machine-learned model. An illustrative method comprises receiving training data that includes proximity-sensor data. The data is generated by at least one proximity sensor of a head-mountable device (HMD). The data is indicative of light received by the proximity sensor(s). The light is received by the proximity sensor(s) after a reflection of the light from an eye area. The reflection occurs while an eye gesture is being performed at the eye area. The light is generated by at least one light source of the HMD. The method further comprises applying a machine-learning process to the training data to generate at least one classifier for the eye gesture. The method further comprises generating an eye-gesture model that includes the at least one classifier for the eye gesture. The model is applicable to subsequent proximity-sensor data for detection of the eye gesture.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: December 8, 2015
    Assignee: Google Inc.
    Inventors: Bo Wu, Yong Zhao, Hartmut Neven, Hayes Solos Raffle, Thad Eugene Starner
  • Patent number: 9202280
    Abstract: Methods and systems are described for determining eye position and/or for determining eye movement based on glints. An exemplary computer-implemented method involves: (a) causing a camera that is attached to a head-mounted display (HMD) to record a video of the eye; (b) while the video of the eye is being recorded, causing a plurality of light sources that are attached to the HMD and generally directed towards the eye to switch on and off according to a predetermined pattern, wherein the predetermined pattern is such that at least two of the light sources are switched on at any given time while the video of the eye is being recorded; (c) analyzing the video of the eye to detect controlled glints that correspond to the plurality of light sources; and (d) determining a measure of eye position based on the controlled glints.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: December 1, 2015
    Assignee: Google Inc.
    Inventors: Bo Wu, Thad Eugene Starner, Hayes Solos Raffle, Yong Zhao, Edward Allen Keyes
  • Patent number: 9197864
    Abstract: Methods and systems for intelligently zooming to and capturing a first image of a feature of interest are provided. The feature of interest may be determined based on a first interest criterion. The captured image may be provided to a user, who may indicate a level of interest in the feature of interest. The level of interest may be used to determine whether to store the captured image and whether to capture another image. The level of interest may be a gradient value or a binary value. The level of interest may also be used to determine whether to store the captured image and, if so, a resolution at which the captured image is to be stored. The level of interest may further be used to determine whether to zoom to and capture a second image of a second feature of interest based on the first interest criterion or a second interest criterion.
    Type: Grant
    Filed: September 14, 2012
    Date of Patent: November 24, 2015
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Joshua Weaver
  • Publication number: 20150310762
    Abstract: Disclosed herein are methods, systems, computer readable media, and apparatuses for conveying chorded input to a user. Chorded input can be conveyed by one or more sequences of stimulation events, wherein each sequence represents a particular chorded input.
    Type: Application
    Filed: April 29, 2015
    Publication date: October 29, 2015
    Inventors: Caitlyn Seim, Thad Eugene Starner
  • Patent number: 9164588
    Abstract: Methods, apparatus, and computer-readable media are described herein related to recognizing a look up gesture. Level-indication data from at least an accelerometer associated with a wearable computing device (WCD) can be received. The WCD can be worn by a wearer. The WCD can determine whether a head of the wearer is level based on the level-indication data. In response to determining that the head of the wearer is level, the WCD can receive lookup-indication data from at least the accelerometer. The WCD can determine whether the head of the wearer is tilted up based on the lookup-indication data. In response to determining that the head of the wearer is tilted up, the WCD can generate a gesture-recognition trigger, where the gesture-recognition trigger indicates that the head of the wearer has moved up from level.
    Type: Grant
    Filed: February 5, 2013
    Date of Patent: October 20, 2015
    Assignee: Google Inc.
    Inventors: Michael Patrick Johnson, Mat Balez, David Sparks, Thad Eugene Starner
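    The level-then-tilt-up sequence in this abstract might be sketched by deriving head pitch from the accelerometer's gravity vector. The axis convention and the angle thresholds below are illustrative assumptions, not values from the patent:

    ```python
    import math

    def pitch_degrees(ax, ay, az):
        """Head pitch from the gravity components of an accelerometer
        reading (axis convention assumed: x forward, z down toward gravity
        when level; positive pitch means tilted up)."""
        return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

    def lookup_trigger(samples, level_tol=10.0, lookup_min=20.0):
        """Fire the gesture-recognition trigger when a level head
        (|pitch| < level_tol) is later followed by an upward tilt
        (pitch > lookup_min), mirroring the two-stage check above."""
        was_level = False
        for ax, ay, az in samples:
            p = pitch_degrees(ax, ay, az)
            if abs(p) < level_tol:
                was_level = True
            elif was_level and p > lookup_min:
                return True
        return False
    ```

    Requiring the level pose first is what distinguishes a deliberate look-up gesture from a head that was simply already tilted when sampling began.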