Patents by Inventor Hayes Solos Raffle

Hayes Solos Raffle has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160077337
    Abstract: An example method includes receiving, by a head-mountable device (HMD), data corresponding to an information event, and providing an indication corresponding to the information event in response to receiving the data. The method further includes determining a gaze direction of an eye and determining that the gaze direction of the eye is an upward direction that corresponds to a location of a display of the HMD. The display is located in an upper periphery of a forward-looking field of view of the eye when the HMD is worn. The method further includes, in response to determining that the gaze direction of the eye is the upward direction, displaying graphical content related to the information event in the display.
    Type: Application
    Filed: September 15, 2014
    Publication date: March 17, 2016
    Inventors: Hayes Solos Raffle, Michael Patrick Johnson, Alok Chandel, Chun Yat Frank Li
  • Patent number: 9285872
    Abstract: Embodiments described herein may help to provide a wake-up mechanism for a computing device. An example method involves the computing device: (a) receiving head-movement data that is indicative of head movement; (b) detecting at least a portion of the head-movement data that is indicative of a head gesture; (c) receiving eye-position data that is indicative of eye position; (d) detecting at least a portion of the eye-position data that is indicative of an eye being directed towards a display of a head-mounted device (HMD); and (e) causing the HMD to switch from a first operating mode to a second operating mode in response to the detection of both: (i) the eye-position data that is indicative of an eye directed towards the display, and (ii) the head-movement data indicative of the head gesture.
    Type: Grant
    Filed: December 12, 2013
    Date of Patent: March 15, 2016
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Chun Yat Frank Li
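    The entry above describes gating a mode switch on two simultaneous signals: a head gesture and an eye directed at the display. The sketch below is a minimal illustration of that idea only; the function names, data shapes, and thresholds are assumptions, not the patented implementation.

    ```python
    # Illustrative sketch only: switch operating modes when BOTH a head gesture
    # and eye-position data indicating the eye is on the HMD display are detected.
    # All names, thresholds, and data shapes are assumptions.

    from dataclasses import dataclass

    @dataclass
    class HmdState:
        mode: str = "standby"   # first operating mode

    def head_gesture_detected(head_movement: list[float], threshold: float = 2.0) -> bool:
        """Assume head-movement data is a list of angular-velocity samples (rad/s)."""
        return any(abs(sample) > threshold for sample in head_movement)

    def eye_on_display(eye_position: tuple[float, float],
                       display_region=((0.3, 0.7), (0.6, 1.0))) -> bool:
        """Assume normalized (x, y) eye position; the display sits in a fixed region."""
        (x_lo, x_hi), (y_lo, y_hi) = display_region
        x, y = eye_position
        return x_lo <= x <= x_hi and y_lo <= y <= y_hi

    def maybe_wake(state: HmdState, head_movement, eye_position) -> HmdState:
        # Switch from the first operating mode to the second only if both
        # conditions hold, mirroring the two-part trigger in the abstract.
        if head_gesture_detected(head_movement) and eye_on_display(eye_position):
            state.mode = "active"   # second operating mode
        return state

    if __name__ == "__main__":
        state = maybe_wake(HmdState(), head_movement=[0.1, 2.5, 0.3], eye_position=(0.5, 0.8))
        print(state.mode)  # "active" when both signals are present
    ```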
  • Publication number: 20160057339
    Abstract: This disclosure relates to winking to capture image data using an image capture device that is associated with a head-mountable device (HMD). An illustrative method includes detecting a wink gesture at an HMD. The method also includes causing an image capture device to capture image data, in response to detecting the wink gesture at the HMD.
    Type: Application
    Filed: October 19, 2015
    Publication date: February 25, 2016
    Inventors: Hayes Solos Raffle, Sergey Brin, Bo Wu, Michael Patrick Johnson, David Sparks
  • Patent number: 9261700
    Abstract: Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to one or more second input actions on the second input interface, and in response, cause the camera to perform a second operation.
    Type: Grant
    Filed: November 20, 2013
    Date of Patent: February 16, 2016
    Assignee: Google Inc.
    Inventors: Chun Yat Frank Li, Hayes Solos Raffle
  • Patent number: 9241209
    Abstract: A wearable audio component includes a first cable and an audio source in electrical communication with the first cable. A housing defines an interior and an exterior, the audio source being contained within the interior thereof. The exterior includes an ear-engaging front surface, an outer surface, and a peripheral surface extending between the front and outer surfaces. The peripheral surface includes a channel open along a length to surrounding portions of the peripheral surface and having a depth to extend partially between the front and outer surfaces. A portion of the channel is covered by a bridge member that defines an aperture between and open to adjacent portions of the channel. The cable is connected with the housing at a first location disposed within the channel remote from the bridge member and is captured therein so as to extend through the aperture in a slidable engagement therewith.
    Type: Grant
    Filed: December 30, 2013
    Date of Patent: January 19, 2016
    Assignee: Google Inc.
    Inventors: Haley Toelle, Jianchun Dong, Michael Kai Morishita, Eliot Kim, Hayes Solos Raffle, Livius Dumitru Chebeleu
  • Patent number: 9223401
    Abstract: Systems and methods for navigation and selection with eye gestures in a graphical display are provided, in which a graphical display of content may be provided to a wearer of a head-mountable device. A first signal may be received from a first sensor configured to detect an eye gesture of a first eye of the wearer. A second signal may then be received from a second sensor configured to detect an eye gesture of a second eye of the wearer. The first signal may be associated with a first command to navigate through or select an item in the graphical display of content provided by the head-mountable device, and the second signal may be associated with a subsequent command, based at least in part on a given output of the first command, to navigate through or select another item in the graphical display of content.
    Type: Grant
    Filed: February 25, 2015
    Date of Patent: December 29, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Nirmal Patel
  • Patent number: 9223451
    Abstract: Embodiments described herein may allow for the use of active capacitive sensing on a head-mountable device. An example method may involve: sending a first signal that has a first frequency from a signal transmitter positioned on a wearable computing device, so that when the wearable computing device is worn, the signal transmitter couples to a part of a wearer of the wearable computing device; receiving a second signal at a capacitive sensor located on the wearable computing device; determining whether the second signal has the first frequency; if the second signal has the first frequency, outputting a third signal that is indicative of manual input on the capacitive sensor; and if the second signal does not have the first frequency, refraining from outputting the third signal.
    Type: Grant
    Filed: October 25, 2013
    Date of Patent: December 29, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Timothy John Prachar
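    The abstract above amounts to a frequency-matching gate: a touch is reported only if the received signal carries the frequency the device itself transmitted. The sketch below illustrates that gate under stated assumptions; the zero-crossing frequency estimate and the tolerance are not from the patent.

    ```python
    # Illustrative sketch only: treat a capacitive-sensor signal as manual input
    # only when it contains the frequency the device's own transmitter sent.
    # The zero-crossing frequency estimate and the tolerance are assumptions.

    def dominant_frequency(samples: list[float], sample_rate_hz: float) -> float:
        """Crude frequency estimate from zero crossings of the sampled signal."""
        crossings = sum(
            1 for a, b in zip(samples, samples[1:]) if (a < 0.0) != (b < 0.0)
        )
        duration_s = len(samples) / sample_rate_hz
        return crossings / (2.0 * duration_s) if duration_s > 0 else 0.0

    def manual_input_signal(received: list[float],
                            transmitted_hz: float,
                            sample_rate_hz: float,
                            tolerance_hz: float = 5.0):
        """Return an output ("third") signal only if the received ("second")
        signal matches the transmitted ("first") frequency; otherwise refrain."""
        if abs(dominant_frequency(received, sample_rate_hz) - transmitted_hz) <= tolerance_hz:
            return {"event": "manual_input"}
        return None  # refrain from outputting the third signal

    if __name__ == "__main__":
        import math
        rate, f_tx = 1000.0, 100.0
        touch = [math.sin(2 * math.pi * f_tx * n / rate) for n in range(1000)]
        noise = [math.sin(2 * math.pi * 37.0 * n / rate) for n in range(1000)]
        print(manual_input_signal(touch, f_tx, rate))   # {'event': 'manual_input'}
        print(manual_input_signal(noise, f_tx, rate))   # None
    ```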
  • Patent number: 9213403
    Abstract: Methods, apparatus, and computer-readable media are described herein related to displaying and cropping viewable objects. A viewable object can be displayed on a display of a head-mountable device (HMD) configured with a hand-movement input device. The HMD can receive both head-movement data corresponding to head movements and hand-movement data from the hand-movement input device. The viewable object can be panned on the display based on the head-movement data. The viewable object can be zoomed on the display based on the hand-movement data. The HMD can receive an indication that navigation of the viewable object is complete. The HMD can determine whether a cropping mode is activated. After determining that the cropping mode is activated, the HMD can generate a cropped image of the viewable object on the display when navigation is complete and perform an operation on the cropped image.
    Type: Grant
    Filed: March 27, 2013
    Date of Patent: December 15, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Nirmal Patel, Max Benjamin Braun
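    The abstract above splits navigation between two input channels: head movement pans the viewable object, hand movement zooms it, and a crop is produced once navigation completes. The viewport arithmetic below is a rough sketch of that split; the data shapes and limits are assumptions.

    ```python
    # Illustrative sketch only: pan with head-movement data, zoom with
    # hand-movement data, then crop the viewport when navigation completes
    # and cropping mode is active. Data shapes and limits are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Viewport:
        center_x: float = 0.5   # normalized position within the viewable object
        center_y: float = 0.5
        zoom: float = 1.0

    def pan(view: Viewport, head_dx: float, head_dy: float) -> None:
        """Head-movement data shifts the viewport; higher zoom pans less per unit."""
        view.center_x = min(1.0, max(0.0, view.center_x + head_dx / view.zoom))
        view.center_y = min(1.0, max(0.0, view.center_y + head_dy / view.zoom))

    def zoom(view: Viewport, hand_scroll: float) -> None:
        """Hand-movement (e.g. swipe) data scales the zoom factor."""
        view.zoom = min(8.0, max(1.0, view.zoom * (1.0 + hand_scroll)))

    def finish_navigation(view: Viewport, cropping_mode: bool):
        """On completion, return a crop rectangle of the viewport if cropping is on."""
        if not cropping_mode:
            return None
        half = 0.5 / view.zoom
        return (view.center_x - half, view.center_y - half,
                view.center_x + half, view.center_y + half)

    if __name__ == "__main__":
        v = Viewport()
        zoom(v, 1.0)          # hand swipe doubles the zoom
        pan(v, 0.2, -0.1)     # head motion pans the zoomed view
        print(finish_navigation(v, cropping_mode=True))
    ```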
  • Patent number: 9207760
    Abstract: This disclosure involves proximity sensing of eye gestures using a machine-learned model. An illustrative method comprises receiving training data that includes proximity-sensor data. The data is generated by at least one proximity sensor of a head-mountable device (HMD). The data is indicative of light received by the proximity sensor(s). The light is received by the proximity sensor(s) after a reflection of the light from an eye area. The reflection occurs while an eye gesture is being performed at the eye area. The light is generated by at least one light source of the HMD. The method further comprises applying a machine-learning process to the training data to generate at least one classifier for the eye gesture. The method further comprises generating an eye-gesture model that includes the at least one classifier for the eye gesture. The model is applicable to subsequent proximity-sensor data for detection of the eye gesture.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: December 8, 2015
    Assignee: Google Inc.
    Inventors: Bo Wu, Yong Zhao, Hartmut Neven, Hayes Solos Raffle, Thad Eugene Starner
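    The abstract above describes a training flow: labeled proximity-sensor traces are turned into a classifier, which is then applied to later traces. The toy sketch below illustrates that flow only; the nearest-centroid classifier and the features are assumptions standing in for whatever machine-learning process the patent covers.

    ```python
    # Illustrative sketch only: learn a per-gesture classifier from labeled
    # proximity-sensor traces, then apply the model to new traces.
    # The nearest-centroid scheme and the features are assumptions.

    from statistics import mean

    def features(trace: list[float]) -> tuple[float, float]:
        """Simple features of one proximity-sensor trace: mean level and range."""
        return (mean(trace), max(trace) - min(trace))

    def train(training_data: dict[str, list[list[float]]]) -> dict[str, tuple[float, float]]:
        """Build one centroid ("classifier") per gesture label from its example traces."""
        model = {}
        for label, traces in training_data.items():
            feats = [features(t) for t in traces]
            model[label] = (mean(f[0] for f in feats), mean(f[1] for f in feats))
        return model

    def classify(model: dict[str, tuple[float, float]], trace: list[float]) -> str:
        """Apply the eye-gesture model to subsequent proximity-sensor data."""
        fx, fy = features(trace)
        return min(model, key=lambda label: (model[label][0] - fx) ** 2 +
                                            (model[label][1] - fy) ** 2)

    if __name__ == "__main__":
        data = {
            "wink":  [[0.2, 0.9, 0.2], [0.1, 0.8, 0.3]],   # brief strong reflection
            "blink": [[0.2, 0.4, 0.2], [0.3, 0.5, 0.2]],   # weaker reflection change
        }
        model = train(data)
        print(classify(model, [0.2, 0.85, 0.25]))  # likely "wink"
    ```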
  • Patent number: 9202280
    Abstract: Methods and systems are described for determining eye position and/or for determining eye movement based on glints. An exemplary computer-implemented method involves: (a) causing a camera that is attached to a head-mounted display (HMD) to record a video of the eye; (b) while the video of the eye is being recorded, causing a plurality of light sources that are attached to the HMD and generally directed towards the eye to switch on and off according to a predetermined pattern, wherein the predetermined pattern is such that at least two of the light sources are switched on at any given time while the video of the eye is being recorded; (c) analyzing the video of the eye to detect controlled glints that correspond to the plurality of light sources; and (d) determining a measure of eye position based on the controlled glints.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: December 1, 2015
    Assignee: Google Inc.
    Inventors: Bo Wu, Thad Eugene Starner, Hayes Solos Raffle, Yong Zhao, Edward Allen Keyes
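    The abstract above relies on "controlled glints": light sources switch on and off in a predetermined pattern, with at least two lit per frame, so glints found in a frame can be associated with the sources that were on. The sketch below illustrates one such pattern and a crude position estimate; the pairing scheme and centroid measure are assumptions.

    ```python
    # Illustrative sketch only: a predetermined on/off pattern for the HMD's
    # light sources (at least two on at any time) and a per-frame association
    # of detected glints with the sources that were lit. Details are assumptions.

    from itertools import combinations, cycle

    def led_pattern(num_sources: int):
        """Cycle through every pair of light sources, so at least 2 are on per frame."""
        return cycle(list(combinations(range(num_sources), 2)))

    def controlled_glints(frame_glints, lit_sources):
        """Keep only glints for the sources known to be on in this frame.
        frame_glints maps a detected glint's source id -> (x, y) image position."""
        return {src: frame_glints[src] for src in lit_sources if src in frame_glints}

    def eye_position_estimate(glints):
        """Very rough eye-position measure: centroid of the controlled glints."""
        if not glints:
            return None
        xs, ys = zip(*glints.values())
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    if __name__ == "__main__":
        pattern = led_pattern(4)
        lit = next(pattern)                       # e.g. sources (0, 1) are on
        detected = {0: (10.0, 12.0), 1: (14.0, 11.0), 3: (30.0, 2.0)}  # 3 is a stray
        glints = controlled_glints(detected, lit)
        print(lit, eye_position_estimate(glints)) # (0, 1) (12.0, 11.5)
    ```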
  • Patent number: 9201512
    Abstract: Disclosed is a technique that can help to detect a blink of an eye and a direction along which the eye is oriented before, after, or during the blink. To this end, light data can be received from at least one light sensor. The light data indicates at least one characteristic of light reflected from an eye area. A blink event can be detected based on the light data. A gaze direction can be determined based on the blink event. At least one computing action can be performed based on the gaze direction.
    Type: Grant
    Filed: July 16, 2012
    Date of Patent: December 1, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Yong Zhao
  • Patent number: 9204397
    Abstract: Embodiments of the disclosure describe an on-head detection technique for an HMD that includes an optical sensor positioned to detect light reflected from at least one of a face of a user or a lens worn by the user. A flexible frame assembly supports an image source and further supports the optical sensor relative to the face or the lens when worn by the user. The flexible frame assembly flexes such that the optical sensor moves closer to at least one of the face or the lens when the HMD is worn by the user. Embodiments determine whether the user is wearing the HMD based on optical sensor data output from the optical sensor.
    Type: Grant
    Filed: January 5, 2015
    Date of Patent: December 1, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Matthew Wyatt Martin
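    The abstract above decides whether the HMD is being worn from light reflected off the face or lens once the flexible frame brings the optical sensor close to them. The sketch below is a minimal thresholding illustration of that decision; the threshold, window size, and vote fraction are assumptions.

    ```python
    # Illustrative sketch only: decide whether the HMD is on the user's head from
    # optical-sensor readings of light reflected off the face or lens.
    # The threshold, window size, and vote fraction are assumptions.

    from collections import deque

    class OnHeadDetector:
        def __init__(self, threshold: float = 0.6, window: int = 10, vote: float = 0.7):
            self.threshold = threshold          # reflection level implying proximity
            self.samples = deque(maxlen=window) # recent optical-sensor readings
            self.vote = vote                    # fraction of samples that must agree

        def update(self, reflection: float) -> bool:
            """Feed one normalized sensor reading; return True if the HMD looks worn."""
            self.samples.append(reflection)
            above = sum(1 for s in self.samples if s >= self.threshold)
            return (len(self.samples) == self.samples.maxlen
                    and above / len(self.samples) >= self.vote)

    if __name__ == "__main__":
        detector = OnHeadDetector()
        readings = [0.1, 0.2, 0.7, 0.8, 0.9, 0.85, 0.9, 0.8, 0.9, 0.95, 0.9]
        worn = [detector.update(r) for r in readings]
        print(worn[-1])  # True once most recent readings indicate a nearby face/lens
    ```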
  • Patent number: 9171198
    Abstract: This disclosure relates to winking to capture image data using an image capture device that is associated with a head-mountable device (HMD). An illustrative method includes detecting a wink gesture at an HMD. The method also includes causing an image capture device to capture image data, in response to detecting the wink gesture at the HMD.
    Type: Grant
    Filed: July 31, 2012
    Date of Patent: October 27, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Sergey Brin, Bo Wu, Michael Patrick Johnson, David Sparks
  • Patent number: 9128522
    Abstract: This disclosure relates to proximity sensing for wink detection. An illustrative method includes receiving data from a receiver portion of a proximity sensor. The receiver portion is disposed at a side section of a head-mountable device (HMD). When a wearer wears the HMD, the receiver portion is arranged to receive light reflected from an eye area of the wearer, the proximity sensor detects a movement of the eye area, and the data represents the movement. The method includes determining that the data corresponds to a wink gesture. The method also includes selecting a computing action to perform, based on the wink gesture. The method further includes performing the computing action.
    Type: Grant
    Filed: July 17, 2012
    Date of Patent: September 8, 2015
    Assignee: Google Inc.
    Inventors: Hayes Solos Raffle, Michael Patrick Johnson, David Sparks, Bo Wu
  • Publication number: 20150242414
    Abstract: Methods and devices for initiating a search of an object are disclosed. In one embodiment, a method is disclosed that includes receiving video data recorded by a camera on a wearable computing device, where the video data comprises at least a first frame and a second frame. The method further includes, based on the video data, detecting an area in the first frame that is at least partially bounded by a pointing device and, based on the video data, detecting in the second frame that the area is at least partially occluded by the pointing device. The method still further includes initiating a search on the area.
    Type: Application
    Filed: May 7, 2015
    Publication date: August 27, 2015
    Inventors: Thad Eugene Starner, Irfan Essa, Hayes Solos Raffle, Daniel Aminzade
  • Patent number: 9116545
    Abstract: Example methods and systems determine viewing states, blinks, and blink intervals of an eye of a wearer of a head-mountable device. The head-mountable display can emit IR radiation from an associated IR radiation source toward a target location. An IR sensor associated with the head-mountable display can receive reflected IR radiation, such as the IR radiation emitted by the IR radiation source and reflected from the target location. The IR sensor can generate amplitude data for the reflected IR radiation. The head-mountable display can be used to determine a viewing state of the target location. The viewing state can be based on the amplitude data. The viewing state can be determined from among a no-wearer-present viewing state, a wearer-present viewing state, a closed-eye viewing state, an open-eye viewing state, a non-display-viewing viewing state, and a display-viewing viewing state.
    Type: Grant
    Filed: March 21, 2012
    Date of Patent: August 25, 2015
    Inventors: Hayes Solos Raffle, Cliff L. Biffle
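    The abstract above maps reflected-IR amplitude data to an enumerated viewing state. The sketch below shows one way such a mapping could look; the amplitude bands, the open/closed cue, and the collapsed state set are assumptions, not the patented classification.

    ```python
    # Illustrative sketch only: map IR-sensor amplitude data to a viewing state
    # like those enumerated in the abstract above. Bands and cues are assumptions.

    def viewing_state(amplitude: float, gaze_on_display: bool) -> str:
        """Classify one reflected-IR amplitude sample into a viewing state."""
        if amplitude < 0.05:
            return "no-wearer-present"
        if amplitude > 0.8:
            # Strong reflection: eyelid skin reflects more IR than the open eye.
            return "closed-eye"
        if gaze_on_display:
            return "display-viewing"
        return "open-eye, non-display-viewing"

    if __name__ == "__main__":
        for sample, on_display in [(0.01, False), (0.9, False), (0.4, True), (0.4, False)]:
            print(sample, on_display, "->", viewing_state(sample, on_display))
    ```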
  • Publication number: 20150193098
    Abstract: Methods and systems disclosed herein relate to an action that could proceed or be dismissed in response to an affirmative or negative input, respectively. An example method could include displaying, using a head-mountable device, a graphical interface that presents a graphical representation of an action. The action could relate to at least one of a contact, a contact's avatar, a media file, a digital file, a notification, and an incoming communication. The example method could further include receiving a binary selection from among an affirmative input and a negative input. The example method may additionally include proceeding with the action in response to the binary selection being the affirmative input and dismissing the action in response to the binary selection being the negative input.
    Type: Application
    Filed: March 23, 2012
    Publication date: July 9, 2015
    Applicant: GOOGLE INC.
    Inventors: Alejandro Kauffmann, Hayes Solos Raffle, Aaron Joseph Wheeler, Luis Ricardo Prada Gomez, Steven John Lee
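    The interaction in the abstract above is a binary dispatch: the presented action proceeds on an affirmative input and is dismissed on a negative one. The sketch below is a minimal illustration; the action and input names are assumptions.

    ```python
    # Illustrative sketch only: proceed with or dismiss a presented action
    # based on a single binary selection. Names are assumptions.

    from typing import Callable

    def handle_selection(action: Callable[[], str], affirmative: bool) -> str:
        """Proceed with the action on an affirmative input; dismiss it otherwise."""
        return action() if affirmative else "dismissed"

    if __name__ == "__main__":
        def share_photo() -> str:
            return "photo shared with contact"   # the presented action

        print(handle_selection(share_photo, affirmative=True))    # proceeds
        print(handle_selection(share_photo, affirmative=False))   # dismissed
    ```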
  • Publication number: 20150169054
    Abstract: A wearable computing device or a head-mounted display (HMD) may be configured to track the gaze axis of an eye of the wearer. In particular, the device may be configured to observe movement of a wearer's pupil and, based on the movement, determine inputs to a user interface. For example, using eye gaze detection, the HMD may change a tracking rate of a displayed virtual image based on where the user is looking. Gazing at the center of the HMD field of view may, for instance, allow for fine movements of the virtual display. Gazing near an edge of the HMD field of view may provide coarser movements.
    Type: Application
    Filed: January 26, 2015
    Publication date: June 18, 2015
    Inventors: Aaron Joseph Wheeler, Hayes Solos Raffle
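    The abstract above varies the tracking rate of a displayed virtual image with gaze position: fine movement when gazing at the center of the field of view, coarser movement toward the edge. The sketch below illustrates one such mapping; the radii, rates, and linear blend are assumptions.

    ```python
    # Illustrative sketch only: vary the virtual image's tracking rate with where
    # in the field of view the wearer is gazing. Radii and rates are assumptions.

    import math

    def tracking_rate(gaze_x: float, gaze_y: float,
                      fine_rate: float = 0.2, coarse_rate: float = 1.0,
                      center_radius: float = 0.3) -> float:
        """Gaze near the center of the field of view -> fine (slow) movement;
        gaze near the edge -> coarser (faster) movement. Coordinates are
        normalized so (0, 0) is the center and radius 1.0 is the edge."""
        r = min(1.0, math.hypot(gaze_x, gaze_y))
        if r <= center_radius:
            return fine_rate
        # Blend linearly from fine to coarse as gaze moves toward the edge.
        t = (r - center_radius) / (1.0 - center_radius)
        return fine_rate + t * (coarse_rate - fine_rate)

    if __name__ == "__main__":
        print(tracking_rate(0.0, 0.1))   # fine movement near the center
        print(tracking_rate(0.9, 0.3))   # coarser movement near the edge
    ```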
  • Publication number: 20150160461
    Abstract: Example methods and devices are disclosed for generating life-logs with point-of-view images. An example method may involve: receiving image-related data based on electromagnetic radiation reflected from a human eye, generating an eye reflection image based on the image-related data, generating a point-of-view image by filtering the eye reflection image, and storing the point-of-view image. The electromagnetic radiation reflected from a human eye can be captured using one or more video or still cameras associated with a suitably-configured computing device, such as a wearable computing device.
    Type: Application
    Filed: February 20, 2015
    Publication date: June 11, 2015
    Inventors: Thad Eugene Starner, Hayes Solos Raffle, Yong Zhao
  • Patent number: 9052804
    Abstract: Methods and devices for initiating a search of an object are disclosed. In one embodiment, a method is disclosed that includes receiving video data recorded by a camera on a wearable computing device, where the video data comprises at least a first frame and a second frame. The method further includes, based on the video data, detecting an area in the first frame that is at least partially bounded by a pointing device and, based on the video data, detecting in the second frame that the area is at least partially occluded by the pointing device. The method still further includes initiating a search on the area.
    Type: Grant
    Filed: March 12, 2012
    Date of Patent: June 9, 2015
    Assignee: Google Inc.
    Inventors: Thad Eugene Starner, Irfan Essa, Hayes Solos Raffle, Daniel Aminzade
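    The abstract above uses a two-frame trigger: an area is partially bounded by a pointing device in one frame and partially occluded by it in a later frame, at which point a search on the area is initiated. The sketch below illustrates that trigger under stated assumptions; the bounding-box geometry and occlusion test are not from the patent.

    ```python
    # Illustrative sketch only: frame 1 yields an area bounded by the pointing
    # device's path; frame 2 checks whether the pointing device now occludes
    # that area, which triggers a search. Geometry helpers are assumptions.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Box:
        x0: float
        y0: float
        x1: float
        y1: float

        def contains(self, x: float, y: float) -> bool:
            return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

    def area_bounded_by_pointer(pointer_path: list[tuple[float, float]]) -> Box:
        """Frame 1: take the region enclosed by the pointing device's path."""
        xs = [p[0] for p in pointer_path]
        ys = [p[1] for p in pointer_path]
        return Box(min(xs), min(ys), max(xs), max(ys))

    def area_occluded(area: Box, pointer_xy: tuple[float, float]) -> bool:
        """Frame 2: the pointing device now lies inside (occludes) that area."""
        return area.contains(*pointer_xy)

    def maybe_initiate_search(frame1_path, frame2_pointer):
        area = area_bounded_by_pointer(frame1_path)
        if area_occluded(area, frame2_pointer):
            return f"search initiated on region {area}"
        return None

    if __name__ == "__main__":
        path = [(10, 10), (60, 12), (58, 40), (12, 42)]   # fingertip outlining an object
        print(maybe_initiate_search(path, (35, 25)))      # pointer covers the region
    ```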