Patents by Inventor Hayes Solos Raffle
Hayes Solos Raffle has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9746915
Abstract: Examples of methods and systems for providing calibration for eye gesture recognition are described. In some examples, calibration can be executed via a head-mountable device. A method for calibration of a system may account for changes in orientation of the head-mountable device, update recognition of the eye gestures, or increase efficiency of the system, for example. The head-mountable device may be configured to receive signals indicative of eye gestures from an eye gesture-detection system and, in response to receiving a second command confirming that the signal is indicative of an eye gesture command, to make adjustments to the eye gesture recognition system and/or the reference signals. The head-mountable device may calibrate an eye gesture recognition system via implicit or explicit calibration, for example.
Type: Grant
Filed: October 22, 2012
Date of Patent: August 29, 2017
Assignee: Google Inc.
Inventors: Hayes Solos Raffle, Thad Eugene Starner, Mat Balez, Michael Patrick Johnson, Yong Zhao, Bo Wu, David Sparks, Nirmal J. Patel
-
Publication number: 20170229095
Abstract: Disclosed are methods and devices for varying functionality of a wearable computing device. An example device includes a first sensor and a second sensor. An example method includes, while a device is operating in a first state, receiving an indication of a touch input at the first sensor. The second sensor is configured in an idle mode based on the device operating in the first state. The method further includes, in response to receiving the indication of the touch input, triggering the second sensor to operate in an active mode and receiving data from the second sensor. The method further includes determining, based on the data, whether the device is being worn.
Type: Application
Filed: April 26, 2017
Publication date: August 10, 2017
Inventors: Hayes Solos Raffle, David Sparks, Bo Wu
-
Patent number: 9684374
Abstract: Example methods and devices are disclosed for generating life-logs with point-of-view images. An example method may involve: receiving image-related data based on electromagnetic radiation reflected from a human eye, generating an eye reflection image based on the image-related data, generating a point-of-view image by filtering the eye reflection image, and storing the point-of-view image. The electromagnetic radiation reflected from a human eye can be captured using one or more video or still cameras associated with a suitably-configured computing device, such as a wearable computing device.
Type: Grant
Filed: February 20, 2015
Date of Patent: June 20, 2017
Assignee: Google Inc.
Inventors: Thad Eugene Starner, Hayes Solos Raffle, Yong Zhao
-
Publication number: 20170171656
Abstract: A wearable audio component includes a first cable and an audio source in electrical communication with the first cable. A housing defines an interior and an exterior, the audio source being contained within the interior thereof. The exterior includes an ear engaging surface, an outer surface, and a peripheral surface extending between the ear engaging and outer surfaces. The peripheral surface includes a channel open along a length to surrounding portions of the peripheral surface and having a depth to extend partially between the ear engaging and outer surfaces. A portion of the channel is covered by a bridge member that defines an aperture between and open to adjacent portions of the channel. The cable is connected with the housing at a first location disposed within the channel remote from the bridge member and is captured therein so as to extend through the aperture in a slidable engagement therewith.
Type: Application
Filed: February 27, 2017
Publication date: June 15, 2017
Inventors: Haley Toelle, Jianchun Dong, Michael Kai Morishita, Eliot Kim, Hayes Solos Raffle, Livius Dumitru Chebeleu
-
Publication number: 20170163866
Abstract: The present disclosure provides a computing device including an image-capture device and a control system. The control system may be configured to receive sensor data from one or more sensors, and analyze the sensor data to detect at least one image-capture signal. The control system may also be configured to cause the image-capture device to capture an image in response to detection of the at least one image-capture signal. The control system may also be configured to enable one or more speech commands relating to the image-capture device in response to capturing the image. The control system may also be configured to receive one or more verbal inputs corresponding to the one or more enabled speech commands. The control system may also be configured to perform an image-capture function corresponding to the one or more verbal inputs.
Type: Application
Filed: July 24, 2013
Publication date: June 8, 2017
Applicant: Google Inc.
Inventors: Michael Patrick Johnson, Bo Wu, David Sparks, Hayes Solos Raffle
-
Patent number: 9664902
Abstract: Disclosed are methods and devices for varying functionality of a wearable computing device. An example device includes a first sensor and a second sensor. An example method includes, while a device is operating in a first state, receiving an indication of a touch input at the first sensor. The second sensor is configured in an idle mode based on the device operating in the first state. The method further includes, in response to receiving the indication of the touch input, triggering the second sensor to operate in an active mode and receiving data from the second sensor. The method further includes determining, based on the data, whether the device is being worn.
Type: Grant
Filed: February 5, 2014
Date of Patent: May 30, 2017
Assignee: Google Inc.
Inventors: Hayes Solos Raffle, David Sparks, Bo Wu
-
Publication number: 20170090557
Abstract: This disclosure relates to example implementations for side-mounted optical sensors for eye gestures on a head mountable display. An example wearable computing device may include a wearable frame structure that includes a front portion and at least one side arm. In some instances, ends of the side arms may couple and extend away from the front portion at a coupling point. Additionally, the example device may include optical elements coupled to the front portion and may further include one or more sensors arranged on an inner surface of a side arm proximal to the coupling point. The sensors may be oriented to receive sensor data from at least one eye region when the wearable computing device is worn.
Type: Application
Filed: April 29, 2014
Publication date: March 30, 2017
Applicant: Google Inc.
Inventors: Hayes Solos Raffle, Bo Wu, David Sparks, Seungyon Lee, Peter Michael Cazalet
-
Publication number: 20170031435
Abstract: Example embodiments include a lens having an IR-reflective coating that is selectively applied to form a variable infrared (IR) interaction pattern on the lens. The variable IR interaction pattern may vary in the manner it interacts with IR wavelengths, so as to provide a machine-readable code when the lens is illuminated by IR light. Accordingly, variable IR interaction patterns may be used to identify particular lenses. Accordingly, a glasses-style, modular, head-mountable device (HMD) may identify which of a number of different possible lenses are currently attached to the HMD, and update certain processes according to which lens or lenses are attached. For example, an HMD may calibrate an eye-tracking process according to the particular lens that is attached.
Type: Application
Filed: July 31, 2015
Publication date: February 2, 2017
Inventors: Hayes Solos Raffle, Simon Robert Prakash
-
Patent number: 9547365
Abstract: An example method includes receiving, by a head-mountable device (HMD), data corresponding to an information event, and providing an indication corresponding to the information event in response to receiving the data. The method further includes determining a gaze direction of an eye and determining that the gaze direction of the eye is an upward direction that corresponds to a location of a display of the HMD. The display is located in an upper periphery of a forward-looking field of view of the eye when the HMD is worn. The method further includes, in response to determining that the gaze direction of the eye is the upward direction, displaying graphical content related to the information event in the display.
Type: Grant
Filed: September 15, 2014
Date of Patent: January 17, 2017
Assignee: Google Inc.
Inventors: Hayes Solos Raffle, Michael Patrick Johnson, Alok Chandel, Chun Yat Frank Li
-
Patent number: 9535519
Abstract: Embodiments described herein may help to provide an extension of touchpad sensing to adjacent surfaces. An example device may involve: (a) a touchpad having a first surface, (b) at least one electrode coupled to at least one location on the first surface, where at least a portion of the at least one electrode is arranged on a second surface that is adjacent to the first surface so that a touch to the portion of one of the electrodes on the second surface causes the touchpad to output data relating to the corresponding location on the first surface, and (c) a control system configured to: (1) receive a first signal that is indicative of touch input on the touchpad, and (2) detect, in the first signal, data relating to the corresponding location on the first surface and responsively output a second signal relating to a touch of one of the electrodes on the second surface.
Type: Grant
Filed: June 14, 2013
Date of Patent: January 3, 2017
Assignee: Google Inc.
Inventors: Hayes Solos Raffle, Russell Norman Mirov
-
Publication number: 20160357266
Abstract: Methods and systems for hands-free browsing in a wearable computing device are provided. A wearable computing device may provide for display a view of a first card of a plurality of cards which include respective virtual displays of content. The wearable computing device may determine a first rotation of the wearable computing device about a first axis and one or more eye gestures. Based on a combination of the first rotation and the eye gestures, the wearable computing device may provide for display a navigable menu, which may include an alternate view of the first card and at least a portion of one or more cards. Then, based on a determined second rotation of the wearable computing device about a second axis and based on a direction of the second rotation, the wearable computing device may generate a display indicative of navigation through the navigable menu.
Type: Application
Filed: August 16, 2016
Publication date: December 8, 2016
Inventors: Nirmal Patel, Hayes Solos Raffle, Mat Balez, Max Benjamin Braun, Jerrica Jones
-
Patent number: 9507426
Abstract: Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a computing device, such as a head-mountable device (HMD). The computing device can detect a communication event. In response to the communication event, the computing device can display a first item having a current size on a display associated with a display plane. A hand-movement input device associated with the computing device can receive a first input indicative of a gesture toward the display plane. In response to receiving the first input, the computing device can display a first change to the current size of the first item. The hand-movement input device can receive a second input indicative of a gesture away from the display plane. In response to the second input, the computing device can display a second change to the current size of the first item.
Type: Grant
Filed: March 27, 2013
Date of Patent: November 29, 2016
Assignee: Google Inc.
Inventor: Hayes Solos Raffle
-
Patent number: 9454288
Abstract: Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a computing device, such as a head-mountable device (HMD). The computing device can display a first card of an ordered plurality of cards using a timeline display. The computing device can receive a first input and responsively determine a group of cards for a grid view and display the grid view. The group of cards can include the first card. The grid view can include the group of cards arranged in a grid and be focused on the first card. The computing device can receive a second input, and responsively modify the grid view and display the modified grid view. The modified grid view can be focused on a second card. The computing device can receive a third input and responsively display the timeline display, where the timeline display includes the second card.
Type: Grant
Filed: May 8, 2013
Date of Patent: September 27, 2016
Assignee: Google Inc.
Inventors: Hayes Solos Raffle, Nirmal Patel
-
Patent number: 9448687
Abstract: A device may be configured to generate a display of a content in a view region. The view region has a given boundary, and the content is provided for display at a first size. The device may be configured to receive a size-adjusting input for adjusting a size of the content. The device may be configured to cause the size of the content to change from the first size to a second size. The device may be configured to receive an indication of ending the size-adjusting operation, and in response to receiving the indication, cause the size of the content to change to a third size that is between the first size and the second size. In addition to, or as an alternative to, receiving the size-adjusting input, the device may also receive a position-adjusting input and, in response, adjust the position and/or size of the content within the given boundary.
Type: Grant
Filed: February 5, 2014
Date of Patent: September 20, 2016
Assignee: Google Inc.
Inventors: Christopher McKenzie, Hayes Solos Raffle, Nirmal Patel, Richard The, Matthew Tait
-
Patent number: 9442631
Abstract: Methods and systems for hands-free browsing in a wearable computing device are provided. A wearable computing device may provide for display a view of a first card of a plurality of cards which include respective virtual displays of content. The wearable computing device may determine a first rotation of the wearable computing device about a first axis and one or more eye gestures. Based on a combination of the first rotation and the eye gestures, the wearable computing device may provide for display a navigable menu, which may include an alternate view of the first card and at least a portion of one or more cards. Then, based on a determined second rotation of the wearable computing device about a second axis and based on a direction of the second rotation, the wearable computing device may generate a display indicative of navigation through the navigable menu.
Type: Grant
Filed: March 17, 2014
Date of Patent: September 13, 2016
Assignee: Google Inc.
Inventors: Nirmal Patel, Hayes Solos Raffle, Mat Balez, Max Benjamin Braun, Jerrica Jones
-
Publication number: 20160252956
Abstract: A wearable computing device or a head-mounted display (HMD) may be configured to track the gaze axis of an eye of the wearer. In particular, the device may be configured to observe movement of a wearer's pupil and, based on the movement, determine inputs to a user interface. For example, using eye gaze detection, the HMD may change a tracking rate of a displayed virtual image based on where the user is looking. Gazing at the center of the HMD field of view may, for instance, allow for fine movements of the virtual display. Gazing near an edge of the HMD field of view may provide coarser movements.
Type: Application
Filed: April 11, 2016
Publication date: September 1, 2016
Inventors: Aaron Joseph Wheeler, Hayes Solos Raffle
-
Publication number: 20160147309
Abstract: Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to one or more second input actions on the second input interface, and in response, cause the camera to perform a second operation.
Type: Application
Filed: February 1, 2016
Publication date: May 26, 2016
Inventors: Chun Yat Frank Li, Hayes Solos Raffle
-
Publication number: 20160147086
Abstract: A head-wearable device includes a center support extending in generally lateral directions, a first side arm extending from a first end of the center support and a second side arm extending from a second end of the center support. The device may further include a nosebridge that is removably coupled to the center support. The device may also include a lens assembly that is removably coupled to the center support or the nosebridge. The lens assembly may have a single lens, or a multi-lens arrangement configured to cooperate with a display to correct for a user's ocular disease or disorder.
Type: Application
Filed: January 29, 2016
Publication date: May 26, 2016
Inventors: Peter Michael Cazalet, Joseph John Hebenstreit, Matthew Wyatt Martin, Maj Isabelle Olsson, Mitchell Joseph Heinrich, Hayes Solos Raffle, Eliot Kim
-
Publication number: 20160103483
Abstract: Methods, apparatus, and computer-readable media are described herein related to displaying and cropping viewable objects. A viewable object can be displayed on a display of a head-mountable device (HMD) configured with a hand-movement input device. The HMD can receive both head-movement data corresponding to head movements and hand-movement data from the hand-movement input device. The viewable object can be panned on the display based on the head-movement data. The viewable object can be zoomed on the display based on the hand-movement data. The HMD can receive an indication that navigation of the viewable object is complete. The HMD can determine whether a cropping mode is activated. After determining that the cropping mode is activated, the HMD can generate a cropped image of the viewable object on the display when navigation is complete.
Type: Application
Filed: December 14, 2015
Publication date: April 14, 2016
Inventors: Hayes Solos Raffle, Nirmal Patel, Max Benjamin Braun
-
Publication number: 20160080850
Abstract: A wearable audio component includes a first cable and an audio source in electrical communication with the first cable. A housing defines an interior and an exterior, the audio source being contained within the interior thereof. The exterior includes an ear engaging surface, an outer surface, and a peripheral surface extending between the ear engaging and outer surfaces. The peripheral surface includes a channel open along a length to surrounding portions of the peripheral surface and having a depth to extend partially between the ear engaging and outer surfaces. A portion of the channel is covered by a bridge member that defines an aperture between and open to adjacent portions of the channel. The cable is connected with the housing at a first location disposed within the channel remote from the bridge member and is captured therein so as to extend through the aperture in a slidable engagement therewith.
Type: Application
Filed: November 12, 2015
Publication date: March 17, 2016
Inventors: Haley Toelle, Jianchun Dong, Michael Kai Morishita, Eliot Kim, Hayes Solos Raffle, Livius Dumitru Chebeleu