Patents by Inventor Chun Yat Frank Li

Chun Yat Frank Li has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230388784
    Abstract: The present disclosure provides systems and methods for wirelessly coupling one or more accessories with a host device. The one or more accessories may be classified by the host device as primary, saved, or unsaved devices. Primary and saved devices may automatically couple to the host device via a short range wireless communications interface. Thus, the host device may simultaneously be coupled to one or more accessories. The host device may output first content to a first accessory, such as a saved accessory. The host device may receive second content and output the second content to a second accessory while simultaneously outputting the first content to the first accessory.
    Type: Application
    Filed: August 10, 2023
    Publication date: November 30, 2023
    Applicant: Google LLC
    Inventors: Basheer Tome, Sandeep Singh Waraich, Chun Yat Frank Li
  • Publication number: 20210314768
    Abstract: The present disclosure provides systems and methods for wirelessly coupling one or more accessories with a host device. The one or more accessories may be classified by the host device as primary, saved, or unsaved devices. Primary and saved devices may automatically couple to the host device via a short range wireless communications interface. Thus, the host device may simultaneously be coupled to one or more accessories. The host device may output first content to a first accessory, such as a saved accessory. The host device may receive second content and output the second content to a second accessory while simultaneously outputting the first content to the first accessory.
    Type: Application
    Filed: April 1, 2020
    Publication date: October 7, 2021
    Inventors: Basheer Tome, Sandeep Singh Waraich, Chun Yat Frank Li
  • Patent number: 10599259
    Abstract: A method that includes employing several sensors associated with a handheld controller, where each sensor is one of a hover, touch, or force/pressure sensor, and generating, by one or more of the sensors, sensor data associated with the position of a user's hand and fingers in relation to the handheld controller. The method continues with combining the sensor data from the several sensors to form aggregate sensor data, sending the aggregate sensor data to a processor, and generating an estimated position of the user's hand and fingers based on the aggregate sensor data.
    Type: Grant
    Filed: August 24, 2018
    Date of Patent: March 24, 2020
    Assignee: Google LLC
    Inventors: Debanjan Mukherjee, Basheer Tome, Sarnab Bhattacharya, Bhaskar Vadathavoor, Shiblee Imtiaz Hasan, Chun Yat Frank Li
  • Patent number: 10592048
    Abstract: A method for aligning an image on a mobile device disposed within a head-mounted display (HMD) housing includes: detecting a request to align an image on a touchscreen of a mobile device; detecting, on the touchscreen, a first detected location corresponding to a first touchscreen input event; determining a first displacement of the first detected location with respect to a first target location of the first touchscreen input event; and transposing the image on the touchscreen based on the first displacement. A virtual reality system includes: a mobile device having a touchscreen configured to display an image; and a HMD housing having a first contact configured to generate a first input event at a first location on the touchscreen when the mobile device is disposed within the HMD housing.
    Type: Grant
    Filed: May 3, 2017
    Date of Patent: March 17, 2020
    Assignee: Google LLC
    Inventors: Chun Yat Frank Li, Hayes S. Raffle, Eric Allan MacIntosh
  • Patent number: 10545584
    Abstract: A controller configured to control a pointer in a virtual reality environment includes a multi-axis magnetic field sensor, a multi-axis accelerometer, a gyroscope, a touchpad, and a wireless communications circuit. The controller can also include a processor and a memory storing instructions that when executed by the processor, cause the processor to obtain geomagnetic field data from the multi-axis magnetic field sensor, obtain acceleration data describing a direction and a magnitude of force affecting the controller from the multi-axis accelerometer, and obtain angular velocity data describing a rotational position of the controller from the gyroscope. The processor can communicate movement data to a computing device configured to generate a rendering of the virtual reality environment, the movement data describing an orientation of the controller wherein the movement data is based on at least one of the geomagnetic field data, the acceleration data, or the angular velocity data.
    Type: Grant
    Filed: May 17, 2017
    Date of Patent: January 28, 2020
    Assignee: Google LLC
    Inventors: Basheer Tome, Hayes S. Raffle, Chun Yat Frank Li
  • Patent number: 10509487
    Abstract: In a system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment, a virtual display of virtual items and/or features may be adjusted in response to movement of the gyromouse combined with touch inputs, or touch-and-drag inputs, received on a touch surface of the gyromouse. Use of the gyromouse in the AR/VR environment may allow touch screen capabilities to be accurately projected into a three-dimensional virtual space, providing a controller with improved functionality and utility in the AR/VR environment and enhancing the user's experience.
    Type: Grant
    Filed: December 15, 2016
    Date of Patent: December 17, 2019
    Assignee: Google LLC
    Inventors: David Dearman, Chun Yat Frank Li
  • Patent number: 10475254
    Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
    Type: Grant
    Filed: January 4, 2019
    Date of Patent: November 12, 2019
    Assignee: Google LLC
    Inventors: David Dearman, Chun Yat Frank Li, Erica Morse
  • Publication number: 20190155439
    Abstract: A method that includes employing several sensors associated with a handheld controller, where each sensor is one of a hover, touch, or force/pressure sensor, and generating, by one or more of the sensors, sensor data associated with the position of a user's hand and fingers in relation to the handheld controller. The method continues with combining the sensor data from the several sensors to form aggregate sensor data, sending the aggregate sensor data to a processor, and generating an estimated position of the user's hand and fingers based on the aggregate sensor data.
    Type: Application
    Filed: August 24, 2018
    Publication date: May 23, 2019
    Inventors: Debanjan Mukherjee, Basheer Tome, Sarnab Bhattacharya, Bhaskar Vadathavoor, Shiblee Imtiaz Hasan, Chun Yat Frank Li
  • Publication number: 20190139323
    Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
    Type: Application
    Filed: January 4, 2019
    Publication date: May 9, 2019
    Inventors: David Dearman, Chun Yat Frank Li, Erica Morse
  • Patent number: 10275023
    Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze-directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
    Type: Grant
    Filed: December 21, 2016
    Date of Patent: April 30, 2019
    Assignee: Google LLC
    Inventors: Chris McKenzie, Chun Yat Frank Li, Hayes S. Raffle
  • Patent number: 10198874
    Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
    Type: Grant
    Filed: May 12, 2017
    Date of Patent: February 5, 2019
    Assignee: Google LLC
    Inventors: David Dearman, Chun Yat Frank Li, Erica Morse
  • Publication number: 20170336882
    Abstract: A controller configured to control a pointer in a virtual reality environment includes a multi-axis magnetic field sensor, a multi-axis accelerometer, a gyroscope, a touchpad, and a wireless communications circuit. The controller can also include a processor and a memory storing instructions that when executed by the processor, cause the processor to obtain geomagnetic field data from the multi-axis magnetic field sensor, obtain acceleration data describing a direction and a magnitude of force affecting the controller from the multi-axis accelerometer, and obtain angular velocity data describing a rotational position of the controller from the gyroscope. The processor can communicate movement data to a computing device configured to generate a rendering of the virtual reality environment, the movement data describing an orientation of the controller wherein the movement data is based on at least one of the geomagnetic field data, the acceleration data, or the angular velocity data.
    Type: Application
    Filed: May 17, 2017
    Publication date: November 23, 2017
    Inventors: Basheer Tome, Hayes S. Raffle, Chun Yat Frank Li
  • Publication number: 20170336915
    Abstract: A method for aligning an image on a mobile device disposed within a head-mounted display (HMD) housing includes: detecting a request to align an image on a touchscreen of a mobile device; detecting, on the touchscreen, a first detected location corresponding to a first touchscreen input event; determining a first displacement of the first detected location with respect to a first target location of the first touchscreen input event; and transposing the image on the touchscreen based on the first displacement. A virtual reality system includes: a mobile device having a touchscreen configured to display an image; and a HMD housing having a first contact configured to generate a first input event at a first location on the touchscreen when the mobile device is disposed within the HMD housing.
    Type: Application
    Filed: May 3, 2017
    Publication date: November 23, 2017
    Inventors: Chun Yat Frank Li, Hayes S. Raffle, Eric Allan MacIntosh
  • Publication number: 20170330387
    Abstract: Systems, devices, methods, computer program products, and electronic apparatuses for aligning components in virtual reality environments are provided. An example method includes detecting a first input from a handheld controller of a virtual reality system; responsive to detecting the first input, instructing a user to orient the handheld controller in a designated direction; detecting a second input from the handheld controller; and responsive to detecting the second input, storing alignment data representative of an alignment of the handheld controller.
    Type: Application
    Filed: May 12, 2017
    Publication date: November 16, 2017
    Inventors: David Dearman, Chun Yat Frank Li, Erica Morse
  • Publication number: 20170329419
    Abstract: In a system for combining a gyromouse input with a touch surface input in an augmented reality (AR) environment and/or a virtual reality (VR) environment, a virtual display of virtual items and/or features may be adjusted in response to movement of the gyromouse combined with touch inputs, or touch-and-drag inputs, received on a touch surface of the gyromouse. Use of the gyromouse in the AR/VR environment may allow touch screen capabilities to be accurately projected into a three-dimensional virtual space, providing a controller with improved functionality and utility in the AR/VR environment and enhancing the user's experience.
    Type: Application
    Filed: December 15, 2016
    Publication date: November 16, 2017
    Inventors: David Dearman, Chun Yat Frank Li
  • Publication number: 20170322623
    Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze-directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
    Type: Application
    Filed: December 21, 2016
    Publication date: November 9, 2017
    Inventors: Chris McKenzie, Chun Yat Frank Li, Hayes S. Raffle
  • Patent number: 9804682
    Abstract: Embodiments described herein may provide a configuration of input interfaces used to perform multi-touch operations. An example device may involve: (a) a housing arranged on a head-mountable device, (b) a first input interface arranged on either a superior or an inferior surface of the housing, (c) a second input interface arranged on a surface of the housing that is opposite to the first input interface, and (d) a control system configured to: (1) receive first input data from the first input interface, where the first input data corresponds to a first input action, and in response, cause a camera to perform a first operation in accordance with the first input action, and (2) receive second input data from the second input interface, where the second input data corresponds to a second input action(s) on the second input interface, and in response, cause the camera to perform a second operation.
    Type: Grant
    Filed: February 1, 2016
    Date of Patent: October 31, 2017
    Assignee: Google Inc.
    Inventors: Chun Yat Frank Li, Hayes Solos Raffle
  • Patent number: 9798517
    Abstract: Embodiments may relate to intuitive user-interface features for a head-mountable device (HMD), in the context of a hybrid human and computer-automated response system. An illustrative method may involve an HMD that comprises a touchpad: (a) sending a speech-segment message to a hybrid response system, wherein the speech-segment message is indicative of a speech segment that is detected in audio data captured at the HMD, and wherein the speech segment is associated with a first user-account with the hybrid response system, (b) receiving a response message that includes a response to the speech-segment message and an indication of a next action corresponding to the response, (c) displaying a screen interface that includes an indication of the response, and (d) while displaying the response, detecting a singular touch gesture and responsively initiating the next action.
    Type: Grant
    Filed: January 27, 2017
    Date of Patent: October 24, 2017
    Assignee: X Development LLC
    Inventors: Chun Yat Frank Li, Daniel Rodriguez Magana, Thiago Teixeira, Charles Chen, Anand Agarawala
  • Publication number: 20170139672
    Abstract: Embodiments may relate to intuitive user-interface features for a head-mountable device (HMD), in the context of a hybrid human and computer-automated response system. An illustrative method may involve an HMD that comprises a touchpad: (a) sending a speech-segment message to a hybrid response system, wherein the speech-segment message is indicative of a speech segment that is detected in audio data captured at the HMD, and wherein the speech segment is associated with a first user-account with the hybrid response system, (b) receiving a response message that includes a response to the speech-segment message and an indication of a next action corresponding to the response, (c) displaying a screen interface that includes an indication of the response, and (d) while displaying the response, detecting a singular touch gesture and responsively initiating the next action.
    Type: Application
    Filed: January 27, 2017
    Publication date: May 18, 2017
    Inventors: Chun Yat Frank Li, Daniel Rodriguez Magana, Thiago Teixeira, Charles Chen, Anand Agarawala
  • Publication number: 20170139567
    Abstract: Embodiments described herein may help to provide a lock-screen for a computing device. An example method involves: (a) displaying two or more rows of characters and an input region that is moveable over the rows of characters, (b) based on head-movement data, determining movement of the input region with respect to the rows of characters, (c) determining an input sequence, where the sequence includes one character from each of the rows of characters that is selected based at least in part on the one or more movements of the input region with respect to the rows of characters, (d) determining whether or not the input sequence matches a predetermined unlock sequence, and (e) if the input sequence matches the predetermined unlock sequence, then unlocking the computing device.
    Type: Application
    Filed: July 3, 2013
    Publication date: May 18, 2017
    Inventor: Chun Yat Frank Li
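
The hand-tracking entries above (patent 10599259 and publication 20190155439) describe combining readings from several hover, touch, and force/pressure sensors into aggregate sensor data before estimating hand and finger position. The abstract does not specify an aggregation algorithm; the sketch below shows one plausible reading, simple per-axis averaging of per-sensor position estimates, and every name and data format in it is an assumption.

```python
# Hypothetical sketch of the "aggregate sensor data" step from patent
# 10599259. The abstract does not define the aggregation method;
# per-axis averaging is an illustrative assumption, not the claimed design.

def aggregate_sensor_data(readings):
    """Combine per-sensor (x, y, z) position estimates into one
    aggregate estimate by averaging each axis."""
    if not readings:
        raise ValueError("need at least one sensor reading")
    n = len(readings)
    return tuple(sum(r[axis] for r in readings) / n for axis in range(3))

def estimate_finger_position(hover, touch, force):
    """Fuse the three sensor modalities named in the patent into a
    single estimated fingertip position."""
    return aggregate_sensor_data([hover, touch, force])
```

In a real controller each modality would carry different noise characteristics, so a weighted or learned fusion would likely replace the plain mean.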
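
The HMD alignment entries above (patent 10592048 and publication 20170336915) describe computing the displacement of a detected touchscreen contact from its target location and transposing the displayed image by that displacement. A minimal sketch, assuming 2-D pixel coordinates and modeling the image position as a simple origin offset (neither convention is stated in the abstract):

```python
# Illustrative sketch of the align-by-displacement idea in patent
# 10592048; coordinate conventions and function names are assumptions.

def displacement(detected, target):
    """Vector from the detected contact location to its target location."""
    return (target[0] - detected[0], target[1] - detected[1])

def transpose_image(origin, shift):
    """Shift the image origin by the computed displacement so the
    detected contact lines up with its target."""
    return (origin[0] + shift[0], origin[1] + shift[1])
```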
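
The controller entries above (patent 10545584 and publication 20170336882) state only that the movement data is based on at least one of the geomagnetic, acceleration, or angular-velocity streams. One common way to blend such streams, shown purely as an assumed illustration and not as the patented design, is a complementary filter that trusts the integrated gyroscope angle over short intervals and an accelerometer-derived angle over long ones:

```python
# A complementary filter is a standard sensor-fusion technique; the
# patent does not say the controller uses it, so treat this as an
# assumed illustration. Angles in degrees, rates in degrees/second.

def fuse_orientation(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyroscope's integrated angle (responsive but drifting)
    with the accelerometer's absolute angle (noisy but drift-free)."""
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Raising `alpha` favors the gyroscope's smooth short-term tracking; lowering it pulls the estimate faster toward the accelerometer's drift-free reference.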