Patents by Inventor Richard Zhuang

Richard Zhuang has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240137436
    Abstract: A case for a portable device such as a smartphone includes light sources such as LEDs, which, when illuminated, can be detected and tracked by a head-worn augmented or virtual reality device. The light sources may be located at the corners of the case and may emit infrared light. A relative pose between the smartphone and the head-worn device can be determined based on computer vision techniques performed on images captured by the head-worn device that include light from the light sources. Relative movement between the smartphone and the head-worn device can be used to provide user input to the head-worn device, as can touch input on the portable device. In some instances, the case is powered inductively from the portable device.
    Type: Application
    Filed: October 19, 2022
    Publication date: April 25, 2024
    Inventors: Ilteris Kaan Canberk, Matthew Hallberg, Richard Zhuang
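As a rough illustration of how the relative pose in this abstract might begin to be recovered from tracked LED positions, the sketch below estimates the phone's distance from the pixel span between two corner LEDs under a pinhole camera model. The case width, focal length, and function name are hypothetical values for illustration, not details from the patent:

```python
import math

def estimate_phone_distance(led_pixels, case_width_m, focal_px):
    """Rough distance estimate from the pixel span of two corner LEDs,
    assuming a pinhole camera model (all parameters hypothetical)."""
    (x1, y1), (x2, y2) = led_pixels
    span_px = math.hypot(x2 - x1, y2 - y1)
    # pinhole relation: span_px / focal_px = case_width_m / distance_m
    return case_width_m * focal_px / span_px
```

A full solution would track all four corner LEDs and solve for the complete 6-DoF pose, but the same projective relationship underlies it.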
  • Publication number: 20240112429
    Abstract: A method of providing an interactive personal mobility system, performed by one or more processors, comprises determining an initial pose by visual-inertial odometry performed on images and inertial measurement unit (IMU) data generated by a wearable augmented reality device. Sensor data transmitted from a personal mobility system is received, and sensor fusion is performed on the data received from the personal mobility system to provide an updated pose. Augmented reality effects are displayed on the wearable augmented reality device based on the updated pose.
    Type: Application
    Filed: December 12, 2023
    Publication date: April 4, 2024
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
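The sensor-fusion step described above can be sketched as a simple complementary filter that blends the VIO position with the position reported by the personal mobility system. The blend weight and function name are illustrative assumptions, not the claimed method:

```python
def fuse_pose(vio_position, pms_position, alpha=0.9):
    """Complementary-filter blend of a visual-inertial-odometry position
    estimate with position data reported by the personal mobility system.
    alpha is a hypothetical trust weight for the VIO estimate."""
    return tuple(alpha * v + (1.0 - alpha) * p
                 for v, p in zip(vio_position, pms_position))
```

Production systems would typically use a Kalman-style filter over full 6-DoF poses, but the blending idea is the same.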
  • Patent number: 11924775
    Abstract: A power control method includes obtaining, by a first terminal, parameter information required for power control, where the parameter information includes at least one of a first parameter, a second parameter, and a third parameter, and determining, by the first terminal based on the parameter information, uplink transmit power used when uplink transmission is performed on a target beam or a target beam pair; where the first parameter includes a beam reception gain of a network device and/or a beam sending gain of the first terminal, where the second parameter is used to indicate interference caused by a second terminal to the first terminal on the target beam, and where the third parameter includes beam-specific target power and/or terminal-specific target power.
    Type: Grant
    Filed: December 20, 2021
    Date of Patent: March 5, 2024
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Lili Zhang, Guorong Li, Hongcheng Zhuang, Richard Stirling-Gallacher
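The abstract combines a target power, beam gains, and an interference measure into an uplink transmit power. The sketch below shows one hypothetical way such terms could be combined and capped at the terminal's maximum power; it is not the claimed formula:

```python
def uplink_tx_power_dbm(p_max, p_target, beam_rx_gain_db, beam_tx_gain_db,
                        interference_db=0.0):
    """Illustrative per-beam power computation (not the patented formula):
    start from a beam-specific target power, subtract beam gains that
    already strengthen the link, add headroom for measured interference,
    and cap at the terminal's maximum transmit power."""
    p = p_target - (beam_rx_gain_db + beam_tx_gain_db) + interference_db
    return min(p_max, p)
```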
  • Patent number: 11900550
    Abstract: A method of providing an interactive personal mobility system, performed by one or more processors, comprises determining an initial pose by visual-inertial odometry performed on images and inertial measurement unit (IMU) data generated by a wearable augmented reality device. Sensor data transmitted from a personal mobility system is received, and sensor fusion is performed on the data received from the personal mobility system to provide an updated pose. Augmented reality effects are displayed on the wearable augmented reality device based on the updated pose.
    Type: Grant
    Filed: December 23, 2021
    Date of Patent: February 13, 2024
    Assignee: SNAP INC.
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
  • Publication number: 20240032121
    Abstract: A communication link is established between a first mobile device and a second mobile device using communication setup information in a machine-readable code that is displayed on a display of the second mobile device. The first mobile device captures and decodes an image of the machine-readable code to extract dynamically-generated communication setup information. A communication link is then established between the two devices using the communication setup information. The machine-readable code may also be used as a fiducial marker to establish an initial relative pose between the two devices. Pose updates received from the second mobile device can then be used as user-interface inputs to the first mobile device.
    Type: Application
    Filed: July 20, 2022
    Publication date: January 25, 2024
    Inventors: Richard Zhuang, Matthew Hallberg
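A minimal sketch of dynamically-generated communication setup information packed into a code payload on one device and decoded on the other. The JSON field names (ssid, psk, session) are illustrative assumptions, not fields named in the patent:

```python
import json
import base64

def make_setup_payload(ssid, psk, session_id):
    """Pack hypothetical connection parameters into a compact string
    suitable for encoding into a machine-readable code."""
    data = {"ssid": ssid, "psk": psk, "session": session_id}
    return base64.urlsafe_b64encode(json.dumps(data).encode()).decode()

def parse_setup_payload(payload):
    """Recover the setup parameters on the capturing device."""
    return json.loads(base64.urlsafe_b64decode(payload.encode()))
```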
  • Publication number: 20230415044
    Abstract: AR-enhanced gameplay includes a map of a course including a plurality of virtual objects, the map corresponding to a location in the real world and defining a track along which participants can ride on personal mobility systems such as scooters. Virtual objects are displayed in the fields of view of participants' augmented reality devices in positions corresponding to positions in the real world on the course. Proximity of a participant or their personal mobility system with the position of a virtual object in the real world is detected, and in response to the detection of proximity, a performance characteristic of the participant's personal mobility system is modified.
    Type: Application
    Filed: September 14, 2023
    Publication date: December 28, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
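The proximity-triggered modification described above can be sketched as a distance check against a virtual object's real-world position. The trigger radius and boost factor below are hypothetical tuning values:

```python
import math

def apply_virtual_object(rider_pos, object_pos, base_speed_limit,
                         trigger_radius=2.0, boost=1.25):
    """If the rider comes within trigger_radius metres of a virtual
    object's real-world position, scale the mobility system's speed
    limit; otherwise leave it unchanged (illustrative values)."""
    distance = math.dist(rider_pos, object_pos)
    if distance <= trigger_radius:
        return base_speed_limit * boost
    return base_speed_limit
```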
  • Patent number: 11813528
    Abstract: AR-enhanced gameplay includes a map of a course including a plurality of virtual objects, the map corresponding to a location in the real world and defining a track along which participants can ride on personal mobility systems such as scooters. Virtual objects are displayed in the fields of view of participants' augmented reality devices in positions corresponding to positions in the real world on the course. Proximity of a participant or their personal mobility system with the position of a virtual object in the real world is detected, and in response to the detection of proximity, a performance characteristic of the participant's personal mobility system is modified.
    Type: Grant
    Filed: November 1, 2021
    Date of Patent: November 14, 2023
    Assignee: Snap Inc.
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
  • Publication number: 20230306690
    Abstract: Content is displayed to a user of an augmented reality device. In response to receiving an indication of an increased level of risk, the amount of content displayed to the user is reduced. The indication of increased risk may be generated by or received from an associated transportation device. Reducing the displayed content may include moving one or more content elements out of a central field of view of the augmented reality device, reducing the size or visual prominence of a content element, or eliminating a content element from the display.
    Type: Application
    Filed: March 22, 2022
    Publication date: September 28, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
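A minimal sketch of risk-driven content reduction, assuming content elements are dicts carrying a 'critical' flag; the risk thresholds and scaling factor are illustrative assumptions, not values from the patent:

```python
def adjust_content(elements, risk_level):
    """Reduce displayed AR content as risk rises: at moderate risk shrink
    non-critical elements, at high risk drop them entirely."""
    if risk_level >= 2:  # high risk: keep only critical elements
        return [e for e in elements if e.get("critical")]
    if risk_level == 1:  # moderate risk: shrink non-critical elements
        return [e if e.get("critical") else {**e, "scale": 0.5}
                for e in elements]
    return elements      # no elevated risk: display everything as-is
```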
  • Publication number: 20230214639
    Abstract: Techniques for training a neural network having a plurality of computational layers, with associated weights and activations for the computational layers in fixed-point formats, include determining an optimal fractional length for the weights and activations of the computational layers; training a learned clipping level with fixed-point quantization using a PACT process for the computational layers; and quantizing effective weights that fuse the weight of a convolution layer with the weight and running variance of a batch normalization layer. A fractional length for the weights of the computational layers is determined from current values of the weights using the determined optimal fractional length. A fixed-point activation between adjacent computational layers is related using PACT quantization of the clipping level and an activation fractional length from a node in the following computational layer.
    Type: Application
    Filed: December 31, 2021
    Publication date: July 6, 2023
    Inventors: Sumant Milind Hanumante, Qing Jin, Sergei Korolev, Denys Makoviichuk, Jian Ren, Dhritiman Sagar, Patrick Timothy McSweeney Simons, Sergey Tulyakov, Yang Wen, Richard Zhuang
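The fixed-point quantization step can be sketched as clipping an activation at a learned clipping level (as in PACT) and rounding it to a grid set by the fractional length. This is a simplified scalar version of the described technique, not the full training procedure:

```python
def quantize(x, fractional_length, clip_level):
    """Clip an activation at the learned clipping level, then round it
    to a fixed-point grid with the given fractional length (the grid
    step is 2**-fractional_length)."""
    scale = 2 ** fractional_length
    clipped = max(0.0, min(x, clip_level))
    return round(clipped * scale) / scale
```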
  • Publication number: 20230215106
    Abstract: A method of locating a personal mobility system using an augmented reality device is disclosed. The method comprises receiving positional data corresponding to a location of a personal mobility system, determining a relative position between the augmented reality device and the location of the personal mobility system, and causing the display of an augmented reality effect by the augmented reality device based on the relative position between the augmented reality device and the location of the personal mobility system.
    Type: Application
    Filed: January 25, 2022
    Publication date: July 6, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
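Determining a relative position between the augmented reality device and the parked mobility system might look like the flat-earth bearing computation below; the coordinate convention and function name are assumptions for illustration, not the claimed method:

```python
import math

def relative_bearing_deg(device_latlon, target_latlon, device_heading_deg):
    """Bearing from the AR device to the mobility system, relative to the
    wearer's heading, using a small-area flat-earth approximation
    (0 degrees = north; result in the range [-180, 180))."""
    dlat = target_latlon[0] - device_latlon[0]
    dlon = (target_latlon[1] - device_latlon[1]) * math.cos(
        math.radians(device_latlon[0]))
    absolute = math.degrees(math.atan2(dlon, dlat))
    return (absolute - device_heading_deg + 180.0) % 360.0 - 180.0
```

An AR effect (e.g. an arrow or halo) could then be anchored along this bearing in the device's field of view.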
  • Publication number: 20230139739
    Abstract: AR-enhanced gameplay includes a map of a course including a plurality of virtual objects, the map corresponding to a location in the real world and defining a track along which participants can ride on personal mobility systems such as scooters. Virtual objects are displayed in the fields of view of participants' augmented reality devices in positions corresponding to positions in the real world on the course. Proximity of a participant or their personal mobility system with the position of a virtual object in the real world is detected, and in response to the detection of proximity, a performance characteristic of the participant's personal mobility system is modified.
    Type: Application
    Filed: November 1, 2021
    Publication date: May 4, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
  • Publication number: 20230113076
    Abstract: An electronic eyewear device includes first and second systems-on-chip (SoCs) having independent time bases. The first and second SoCs are connected by a shared general purpose input/output (GPIO) connection and an inter-SoC interface. The first and second SoCs are synchronized by the first SoC asserting a signal on the shared GPIO connection, where the assertion triggers an interrupt request (IRQ) at the second SoC. The first SoC records a first timestamp for the assertion of the signal on the GPIO connection, and the second SoC records a second timestamp for receipt of the IRQ. The first SoC sends a message including the first timestamp to the second SoC over the inter-SoC interface. The second SoC calculates a clock offset between the first and second SoCs as the difference between the first and second timestamps.
    Type: Application
    Filed: October 7, 2021
    Publication date: April 13, 2023
    Inventors: Samuel Ahn, Dmitry Ryuma, Richard Zhuang
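The synchronization scheme in this abstract maps directly to code: the clock offset is the difference between the timestamp the first SoC records when asserting the GPIO and the timestamp the second SoC records when the IRQ fires (signal propagation latency neglected, as in the abstract):

```python
def clock_offset(gpio_assert_ts, irq_receive_ts):
    """Offset of the second SoC's clock relative to the first, from the
    two timestamps taken for the same GPIO edge."""
    return irq_receive_ts - gpio_assert_ts

def to_first_soc_time(second_soc_ts, offset):
    """Translate a timestamp from the second SoC's time base into the
    first SoC's time base using the computed offset."""
    return second_soc_ts - offset
```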
  • Publication number: 20230105428
    Abstract: A method of providing an interactive personal mobility system, performed by one or more processors, comprises determining an initial pose by visual-inertial odometry performed on images and inertial measurement unit (IMU) data generated by a wearable augmented reality device. Sensor data transmitted from a personal mobility system is received, and sensor fusion is performed on the data received from the personal mobility system to provide an updated pose. Augmented reality effects are displayed on the wearable augmented reality device based on the updated pose.
    Type: Application
    Filed: December 23, 2021
    Publication date: April 6, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
  • Publication number: 20230098451
    Abstract: A method of controlling a personal mobility system includes displaying a virtual object on an augmented reality wearable device, the virtual object being located in a position in the field of view of the augmented reality device corresponding to a position in the real world. Proximity of the personal mobility system or a user of the personal mobility system with the position in the real world is detected. In response to the detection of proximity, a performance characteristic of the personal mobility system is modified.
    Type: Application
    Filed: September 30, 2021
    Publication date: March 30, 2023
    Inventors: Edmund Graves Brown, Benjamin Lucas, Jonathan M. Rodriguez, II, Richard Zhuang
  • Patent number: 11563886
    Abstract: Systems, devices, media, and methods are described for capturing a series of raw images by portable electronic devices, such as wearable devices including eyewear, and automating the process of processing such raw images by a client mobile device, such as a smart phone, such automation including the process of uploading to a network and directing to a target audience. In some implementations, a user selects profile settings on the client device before capturing images on the companion device, so that when the companion device has captured the images, the system follows the profile settings upon automatically processing the images captured by the companion device.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: January 24, 2023
    Assignee: Snap Inc.
    Inventors: Andrew Bartow, Matthew Hanover, Richard Zhuang
  • Publication number: 20230007227
    Abstract: Eyewear providing an interactive augmented reality experience to users in a first physical environment viewing objects in a second physical environment (e.g., an X-ray effect). The second environment may be a room positioned behind a barrier, such as a wall. The user views the second environment via a sensor system moveable on the wall using a track system. As the user in the first environment moves the eyewear to face the outside surface of the wall along a line-of-sight (LOS) at a location (x, y, z), the sensor system on the track system repositions to the same location (x, y, z) on the inside surface of the wall. The image captured by the sensor system in the second environment is wirelessly transmitted to the eyewear for display on the eyewear displays, providing the user with an X-ray effect of looking through the wall to see the objects within the other environment.
    Type: Application
    Filed: June 14, 2022
    Publication date: January 5, 2023
    Inventors: Edmund Brown, Benjamin Lucas, Simon Nielsen, Jonathan M. Rodriguez, II, Richard Zhuang
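The repositioning step can be sketched as a line-of-sight/wall intersection, modelling the wall as the plane x = wall_x; this flat-wall geometry is a simplifying assumption, not a detail from the abstract:

```python
def los_wall_intersection(eye_pos, look_dir, wall_x):
    """Where the wearer's line of sight hits a wall modelled as the plane
    x = wall_x; the track system would move the sensor to the matching
    (y, z) on the far side. Returns None if the LOS misses the wall."""
    ex, ey, ez = eye_pos
    dx, dy, dz = look_dir
    if dx == 0:
        return None  # looking parallel to the wall
    t = (wall_x - ex) / dx
    if t <= 0:
        return None  # wall is behind the wearer
    return (wall_x, ey + t * dy, ez + t * dz)
```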
  • Publication number: 20220252894
    Abstract: Systems, devices, media, and methods are described for capturing a series of video clips, together with position, orientation, and motion data collected from an inertial measurement unit during filming. The methods in some examples include calculating camera orientations based on the data collected, computing a stabilized output path based on the camera orientations, and then combining the video segments in accordance with said stabilized output path to produce a video composition that is stable, short, and easy to share. The video clips are filmed in accordance with a set of conditions called a capture profile. In some implementations, the capture profile conditions are reactive, adjusting in real time, during filming, in response to sensor data gathered in real time from a sensor array.
    Type: Application
    Filed: April 28, 2022
    Publication date: August 11, 2022
    Inventors: Matthew Hanover, Richard Zhuang
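Computing a stabilized output path from per-frame camera orientations can be sketched as a moving average over the recorded angles. Real stabilization pipelines smooth rotations (e.g., quaternions) rather than scalar angles, so this is a deliberate simplification with a hypothetical window size:

```python
def smooth_path(orientations, window=5):
    """Moving-average smoothing of per-frame camera angles to form a
    stabilized output path; the window shrinks at the clip boundaries."""
    half = window // 2
    out = []
    for i in range(len(orientations)):
        lo, hi = max(0, i - half), min(len(orientations), i + half + 1)
        out.append(sum(orientations[lo:hi]) / (hi - lo))
    return out
```

Each frame would then be re-rendered with the difference between its recorded orientation and the smoothed path.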
  • Publication number: 20220159178
    Abstract: Systems, devices, media, and methods are described for capturing a series of raw images by portable electronic devices, such as wearable devices including eyewear, and automating the process of processing such raw images by a client mobile device, such as a smart phone, such automation including the process of uploading to a network and directing to a target audience. In some implementations, a user selects profile settings on the client device before capturing images on the companion device, so that when the companion device has captured the images, the system follows the profile settings upon automatically processing the images captured by the companion device.
    Type: Application
    Filed: January 31, 2022
    Publication date: May 19, 2022
    Inventors: Andrew Bartow, Matthew Hanover, Richard Zhuang
  • Patent number: 11320667
    Abstract: Systems, devices, media, and methods are described for capturing a series of video clips, together with position, orientation, and motion data collected from an inertial measurement unit during filming. The methods in some examples include calculating camera orientations based on the data collected, computing a stabilized output path based on the camera orientations, and then combining the video segments in accordance with said stabilized output path to produce a video composition that is stable, short, and easy to share. The video clips are filmed in accordance with a set of conditions called a capture profile. In some implementations, the capture profile conditions are reactive, adjusting in real time, during filming, in response to sensor data gathered in real time from a sensor array.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: May 3, 2022
    Assignee: Snap Inc.
    Inventors: Matthew Hanover, Richard Zhuang
  • Patent number: 11297224
    Abstract: Systems, devices, media, and methods are described for capturing a series of raw images by portable electronic devices, such as wearable devices including eyewear, and automating the process of processing such raw images by a client mobile device, such as a smart phone, such automation including the process of uploading to a network and directing to a target audience. In some implementations, a user selects profile settings on the client device before capturing images on the companion device, so that when the companion device has captured the images, the system follows the profile settings upon automatically processing the images captured by the companion device.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: April 5, 2022
    Assignee: Snap Inc.
    Inventors: Andrew Bartow, Matthew Hanover, Richard Zhuang