Patents by Inventor Chengyuan Yan

Chengyuan Yan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250123679
    Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment; identifying one or more features from the first image to estimate a pose of the hand of the user; estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller; receiving IMU data of the controller; and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment, providing consistently accurate controller tracking.
    Type: Application
    Filed: October 18, 2024
    Publication date: April 17, 2025
    Inventors: Tsz Ho Yu, Chengyuan Yan, Christian Forster
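A minimal sketch of the fusion described in the abstract above, using NumPy and 4x4 homogeneous transforms: the camera-estimated hand pose is composed with an estimated grip transform to obtain a first controller pose, which is then propagated with a single IMU sample. The function names, gravity handling, and integration scheme are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def rotation_from_angular_velocity(omega: np.ndarray, dt: float) -> np.ndarray:
    """Rodrigues rotation for one gyroscope sample (rad/s) over dt seconds."""
    theta = omega * dt
    angle = np.linalg.norm(theta)
    if angle < 1e-9:
        return np.eye(3)
    axis = theta / angle
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def estimate_first_pose(hand_pose: np.ndarray, grip: np.ndarray) -> np.ndarray:
    """First controller pose: camera-estimated hand pose composed with the
    estimated grip (the relative pose between the hand and the controller)."""
    return hand_pose @ grip

def update_with_imu(pose, gyro, accel, velocity, dt):
    """Second controller pose: propagate the first pose with one IMU sample."""
    R, t = pose[:3, :3], pose[:3, 3]
    R_new = R @ rotation_from_angular_velocity(gyro, dt)
    # Accelerometer measures specific force; add gravity back in the world frame.
    accel_world = R @ accel + np.array([0.0, 0.0, -9.81])
    velocity_new = velocity + accel_world * dt
    new_pose = np.eye(4)
    new_pose[:3, :3], new_pose[:3, 3] = R_new, t + velocity_new * dt
    return new_pose, velocity_new

# Usage: start from the vision-based estimate, then refine with IMU data.
first = estimate_first_pose(np.eye(4), np.eye(4))
second, vel = update_with_imu(first, gyro=np.array([0.0, 0.0, 0.1]),
                              accel=np.array([0.0, 0.0, 9.81]),
                              velocity=np.zeros(3), dt=0.01)
```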
  • Publication number: 20250117091
    Abstract: A method of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device is disclosed. The method includes receiving an indication, at a first point in time, that a user donning the wrist-wearable device is providing a digit-to-digit gesture in which one of the user's digits touches another of the user's digits without contacting the display of the wearable device. The method further includes, in accordance with a determination that the digit-to-digit gesture is provided while data indicates that the wrist-wearable device has a first roll value, causing a target device in communication with the wearable device to perform a first input command; and, upon receiving another indication, at a second point in time after the first point in time, that the user is providing the digit-to-digit gesture again, causing the target device to perform a second input command that is distinct from the first input command.
    Type: Application
    Filed: September 17, 2024
    Publication date: April 10, 2025
    Inventors: Chengyuan Yan, Yinglin Li, Qian Chen
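One way to picture the roll-dependent interpretation in the abstract above is a short lookup from the measured wrist roll to an input command. The thresholds, command names, and the use of degrees are purely illustrative assumptions.

```python
def command_for_digit_to_digit(roll_degrees: float) -> str:
    """Map the wrist roll measured when a digit-to-digit gesture arrives
    to an input command for the target device (thresholds are made up)."""
    if abs(roll_degrees) <= 30.0:   # first roll value: palm roughly vertical
        return "select"
    if roll_degrees > 30.0:         # second roll value: wrist rolled upward
        return "volume_up"
    return "volume_down"            # wrist rolled downward

# First indication at time t1 -> first input command.
print(command_for_digit_to_digit(5.0))    # select
# Another indication at a later time t2, with a different roll -> distinct command.
print(command_for_digit_to_digit(60.0))   # volume_up
```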
  • Patent number: 12158992
    Abstract: Methods of interpreting in-air hand gestures based on orientations of a wrist-wearable device are provided. The method includes receiving, from sensors of a wrist-wearable device, data associated with performance of an in-air hand gesture during a first period of time. The method includes, when determining that the data indicates that the wrist-wearable device had a first orientation when the in-air hand gesture was performed during the first period of time, causing performance of a first operation. The method includes receiving, from sensors of the wrist-wearable device, new data associated with performance of the in-air hand gesture during a second period of time, the second period of time being after the first period of time. The method includes, when determining that the new data indicates that the wrist-wearable device had a second orientation when the in-air hand gesture was performed during the second period of time, causing performance of a second operation.
    Type: Grant
    Filed: May 4, 2023
    Date of Patent: December 3, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Chengyuan Yan, Willy Huang, Joseph Davis Greer, Sheng Shen, Szeyin Lee
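A hypothetical way to express the orientation-dependent behavior above is a dispatch table keyed by (gesture, orientation), so the same in-air gesture triggers different operations in different orientations. The orientation labels and operations are placeholders, not the granted claims.

```python
from typing import Callable, Dict, Tuple

# Same in-air gesture, different operation depending on device orientation.
OPERATIONS: Dict[Tuple[str, str], Callable[[], None]] = {
    ("pinch", "display_up"):   lambda: print("open camera"),       # first orientation
    ("pinch", "display_side"): lambda: print("play/pause media"),  # second orientation
}

def handle_in_air_gesture(gesture: str, orientation: str) -> None:
    """Cause performance of the operation registered for this combination."""
    operation = OPERATIONS.get((gesture, orientation))
    if operation is not None:
        operation()

handle_in_air_gesture("pinch", "display_up")    # during a first period of time
handle_in_air_gesture("pinch", "display_side")  # during a later period of time
```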
  • Patent number: 12153724
    Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment; identifying one or more features from the first image to estimate a pose of the hand of the user; estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller; receiving IMU data of the controller; and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment, providing consistently accurate controller tracking.
    Type: Grant
    Filed: April 13, 2022
    Date of Patent: November 26, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Tsz Ho Yu, Chengyuan Yan, Christian Forster
  • Publication number: 20240370099
    Abstract: Example implementations are for tracking an artificial reality input device by receiving, for the input device, video tracking data and inertial motion unit ("IMU") data based on motion input. Example implementations generate, from the video tracking data, a video tracking position and a video tracking velocity; generate, from the IMU data, an IMU orientation and an IMU linear acceleration; and generate, from the IMU orientation and the IMU linear acceleration, an IMU linear velocity. Example implementations determine whether the video tracking position and the video tracking velocity are reliable and determine, by a Kalman filter for the input device, a current bias, a current velocity, and a current position.
    Type: Application
    Filed: March 27, 2024
    Publication date: November 7, 2024
    Inventors: Chengyuan Yan, Sheng Shen
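A compact sketch of the kind of filter the abstract above describes: a per-axis Kalman filter whose state holds position, velocity, and an accelerometer bias, predicted from IMU linear acceleration and corrected with the video-tracking position and velocity only when that measurement is judged reliable. The state layout, noise values, and class name are assumptions made for illustration.

```python
import numpy as np

class InputDeviceFilter:
    """Per-axis Kalman filter over [position, velocity, accelerometer bias]."""

    def __init__(self, dt: float):
        self.x = np.zeros(3)                      # [position, velocity, bias]
        self.P = np.eye(3)                        # state covariance
        self.F = np.array([[1.0, dt, -0.5 * dt**2],
                           [0.0, 1.0, -dt],
                           [0.0, 0.0, 1.0]])      # bias is subtracted from acceleration
        self.B = np.array([0.5 * dt**2, dt, 0.0])
        self.Q = np.diag([1e-4, 1e-3, 1e-6])      # process noise
        self.H = np.array([[1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0]])      # measure position and velocity
        self.R = np.diag([1e-3, 1e-2])            # video-tracking noise

    def predict(self, imu_linear_accel: float) -> None:
        """Propagate the state with one IMU linear-acceleration sample."""
        self.x = self.F @ self.x + self.B * imu_linear_accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, video_pos: float, video_vel: float, reliable: bool) -> None:
        """Fuse the video-tracking measurement only when it is reliable."""
        if not reliable:
            return                                # coast on the IMU prediction
        z = np.array([video_pos, video_vel])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ self.H) @ self.P
```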
  • Patent number: 12093464
    Abstract: A method of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device is disclosed. The method includes receiving an indication, at a first point in time, that a user donning the wrist-wearable device is providing a digit-to-digit gesture in which one of the user's digits touches another of the user's digits without contacting the display of the wearable device. The method further includes, in accordance with a determination that the digit-to-digit gesture is provided while data indicates that the wrist-wearable device has a first roll value, causing a target device in communication with the wearable device to perform a first input command; and, upon receiving another indication, at a second point in time after the first point in time, that the user is providing the digit-to-digit gesture again, causing the target device to perform a second input command that is distinct from the first input command.
    Type: Grant
    Filed: August 31, 2022
    Date of Patent: September 17, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Yinglin Li, Qian Chen, Chengyuan Yan
  • Publication number: 20240296289
    Abstract: In one embodiment, a method includes rendering a first output image comprising one or more augmented-reality (AR) objects for displays of an AR rendering device of an AR system associated with a first user. The method further includes accessing sensor signals associated with the first user. The one or more sensor signals may be captured by sensors of the AR system. The method further includes detecting a change in a context of the first user with respect to a real-world environment based on the sensor signals. The method further includes rendering a second output image comprising the AR objects for the displays of the AR rendering device. One or more of the AR objects may be adapted based on the detected change in the context of the first user.
    Type: Application
    Filed: March 6, 2024
    Publication date: September 5, 2024
    Inventors: Yiming Pu, Christopher E. Balmes, Gabrielle Catherine Moskey, John Jacob Blakeley, Amy Lawson Bearman, Alireza Dirafzoon, Matthew Dan Feiszli, Ganesh Venkatesh, Babak Damavandi, Jiwen Ren, Chengyuan Yan, Guangqiang Dong
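A toy sketch of the render-adapt loop implied by the abstract above: detect a change in the user's context from sensor signals and adapt the AR objects before rendering the next output image. The context labels, adaptation rule, and data types are invented for illustration only.

```python
from dataclasses import dataclass, replace
from typing import Dict, List

@dataclass
class ArObject:
    name: str
    scale: float
    visible: bool = True

def detect_context(sensor_signals: Dict[str, float]) -> str:
    """Hypothetical context detector, e.g. stationary vs. walking from IMU speed."""
    return "walking" if sensor_signals.get("speed_mps", 0.0) > 0.5 else "stationary"

def adapt_objects(objects: List[ArObject], context: str) -> List[ArObject]:
    """Adapt AR objects to the detected context, e.g. shrink and hide
    non-essential objects while the user is walking."""
    if context == "walking":
        return [replace(o, scale=o.scale * 0.5, visible=(o.name == "navigation"))
                for o in objects]
    return objects

objects = [ArObject("navigation", 1.0), ArObject("news_feed", 1.0)]
first_frame = adapt_objects(objects, detect_context({"speed_mps": 0.0}))   # first output image
second_frame = adapt_objects(objects, detect_context({"speed_mps": 1.2}))  # adapted second image
```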
  • Publication number: 20240168567
    Abstract: The various implementations described herein include methods and systems for power-efficient processing of neuromuscular signals. In one aspect, a method includes: (i) obtaining a first set of neuromuscular signals; (ii) after determining, using a low-power detector, that the first set of neuromuscular signals requires further processing to confirm that a predetermined in-air hand gesture has been performed: (a) processing the first set of neuromuscular signals using a high-power detector; and (b) in accordance with a determination that the processing indicates that the predetermined in-air hand gesture did occur, registering an occurrence of the predetermined in-air hand gesture; (iii) receiving a second set of neuromuscular signals; and (iv) after determining, using the low-power detector and not using the high-power detector, that a different predetermined in-air hand gesture was performed, performing an action in response to the different predetermined in-air hand gesture.
    Type: Application
    Filed: September 19, 2023
    Publication date: May 23, 2024
    Inventors: Alexandre Barachant, Bijan Treister, Shan Chu, Igor Gurovski, Chetan Parag Gupta, Tahir Turan Caliskan, Pascal Alexander Bentioulis, Viswanath Sivakumar, Zhong Zhang, Ramzi Elkhater, Maciej Lazarewicz, Per-Erik Bergstrom, Peter Andrew Matsimanis, Chengyuan Yan
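The two-stage, power-efficient cascade can be sketched as below; the energy thresholds and detector internals are placeholders. The point is that the expensive high-power detector runs only when the cheap low-power detector cannot decide on its own.

```python
from typing import Sequence

def low_power_detector(emg_window: Sequence[float]) -> str:
    """Cheap first-stage check over a window of neuromuscular (EMG) samples.
    Returns 'reject', 'confirm', or 'needs_high_power' (thresholds are made up)."""
    energy = sum(abs(s) for s in emg_window) / len(emg_window)
    if energy < 0.1:
        return "reject"
    if energy > 0.8:
        return "confirm"              # e.g. a coarse, easily separable gesture
    return "needs_high_power"

def high_power_detector(emg_window: Sequence[float]) -> bool:
    """Expensive second stage (stand-in for a larger model), run only when needed."""
    return sum(abs(s) for s in emg_window) / len(emg_window) > 0.4

def gesture_occurred(emg_window: Sequence[float]) -> bool:
    """Register a gesture occurrence, invoking the high-power detector only
    when the low-power detector asks for confirmation."""
    verdict = low_power_detector(emg_window)
    if verdict == "needs_high_power":
        return high_power_detector(emg_window)
    return verdict == "confirm"
```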
  • Patent number: 11966701
    Abstract: In one embodiment, a method includes rendering a first output image comprising one or more augmented-reality (AR) objects for displays of an AR rendering device of an AR system associated with a first user. The method further includes accessing sensor signals associated with the first user. The one or more sensor signals may be captured by sensors of the AR system. The method further includes detecting a change in a context of the first user with respect to a real-world environment based on the sensor signals. The method further includes rendering a second output image comprising the AR objects for the displays of the AR rendering device. One or more of the AR objects may be adapted based on the detected change in the context of the first user.
    Type: Grant
    Filed: August 2, 2021
    Date of Patent: April 23, 2024
    Assignee: Meta Platforms, Inc.
    Inventors: Yiming Pu, Christopher E. Balmes, Gabrielle Catherine Moskey, John Jacob Blakeley, Amy Lawson Bearman, Alireza Dirafzoon, Matthew Dan Feiszli, Ganesh Venkatesh, Babak Damavandi, Jiwen Ren, Chengyuan Yan, Guangqiang Dong
  • Publication number: 20240103626
    Abstract: The disclosed apparatus may include a wearable haptic ring that features input capabilities relative to a computing system. In various examples, the wearable haptic ring may be designed to curve around a human finger of a wearer with a touchpad that is seamlessly integrated with the ring. For example, the seamlessly integrated touchpad may be operable by another finger of the wearer. Moreover, the haptic ring may include a haptic feedback unit designed to provide haptic feedback in response to input from the wearer. As such, the haptic ring may enable a wide range of user inputs while appearing like a typical ring rather than a computer input/output device. Various other implementations are also disclosed.
    Type: Application
    Filed: September 20, 2023
    Publication date: March 28, 2024
    Inventors: Dan Kun-yi Chen, Chengyuan Yan
  • Publication number: 20230281938
    Abstract: The various implementations described herein include methods and systems for providing input capabilities at various fidelity levels. In one aspect, a method includes receiving, from an application, a request identifying an input capability for making an input operation available within the application. The method further includes, in response to receiving the request: identifying techniques that the artificial-reality system can use to make the requested input capability available to the application using data from one or more devices; selecting a first technique for making the requested input capability available to the application; and using the first technique to provide data to the application to allow for performance of the requested input capability.
    Type: Application
    Filed: February 27, 2023
    Publication date: September 7, 2023
    Inventors: Chengyuan Yan, Joseph Davis Greer, Sheng Shen, Anurag Sharma
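A minimal sketch of the request flow, assuming a hypothetical registry of techniques per capability: the application asks for an input capability, the system identifies candidate techniques, and the first technique the connected devices can support is selected and used to serve data to the application.

```python
from typing import Dict, List

# Hypothetical registry: techniques that can satisfy a requested input
# capability, ordered from highest to lowest fidelity.
TECHNIQUES: Dict[str, List[str]] = {
    "pointing":   ["controller_6dof", "camera_hand_tracking", "wrist_imu"],
    "text_entry": ["physical_keyboard", "emg_wristband"],
}

def connected_devices() -> List[str]:
    """Devices currently available to the artificial-reality system (illustrative)."""
    return ["camera_hand_tracking", "wrist_imu"]

def fulfill_request(capability: str) -> str:
    """Identify candidate techniques for the requested capability and select
    the first one the available devices can provide."""
    devices = set(connected_devices())
    for technique in TECHNIQUES.get(capability, []):
        if technique in devices:
            return technique
    raise ValueError(f"no technique available for capability {capability!r}")

print(fulfill_request("pointing"))  # -> camera_hand_tracking
```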
  • Patent number: 11670082
    Abstract: Systems and methods for providing a virtual space for multiple devices can include a first device having at least one sensor configured to acquire spatial information of a physical space of the first device. The first device may include at least one processor configured to establish, according to the acquired spatial information, a virtual space corresponding to the physical space, which is accessible by a user of the first device via the first device. The at least one processor may further be configured to register a second device within the physical space, to allow a user of the second device to access the virtual space via the second device.
    Type: Grant
    Filed: December 6, 2021
    Date of Patent: June 6, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Chengyuan Yan, Amrutha Hakkare Arunachala, Chengyuan Lin, Anush Mohan, Ke Huo
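A bare-bones sketch of the flow described above: the first device establishes a virtual space from its scanned spatial information, then a second device is registered within the same physical space so its user can access that virtual space. The names and the shape of the spatial information are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class VirtualSpace:
    """Virtual space anchored to a physical space scanned by the first device."""
    extent_m: Tuple[float, float, float]
    registered_devices: List[str] = field(default_factory=list)

def establish_virtual_space(spatial_info: Dict[str, Tuple[float, float, float]],
                            first_device: str) -> VirtualSpace:
    """Establish the virtual space from the first device's spatial information."""
    space = VirtualSpace(extent_m=spatial_info["extent_m"])
    space.registered_devices.append(first_device)
    return space

def register_device(space: VirtualSpace, device_id: str) -> None:
    """Register a second device so its user can access the same virtual space."""
    space.registered_devices.append(device_id)

space = establish_virtual_space({"extent_m": (4.0, 3.0, 2.5)}, "headset_A")
register_device(space, "headset_B")
```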
  • Publication number: 20230076068
    Abstract: A method of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device is disclosed. The method includes receiving an indication, at a first point in time, that a user donning the wrist-wearable device is providing a digit-to-digit gesture in which one of the user's digits touches another of the user's digits without contacting the display of the wearable device. The method further includes, in accordance with a determination that the digit-to-digit gesture is provided while data indicates that the wrist-wearable device has a first roll value, causing a target device in communication with the wearable device to perform a first input command; and, upon receiving another indication, at a second point in time after the first point in time, that the user is providing the digit-to-digit gesture again, causing the target device to perform a second input command that is distinct from the first input command.
    Type: Application
    Filed: August 31, 2022
    Publication date: March 9, 2023
    Inventors: Yinglin Li, Qian Chen, Chengyuan Yan
  • Patent number: 11521356
    Abstract: Systems and methods for maintaining a shared interactive environment include receiving, by a server, requests to register a first input device of a first user and a second input device of a second user with a shared interactive environment. The first input device may be for a first modality involving user input for an augmented reality (AR) environment, and the second input device may be for a second modality involving user input for a personal computer (PC) based virtual environment or a virtual reality (VR) environment. The server may register the first and second input device with the shared interactive environment. The server may receive inputs from a first adapter for the first modality and from a second adapter for the second modality. The inputs may be for the first and second user to use the shared interactive environment.
    Type: Grant
    Filed: October 10, 2019
    Date of Patent: December 6, 2022
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Chengyuan Yan, Ke Huo, Amrutha Hakkare Arunachala, Chengyuan Lin, Anush Mohan
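The server-side routing can be pictured with a toy class: devices register with a modality, and per-modality adapters translate their raw inputs into a common event stream for the shared interactive environment. The adapter signatures and event format are illustrative only.

```python
from typing import Callable, Dict, List

class SharedEnvironmentServer:
    """Registers input devices for different modalities (AR, PC/VR) and routes
    their inputs through per-modality adapters into one shared event stream."""

    def __init__(self) -> None:
        self.adapters: Dict[str, Callable[[dict], dict]] = {}
        self.devices: Dict[str, str] = {}   # device_id -> modality
        self.events: List[dict] = []

    def register_adapter(self, modality: str, adapter: Callable[[dict], dict]) -> None:
        self.adapters[modality] = adapter

    def register_device(self, device_id: str, modality: str) -> None:
        self.devices[device_id] = modality

    def receive_input(self, device_id: str, raw_input: dict) -> None:
        modality = self.devices[device_id]
        self.events.append(self.adapters[modality](raw_input))

server = SharedEnvironmentServer()
server.register_adapter("ar", lambda raw: {"action": raw["gesture"]})
server.register_adapter("vr", lambda raw: {"action": raw["button"]})
server.register_device("phone_1", "ar")       # first user's AR input device
server.register_device("headset_1", "vr")     # second user's VR input device
server.receive_input("phone_1", {"gesture": "tap"})
server.receive_input("headset_1", {"button": "trigger"})
```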
  • Publication number: 20220374130
    Abstract: In one embodiment, a method includes rendering a first output image comprising one or more augmented-reality (AR) objects for displays of an AR rendering device of an AR system associated with a first user. The method further includes accessing sensor signals associated with the first user. The one or more sensor signals may be captured by sensors of the AR system. The method further includes detecting a change in a context of the first user with respect to a real-world environment based on the sensor signals. The method further includes rendering a second output image comprising the AR objects for the displays of the AR rendering device. One or more of the AR objects may be adapted based on the detected change in the context of the first user.
    Type: Application
    Filed: August 2, 2021
    Publication date: November 24, 2022
    Inventors: Yiming Pu, Christopher E. Balmes, Gabrielle Catherine Moskey, John Jacob Blakeley, Amy Lawson Bearman, Alireza Dirafzoon, Matthew Dan Feiszli, Ganesh Venkatesh, Babak Damavandi, Jiwen Ren, Chengyuan Yan, Guangqiang Dong
  • Patent number: 11500413
    Abstract: In one embodiment, a method includes receiving, from a controller, a data packet including (1) a plurality of samples each corresponding to measurements from a motion sensor and (2) a timestamp corresponding to a measurement time of one of the samples as measured by a clock of the controller; determining, based on the timestamp, an estimated measurement time relative to a local clock for each of the plurality of samples that is not associated with the timestamp; and converting each of the timestamp and the estimated measurement times to a corresponding synchronization time using a learned relationship relating the clock of the controller and the local clock. The learned relationship is iteratively learned based on previously received data packets from the controller. The synchronization time associated with each of the plurality of samples represents an estimated time, relative to the local clock, at which the sample was measured.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: November 15, 2022
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Boyang Zhang, Adrian Brian Ratter, Chengyuan Yan, Jack Hood Profit, Jr., Paul Austin Buckley
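A simplified sketch of the synchronization idea: keep refining a learned mapping from the controller clock to the local clock with each received packet, then assign every sample in the packet an estimated local-clock measurement time, since only one sample per packet carries a timestamp. The linear clock model, the learning rule, and the fixed sample interval are assumptions made for illustration.

```python
from typing import List

class ClockMapper:
    """Iteratively learned mapping: local_time ~ scale * controller_time + offset."""

    def __init__(self, learning_rate: float = 0.1) -> None:
        self.scale, self.offset = 1.0, 0.0
        self.lr = learning_rate

    def observe(self, controller_timestamp: float, local_arrival_time: float) -> None:
        """Refine the mapping with one (controller timestamp, local arrival) pair."""
        error = local_arrival_time - self.to_local(controller_timestamp)
        self.offset += self.lr * error

    def to_local(self, controller_time: float) -> float:
        return self.scale * controller_time + self.offset

def synchronize_packet(mapper: ClockMapper, packet_timestamp: float,
                       sample_interval: float, num_samples: int,
                       local_arrival: float) -> List[float]:
    """Estimate a local-clock measurement time for every sample in the packet;
    the timestamp is assumed to belong to the newest sample."""
    mapper.observe(packet_timestamp, local_arrival)
    controller_times = [packet_timestamp - i * sample_interval
                        for i in reversed(range(num_samples))]
    return [mapper.to_local(t) for t in controller_times]
```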
  • Publication number: 20220253131
    Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment; identifying one or more features from the first image to estimate a pose of the hand of the user; estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller; receiving IMU data of the controller; and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment, providing consistently accurate controller tracking.
    Type: Application
    Filed: April 13, 2022
    Publication date: August 11, 2022
    Inventors: Tsz Ho Yu, Chengyuan Yan, Christian Forster
  • Patent number: 11320896
    Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment; identifying one or more features from the first image to estimate a pose of the hand of the user; estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller; receiving IMU data of the controller; and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment, providing consistently accurate controller tracking.
    Type: Grant
    Filed: August 3, 2020
    Date of Patent: May 3, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Tsz Ho Yu, Chengyuan Yan, Christian Forster
  • Publication number: 20220035441
    Abstract: In one embodiment, a method includes capturing, using one or more cameras implemented in a wearable device worn by a user, a first image depicting at least a part of a hand of the user holding a controller in an environment; identifying one or more features from the first image to estimate a pose of the hand of the user; estimating a first pose of the controller based on the pose of the hand of the user and an estimated grip that defines a relative pose between the hand of the user and the controller; receiving IMU data of the controller; and estimating a second pose of the controller by updating the first pose of the controller using the IMU data of the controller. The method utilizes multiple data sources to track the controller under various conditions of the environment, providing consistently accurate controller tracking.
    Type: Application
    Filed: August 3, 2020
    Publication date: February 3, 2022
    Inventors: Tsz Ho Yu, Chengyuan Yan, Christian Forster
  • Patent number: 11195020
    Abstract: Systems and methods for providing a virtual space for multiple devices can include a first device having at least one sensor configured to acquire spatial information of a physical space of the first device. The first device may include at least one processor configured to establish, according to the acquired spatial information, a virtual space corresponding to the physical space, which is accessible by a user of the first device via the first device. The at least one processor may further be configured to register a second device within the physical space, to allow a user of the second device to access the virtual space via the second device.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: December 7, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Chengyuan Yan, Amrutha Hakkare Arunachala, Chengyuan Lin, Anush Mohan, Ke Huo