Patents by Inventor Chaowei MA

Chaowei MA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240078423
    Abstract: A vision transformer (ViT) is a deep learning model that performs one or more vision processing tasks. ViTs may be modified to include a global task that clusters images with the same concept together to produce semantically consistent relational representations, as well as a local task that guides the ViT to discover object-centric semantic correspondence across images. A database of concepts and associated features may be created and used to train the global and local tasks, which may then enable the ViT to perform visual relational reasoning faster, without supervision, and beyond synthetic domains. A minimal sketch of the two training objectives appears after this listing.
    Type: Application
    Filed: August 22, 2022
    Publication date: March 7, 2024
    Inventors: Xiaojian Ma, Weili Nie, Zhiding Yu, Huaizu Jiang, Chaowei Xiao, Yuke Zhu, Anima Anandkumar
  • Publication number: 20230086694
    Abstract: In accordance with one implementation of the present disclosure, a new approach is proposed for determining a movement orientation of a user in indoor navigation. Generally speaking, a device orientation of a terminal device is obtained based on at least one signal stream collected from the terminal device carried by a moving user. A deviation degree is determined based on the at least one signal stream, where the deviation degree represents a deviation between the movement orientation of the user and the actual device orientation of the terminal device. The movement orientation is determined based on the device orientation in accordance with a determination that the deviation degree is below a threshold degree. With the above implementation, the movement orientation of the user is determined in a more effective and accurate way, and the accuracy of the indoor navigation is thus increased. A minimal sketch of this thresholding step appears after this listing.
    Type: Application
    Filed: November 3, 2022
    Publication date: March 23, 2023
    Applicant: BEIJING DIDI INFINITY TECHNOLOGY AND DEVELOPMENT CO., LTD.
    Inventors: Xiaoqiang TENG, Pengfei XU, Chaowei MA, Bin XU, Jun ZHANG, Yiping MENG, Runbo HU, Hua CHAI
  • Publication number: 20230077619
    Abstract: In accordance with one implementation of the present disclosure, a new approach is proposed for identifying a stepping event in indoor navigation. Generally speaking, a first signal fragment and a second signal fragment are obtained within a first time window and a second time window, respectively, in an acceleration signal stream, where the acceleration signal stream is collected from an acceleration sensor associated with a moving user and the first time window is shorter than the second time window. A first amplitude feature and a second amplitude feature are determined for the first and second time windows based on the first and second signal fragments, respectively. A stepping event of the user is identified based on a deviation between the first and second amplitude features. With the above implementation, the stepping event is identified in a more effective and accurate way, and the accuracy of the indoor navigation is thus increased. A minimal sketch of this two-window comparison appears after this listing.
    Type: Application
    Filed: November 2, 2022
    Publication date: March 16, 2023
    Applicant: BEIJING DIDI INFINITY TECHNOLOGY AND DEVELOPMENT CO., LTD.
    Inventors: Xiaoqiang TENG, Pengfei XU, Chaowei MA, Bin XU, Jun ZHANG, Yiping MENG, Runbo HU, Hua CHAI
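
The sketch below illustrates the kind of dual objective described in publication 20240078423: a global, clustering-style loss that pulls images of the same concept toward a shared prototype drawn from a concept database, and a local loss that encourages patch-level correspondence across same-concept images. It is a minimal illustration, not the claimed method: the toy encoder stands in for the ViT, the concept database is a bare tensor of prototypes, and both losses are generic stand-ins for the objectives in the filing.

```python
# Hedged sketch (assumptions: PyTorch; a toy encoder replaces the actual ViT; the concept
# database is a plain tensor of prototype vectors -- all names and shapes are hypothetical).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyViT(nn.Module):
    """Stand-in encoder: returns a global token and per-patch tokens, as a real ViT would."""
    def __init__(self, dim=64):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                        # x: (B, P, dim) pre-extracted patch embeddings
        tokens = self.proj(x)                    # (B, P, dim) local patch tokens
        return tokens.mean(dim=1), tokens        # global token (CLS-like), local tokens

def global_concept_loss(global_tok, concept_ids, prototypes, temp=0.1):
    """Clustering-style global task: pull each image toward its concept prototype."""
    logits = F.normalize(global_tok, dim=-1) @ F.normalize(prototypes, dim=-1).T / temp
    return F.cross_entropy(logits, concept_ids)

def local_correspondence_loss(tokens_a, tokens_b):
    """Local task proxy: each patch of image A should match some patch of a same-concept
    image B confidently (low entropy over its soft-match distribution)."""
    sim = F.normalize(tokens_a, dim=-1) @ F.normalize(tokens_b, dim=-1).transpose(1, 2)
    attn = sim.softmax(dim=-1)                   # (B, P, P) soft patch-to-patch matches
    entropy = -(attn * attn.clamp_min(1e-8).log()).sum(dim=-1)
    return entropy.mean()

# Toy usage with random "patch embeddings" for two views that share concept labels.
vit = ToyViT()
prototypes = torch.randn(10, 64)                 # 10 hypothetical concepts in the database
xa, xb = torch.randn(4, 16, 64), torch.randn(4, 16, 64)
ids = torch.randint(0, 10, (4,))
ga, ta = vit(xa)
gb, tb = vit(xb)
loss = global_concept_loss(ga, ids, prototypes) + local_correspondence_loss(ta, tb)
loss.backward()
```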
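
Publication 20230086694 turns on a simple decision: trust the device orientation as the user's movement orientation only when a deviation degree falls below a threshold. The sketch below shows that thresholding step in isolation; the headings, the 20-degree threshold, and the fallback estimate are illustrative assumptions, not values or behavior taken from the filing.

```python
# Hedged sketch of the thresholding idea (assumptions: headings are in degrees, the
# movement-heading estimate comes from some upstream dead-reckoning step, and the
# 20-degree threshold is illustrative -- none of these values come from the filing).
def angular_difference(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two headings, in degrees (0..180)."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def movement_orientation(device_heading_deg: float,
                         estimated_movement_heading_deg: float,
                         threshold_deg: float = 20.0) -> float:
    """Use the device orientation as the user's movement orientation only when the
    deviation degree between the two headings is below the threshold."""
    deviation = angular_difference(device_heading_deg, estimated_movement_heading_deg)
    if deviation < threshold_deg:
        return device_heading_deg            # device orientation represents the movement
    return estimated_movement_heading_deg    # otherwise fall back to the derived estimate

print(movement_orientation(92.0, 100.0))     # 92.0 -- deviation of 8 degrees, below threshold
print(movement_orientation(150.0, 30.0))     # 30.0 -- deviation of 120 degrees, above threshold
```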
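
Publication 20230077619 compares an amplitude feature over a short time window with one over a longer window of the same acceleration stream. The sketch below reads that comparison as peak-to-peak amplitudes with a ratio test and a noise floor; the feature choice, the constants, and the refractory logic are illustrative assumptions rather than the method claimed in the filing.

```python
# Hedged sketch of the two-window idea (assumptions: the amplitude feature is peak-to-peak
# magnitude, a step is declared when the short-window amplitude accounts for most of the
# long-window amplitude above a noise floor, and every constant here is illustrative).
from collections import deque

def detect_steps(accel_magnitude, short_len=10, long_len=50,
                 agreement_ratio=0.8, noise_floor=1.0):
    """Yield sample indices treated as stepping events: the recent (short-window) amplitude
    explains most of the long-window amplitude, i.e. the swing just happened."""
    short_w = deque(maxlen=short_len)
    long_w = deque(maxlen=long_len)
    last_step = -short_len                           # simple refractory to avoid duplicates
    for i, a in enumerate(accel_magnitude):
        short_w.append(a)
        long_w.append(a)
        if len(long_w) < long_len:
            continue                                 # wait until both windows are full
        short_amp = max(short_w) - min(short_w)      # first amplitude feature
        long_amp = max(long_w) - min(long_w)         # second amplitude feature
        if (long_amp > noise_floor
                and short_amp >= agreement_ratio * long_amp
                and i - last_step >= short_len):
            last_step = i
            yield i

# Toy usage: a mostly flat magnitude trace (~9.8 m/s^2) with two brief swings.
signal = [9.8] * 60 + [12.5] + [9.8] * 40 + [13.0] + [9.8] * 40
print(list(detect_steps(signal)))                    # prints [60, 101]
```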