Patents by Inventor Chenchen Jiang

Chenchen Jiang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11654572
    Abstract: The present disclosure provides a robot mapping method, as well as a robot and a computer readable storage medium using the same. The method includes: detecting, in a current scene, a marker carrying identification information that the robot can identify; determining whether the detected marker meets a preset condition; and mapping the current scene based on the marker if the detected marker meets the preset condition. The robot mapping method can not only map the current scene, but also effectively reduce the difficulty of loop closure and the number of false loop closures.
    Type: Grant
    Filed: July 22, 2020
    Date of Patent: May 23, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Rui Guo, Kun Xie, Chenchen Jiang, Zhichao Liu, Jianxin Pang, Youjun Xiong
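
The patent above (11654572) does not state what the "preset condition" is. Purely as an illustration, the Python sketch below checks one hypothetical condition: the marker must have been observed in at least a minimum number of frames, each from within a maximum range. The function name and both thresholds are assumptions, not the claimed condition.

        def marker_meets_condition(observed_ranges_m, min_views=3, max_range_m=2.0):
            # Hypothetical "preset condition": the marker was seen in at least
            # `min_views` frames, each from within `max_range_m` meters.
            close_views = sum(1 for r in observed_ranges_m if r <= max_range_m)
            return close_views >= min_views

        # Example: four sightings, one from too far away -> still acceptable
        print(marker_meets_condition([0.8, 1.2, 3.5, 1.9]))  # True
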
  • Patent number: 11579624
    Abstract: The present disclosure provides an autonomous mobile apparatus and a control method thereof. The method includes: starting a SLAM mode; obtaining first image data captured by a first camera; extracting a first tag image of positioning tag(s) from the first image data; calculating three-dimensional camera coordinates of feature points of the positioning tag(s) in a first camera coordinate system of the first camera based on the first tag image; calculating three-dimensional world coordinates of the feature points of the positioning tag(s) in a world coordinate system based on the three-dimensional camera coordinates and a first camera pose of the first camera in the world coordinate system at the time the first image data was obtained; and generating a map file based on the three-dimensional world coordinates of the feature points of the positioning tag(s).
    Type: Grant
    Filed: July 22, 2020
    Date of Patent: February 14, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Rui Guo, Chenchen Jiang, Kun Xie, Zhichao Liu, Youjun Xiong, Jianxin Pang
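
In the tag-based SLAM method above (11579624), turning tag feature points from camera coordinates into world coordinates is, in essence, a rigid-body transform with the camera pose. A minimal Python sketch, assuming the pose is supplied as a rotation matrix and a translation vector (the names are illustrative):

        import numpy as np

        def tag_points_to_world(points_cam, R_wc, t_wc):
            # points_cam: (N, 3) tag feature points in the first camera frame
            # R_wc (3x3), t_wc (3,): first camera pose expressed in the world frame
            return (R_wc @ np.asarray(points_cam, dtype=float).T).T + t_wc

        # Example: camera axis-aligned with the world, mounted 1 m above the origin
        pts_world = tag_points_to_world([[0.1, 0.0, 0.5]], np.eye(3), np.array([0.0, 0.0, 1.0]))
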
  • Patent number: 11423646
    Abstract: A loop closure detection method, a mobile device, and a computer readable storage medium are provided. The method includes: collecting images in different detection directions simultaneously through C0 cameras installed on the mobile device to obtain an image data group comprising C0 images; calculating feature information of each image in the image data group; performing a loop closure detection in C0 sub-threads respectively based on the feature information to obtain a loop closure detection result of each sub-thread; and determining a loop closure detection result of the mobile device based on the loop closure detection result of each sub-thread. In this manner, cross detections in a plurality of detection directions can be realized, which overcomes the path-direction limitation of loop closure detection in the prior art, avoids repeating paths in the same direction, and greatly improves the mapping efficiency.
    Type: Grant
    Filed: November 20, 2020
    Date of Patent: August 23, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Rui Guo, Zhichao Liu, Chenchen Jiang, Youjun Xiong
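
The method above (11423646) runs one loop-closure detection per camera, each in its own sub-thread, and then fuses the per-thread results. The Python sketch below shows only that structure: detect_loop is a placeholder for any single-camera detector, and the "report a loop if any sub-thread found one" fusion rule is an assumption, not the rule claimed by the patent.

        from concurrent.futures import ThreadPoolExecutor

        def detect_loop(camera_features):
            # Placeholder: match one camera's current features against that
            # camera's keyframe database; return a matched keyframe id or None.
            return None

        def loop_closure_for_device(feature_groups):
            # feature_groups: one feature set per camera, captured at the same time
            with ThreadPoolExecutor(max_workers=len(feature_groups)) as pool:
                results = list(pool.map(detect_loop, feature_groups))
            hits = [r for r in results if r is not None]
            return hits[0] if hits else None  # assumed fusion rule
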
  • Patent number: 11416719
    Abstract: The present disclosure provides a localization method, as well as a helmet and a computer readable storage medium using the same. The method includes: extracting first feature points from a target image; obtaining inertial information of a carrier, and screening the first feature points based on the inertial information to obtain second feature points; triangulating the second feature points of the target image to generate corresponding initial three-dimensional map points if the target image is a key frame image; performing a localization error loopback calibration on the initial three-dimensional map points according to at least a predetermined constraint condition to obtain target three-dimensional map points; and determining a positional point of the carrier according to the target three-dimensional map points. In this manner, the localization accuracy for a moving dynamic object, such as a person, can be improved.
    Type: Grant
    Filed: September 3, 2020
    Date of Patent: August 16, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chenchen Jiang, Zhichao Liu, Yongsheng Zhao, Yu Tang, Jianxin Pang, Youjun Xiong
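
The localization method above (11416719) triangulates the screened feature points of a key frame into initial three-dimensional map points. The abstract does not say how; the sketch below uses standard two-view linear (DLT) triangulation as a stand-in, with illustrative names.

        import numpy as np

        def triangulate_dlt(P1, P2, x1, x2):
            # P1, P2: 3x4 projection matrices of two key frames
            # x1, x2: the same feature observed in each frame, as (u, v) pixels
            A = np.vstack([
                x1[0] * P1[2] - P1[0],
                x1[1] * P1[2] - P1[1],
                x2[0] * P2[2] - P2[0],
                x2[1] * P2[2] - P2[1],
            ])
            # The homogeneous 3-D point is the right singular vector of A with
            # the smallest singular value.
            _, _, Vt = np.linalg.svd(A)
            X = Vt[-1]
            return X[:3] / X[3]
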
  • Patent number: 11285613
    Abstract: The present disclosure provides a robot visual image feature extraction method, as well as an apparatus and a robot using the same. The method includes: collecting image data through visual sensor(s) of the robot, and collecting angular velocity data through inertial sensor(s) of the robot; calculating a relative pose between image frames in the image data based on the angular velocity data; extracting feature points of the first image frame in the image data; calculating a projection position of each feature point of the k-th image frame in the (k+1)-th image frame based on a relative pose between the k-th image frame and the (k+1)-th image frame; and searching for each feature point at its projection position in the (k+1)-th image frame, and performing simultaneous localization and mapping based on the found feature points. In this manner, the feature points of dynamic objects are eliminated.
    Type: Grant
    Filed: November 29, 2019
    Date of Patent: March 29, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chenchen Jiang, Youjun Xiong, Longbiao Bai, Simin Zhang, Jianxin Pang
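
In the feature extraction method above (11285613), each feature of frame k is searched for near a projection position in frame k+1 that is predicted from the gyroscope-derived relative pose. If the inter-frame motion is treated as a pure rotation, the prediction reduces to the infinite homography K R K^-1; the sketch below makes that simplifying assumption, which the patent itself does not require.

        import numpy as np

        def predict_position_in_next_frame(pt_k, K, R_k_to_k1):
            # pt_k: (u, v) pixel position of a feature in frame k
            # K: 3x3 camera intrinsics; R_k_to_k1: gyro-derived relative rotation
            x = np.array([pt_k[0], pt_k[1], 1.0])
            x_pred = K @ R_k_to_k1 @ np.linalg.inv(K) @ x
            return x_pred[:2] / x_pred[2]

Features whose actual match lands far from the predicted position are likely to lie on moving objects and can be discarded, which is the effect the abstract describes.
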
  • Patent number: 11260529
    Abstract: The present disclosure provides a virtual rail based cruise method, as well as an apparatus and a robot using the same. The method includes: obtaining a digital map including a virtual rail; performing a path planning based on the virtual rail, a current position of the robot, and a cruise end point to obtain a cruise path; and obtaining control parameter(s) of the robot by calculating through a preset path tracking algorithm based on the cruise path and the current position of the robot, and controlling the robot based on the control parameter(s). In this manner, it solves the problems of the prior art that laying a rail or installing an auxiliary device causes high cost and inconvenient usage, and that the rail has to be re-laid or the auxiliary device reinstalled whenever the route is changed.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: March 1, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Jian Zhang, Youjun Xiong, Zhichao Liu, Longbiao Bai, Chenchen Jiang, Simin Zhang, Hongjian Liu, Zhanjia Bi, Yongsheng Zhao, Jianxin Pang
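
The cruise method above (11260529) only requires "a preset path tracking algorithm". Pure pursuit is one common choice and is used below purely as an illustrative stand-in; the way-point format and the look-ahead distance are assumptions.

        import numpy as np

        def pure_pursuit_curvature(pose, path_xy, lookahead=0.5):
            # pose: (x, y, heading in radians); path_xy: (N, 2) way-points of the
            # cruise path derived from the virtual rail. Returns a steering
            # curvature; angular velocity = curvature * linear velocity.
            x, y, theta = pose
            d = np.hypot(path_xy[:, 0] - x, path_xy[:, 1] - y)
            idx = np.argmax(d >= lookahead) if np.any(d >= lookahead) else len(path_xy) - 1
            dx, dy = path_xy[idx, 0] - x, path_xy[idx, 1] - y
            local_y = -np.sin(theta) * dx + np.cos(theta) * dy  # lateral offset
            return 2.0 * local_y / max(lookahead ** 2, 1e-9)
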
  • Publication number: 20220036065
    Abstract: The present disclosure provides a loop closure detection method, a mobile device, and a computer readable storage medium. The method includes: collecting images in different detection directions simultaneously through C0 cameras installed on the mobile device to obtain an image data group comprising C0 images; calculating feature information of each image in the image data group; performing a loop closure detection in C0 sub-threads respectively based on the feature information to obtain a loop closure detection result of each sub-thread; and determining a loop closure detection result of the mobile device based on the loop closure detection result of each sub-thread. In this manner, cross detections in a plurality of detection directions can be realized, which overcomes the path-direction limitation of loop closure detection in the prior art, avoids repeating paths in the same direction, and greatly improves the mapping efficiency.
    Type: Application
    Filed: November 20, 2020
    Publication date: February 3, 2022
    Inventors: Rui Guo, Zhichao Liu, Chenchen Jiang, Youjun Xiong
  • Publication number: 20210197388
    Abstract: The present disclosure provides a robot mapping method, as well as a robot and a computer readable storage medium using the same. The method includes: detecting, in a current scene, a marker carrying identification information that the robot can identify; determining whether the detected marker meets a preset condition; and mapping the current scene based on the marker if the detected marker meets the preset condition. The robot mapping method can not only map the current scene, but also effectively reduce the difficulty of loop closure and the number of false loop closures.
    Type: Application
    Filed: July 22, 2020
    Publication date: July 1, 2021
    Inventors: Rui Guo, Kun Xie, Chenchen Jiang, Zhichao Liu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210191421
    Abstract: The present disclosure provides an autonomous mobile apparatus and a control method thereof. The method includes: starting a SLAM mode; obtaining first image data captured by a first camera; extracting a first tag image of positioning tag(s) from the first image data; calculating three-dimensional camera coordinates of feature points of the positioning tag(s) in a first camera coordinate system of the first camera based on the first tag image; calculating three-dimensional world coordinates of the feature points of the positioning tag(s) in a world coordinate system based on the three-dimensional camera coordinates and a first camera pose of the first camera in the world coordinate system at the time the first image data was obtained; and generating a map file based on the three-dimensional world coordinates of the feature points of the positioning tag(s).
    Type: Application
    Filed: July 22, 2020
    Publication date: June 24, 2021
    Inventors: Rui Guo, Chenchen Jiang, Kun Xie, Zhichao Liu, Youjun Xiong, Jianxin Pang
  • Publication number: 20210182633
    Abstract: The present disclosure provides a localization method, as well as a helmet and a computer readable storage medium using the same. The method includes: extracting first feature points from a target image; obtaining inertial information of a carrier, and screening the first feature points based on the inertial information to obtain second feature points; triangulating the second feature points of the target image to generate corresponding initial three-dimensional map points if the target image is a key frame image; performing a localization error loopback calibration on the initial three-dimensional map points according to at least a predetermined constraint condition to obtain target three-dimensional map points; and determining a positional point of the carrier according to the target three-dimensional map points. In this manner, the localization accuracy for a moving dynamic object, such as a person, can be improved.
    Type: Application
    Filed: September 3, 2020
    Publication date: June 17, 2021
    Inventors: Chenchen Jiang, Zhichao Liu, Yongsheng Zhao, Yu Tang, Jianxin Pang, Youjun Xiong
  • Patent number: 10783661
    Abstract: The present disclosure provides a positioning method and a robot using the same. The method includes: obtaining, through the visual sensor, a current frame image; obtaining, through the ultra-wideband tag, a distance of the robot from an ultra-wideband anchor; performing a feature matching on the current frame image and an adjacent frame image to generate partial map point(s); determining whether the current frame image is a key frame image; and optimizing a pose of the visual sensor corresponding to the key frame image through a joint objective function in response to the current frame image being the key frame image, where the joint objective function at least comprises a distance cost function of the ultra-wideband anchor and a visual residual cost function. Through the above-mentioned method, the positioning accuracy of the robot can be improved.
    Type: Grant
    Filed: May 20, 2019
    Date of Patent: September 22, 2020
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chenchen Jiang, Youjun Xiong, Zhichao Liu
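
The positioning method above (10783661) optimizes the key-frame pose with a joint objective that at least contains a UWB distance cost and a visual residual cost. The abstract gives neither the residual forms nor the weighting; the sketch below assumes a squared reprojection error plus a weighted squared range error, and uses the sensor position alone as a stand-in for the full pose.

        import numpy as np

        def joint_cost(cam_position, observations, project, anchor_pos, uwb_range, w_uwb=1.0):
            # observations: list of (map_point, observed_pixel) pairs for the key frame
            # project(map_point, cam_position): hypothetical projection into pixels
            visual = sum(float(np.sum((project(p, cam_position) - uv) ** 2))
                         for p, uv in observations)
            ranged = (np.linalg.norm(cam_position - anchor_pos) - uwb_range) ** 2
            return visual + w_uwb * ranged
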
  • Publication number: 20200198135
    Abstract: The present disclosure provides a virtual rail based cruise method, as well as an apparatus and a robot using the same. The method includes: obtaining a digital map including a virtual rail; performing a path planning based on the virtual rail, a current position of the robot, and a cruise end point to obtain a cruise path; and obtaining control parameter(s) of the robot by calculating through a preset path tracking algorithm based on the cruise path and the current position of the robot, and controlling the robot based on the control parameter(s). In this manner, it solves the problems of the prior art that laying a rail or installing an auxiliary device causes high cost and inconvenient usage, and that the rail has to be re-laid or the auxiliary device reinstalled whenever the route is changed.
    Type: Application
    Filed: September 27, 2019
    Publication date: June 25, 2020
    Inventors: Jian Zhang, Youjun Xiong, Zhichao Liu, Longbiao Bai, Chenchen Jiang, Simin Zhang, Hongjian Liu, Zhanjia Bi, Yongsheng Zhao, Jianxin Pang
  • Publication number: 20200198149
    Abstract: The present disclosure provides a robot visual image feature extraction method, as well as an apparatus and a robot using the same. The method includes: collecting image data through visual sensor(s) of the robot, and collecting angular velocity data through inertial sensor(s) of the robot; calculating a relative pose between image frames in the image data based on the angular velocity data; extracting feature points of the first image frame in the image data; calculating a projection position of each feature point of the k-th image frame in the (k+1)-th image frame based on a relative pose between the k-th image frame and the (k+1)-th image frame; and searching for each feature point at its projection position in the (k+1)-th image frame, and performing simultaneous localization and mapping based on the found feature points. In this manner, the feature points of dynamic objects are eliminated.
    Type: Application
    Filed: November 29, 2019
    Publication date: June 25, 2020
    Inventors: Chenchen Jiang, Youjun Xiong, Longbiao Bai, Simin Zhang, Jianxin Pang
  • Publication number: 20200116498
    Abstract: The present disclosure provides a visual assisted distance-based SLAM method for a mobile robot, and a mobile robot using the same. The method includes: obtaining distance data frames and visual data frames, where each of the visual data frames corresponds to one of the distance data frames; performing a loop closure detection based on a current visual data frame in the visual data frames to find a matched visual data frame; calculating a relative pose between the current visual data frame and the matched visual data frame; and performing a loop closure optimization on pose data of one or more frames between the current visual data frame and the matched visual data frame based on the relative pose. In the above-mentioned manner, the present disclosure can improve the accuracy of mapping and/or realize fast relocalization.
    Type: Application
    Filed: December 21, 2018
    Publication date: April 16, 2020
    Inventors: Youjun Xiong, Chenchen Jiang, Longbiao Bai, Zhanjia Bi, Zhichao Liu
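
The visual assisted distance-based SLAM method above (20200116498) corrects the pose data of the frames lying between the matched frame and the current frame using the relative pose obtained from the loop closure. A real system would run a pose-graph optimizer; the sketch below only illustrates the idea for planar (x, y, yaw) poses by spreading the loop-closure error linearly over the intermediate frames, which is a deliberate simplification rather than the disclosed optimization.

        import numpy as np

        def spread_loop_correction(poses, i, j, rel_pose_ij):
            # poses: (N, 3) array of (x, y, yaw); frames i and j were matched by
            # the loop closure, and rel_pose_ij is the measured pose of frame j
            # relative to frame i. Small-angle composition by addition is assumed.
            error = (poses[i] + rel_pose_ij) - poses[j]
            for k in range(i + 1, j + 1):
                poses[k] = poses[k] + error * (k - i) / (j - i)
            return poses
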
  • Publication number: 20200005487
    Abstract: The present disclosure provides a positioning method and a robot using the same. The method includes: obtaining, through the visual sensor, a current frame image; obtaining, through the ultra-wideband tag, a distance of the robot from an ultra-wideband anchor; performing a feature matching on the current frame image and an adjacent frame image to generate partial map point(s); determining whether the current frame image is a key frame image; and optimizing a pose of the visual sensor corresponding to the key frame image through a joint objective function in response to the current frame image being the key frame image, where the joint objective function at least comprises a distance cost function of the ultra-wideband anchor and a visual residual cost function. Through the above-mentioned method, the positioning accuracy of the robot can be improved.
    Type: Application
    Filed: May 20, 2019
    Publication date: January 2, 2020
    Inventors: Chenchen Jiang, Youjun Xiong, Zhichao Liu
  • Patent number: 10254131
    Abstract: A detection device, a substrate holder and a method for detecting a position of a substrate on a substrate holder are disclosed. The detection device of the present disclosure is used to detect the position of the substrate carried on the substrate holder, and the substrate holder includes a plurality of carrying positions, each of which is used to carry a substrate. The detection device includes an emitting electrode connected to a signal source, which is disposed at an edge of each carrying position and located at one of the upper and lower sides of the substrate carried by the carrying position; and at least one receiving electrode connected to a detector, which is disposed opposite to the emitting electrode and located at the other of the upper and lower sides of the substrate carried by the carrying position.
    Type: Grant
    Filed: September 22, 2016
    Date of Patent: April 9, 2019
    Assignees: BOE TECHNOLOGY GROUP CO., LTD., HEFEI BOE OPTOELECTRONICS TECHNOLOGY CO., LTD.
    Inventors: Guodong Li, Zhen Wei, Shibo Guo, Weiwei Sun, Chenchen Jiang, Qiong Zhao
  • Publication number: 20170299407
    Abstract: A detection device, a substrate holder and a method for detecting a position of a substrate on a substrate holder are disclosed. The detection device of the present disclosure is used to detect the position of the substrate carried on the substrate holder, and the substrate holder includes a plurality of carrying positions, each of which is used to carry a substrate. The detection device includes an emitting electrode connected to a signal source, which is disposed at an edge of each carrying position and located at one of the upper and lower sides of the substrate carried by the carrying position; and at least one receiving electrode connected to a detector, which is disposed opposite to the emitting electrode and located at the other of the upper and lower sides of the substrate carried by the carrying position.
    Type: Application
    Filed: September 22, 2016
    Publication date: October 19, 2017
    Applicants: BOE Technology Group Co., Ltd., Hefei Boe Optoelectronics Technology Co., Ltd.
    Inventors: Guodong Li, Zhen Wei, Shibo Guo, Weiwei Sun, Chenchen Jiang, Qiong Zhao