Patents by Inventor Miaochen Guo

Miaochen Guo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11940774
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model; and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
    Type: Grant
    Filed: December 8, 2020
    Date of Patent: March 26, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
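The pipeline this abstract describes (2D key points, a learned 2D-to-3D lifting step, then a control instruction built from the paired key points) can be sketched as follows. This is an illustrative sketch only: the lifting network is replaced by a stub, and all names and values are hypothetical, not from the patent.

```python
# Sketch of the 2D-to-3D key-point lifting pipeline. The pre-trained
# lifting model is replaced by a stub that appends a constant depth.

def lift_to_3d(keypoints_2d, depth_guess=1.0):
    """Stand-in for the pre-trained first neural network model."""
    return [(x, y, depth_guess) for (x, y) in keypoints_2d]

def limb_vectors(keypoints_3d, pairs):
    """Turn paired 3D key points into limb direction vectors, the raw
    material for an action control instruction."""
    vecs = []
    for i, j in pairs:
        ax, ay, az = keypoints_3d[i]
        bx, by, bz = keypoints_3d[j]
        vecs.append((bx - ax, by - ay, bz - az))
    return vecs

# Shoulder -> elbow -> wrist of the to-be-imitated object, in pixels.
kp2d = [(100.0, 50.0), (120.0, 90.0), (150.0, 120.0)]
pairs = [(0, 1), (1, 2)]          # the pairing relationship (skeleton edges)
kp3d = lift_to_3d(kp2d)
command = limb_vectors(kp3d, pairs)
print(command)                    # vectors that would drive the robot
```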
  • Patent number: 11850747
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing the images collected by an ordinary camera, without the help of a high-precision depth camera.
    Type: Grant
    Filed: December 4, 2020
    Date of Patent: December 26, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
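The per-linkage angle computation in the abstract can be illustrated for a single linkage: given the 2D key points at its two ends, its rotational angle follows from the arctangent of the segment. A minimal sketch, with made-up coordinates:

```python
import math

def linkage_angle(p_parent, p_child):
    """Rotational angle (degrees) of one linkage, from the key points at
    its two ends -- a stand-in for the per-linkage step in the abstract."""
    dx = p_child[0] - p_parent[0]
    dy = p_child[1] - p_parent[1]
    return math.degrees(math.atan2(dy, dx))

# Key points for one frame: shoulder and elbow of the imitated person.
shoulder, elbow = (0.0, 0.0), (1.0, 1.0)
angle = linkage_angle(shoulder, elbow)
print(round(angle, 1))   # upper-arm linkage rotated 45 degrees
```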
  • Patent number: 11644841
    Abstract: A robot climbing control method is disclosed. A gravity direction vector in a gravity direction in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: May 9, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
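One way to read the position parameter in this abstract: when the robot faces the stairs squarely, the stair edge is perpendicular to gravity, so the deviation of the gravity/edge angle from 90 degrees is one pose-adjustment signal. A sketch under that assumption, with made-up vectors (not the patent's actual computation):

```python
import math

def angle_between(u, v):
    """Angle (degrees) between two 3D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Directions in the camera coordinate system (illustrative values).
gravity = (0.0, 1.0, 0.0)   # gravity direction vector
edge = (1.0, 0.1, 0.0)      # fitted stair-edge direction vector

# Deviation from perpendicularity: one possible position parameter the
# robot could use to adjust its pose before climbing.
tilt = angle_between(gravity, edge) - 90.0
print(round(tilt, 2))
```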
  • Patent number: 11636712
    Abstract: A dynamic gesture recognition method includes: performing detection on each frame of image of a video stream using a preset static gesture detection model to obtain a static gesture in each frame of image of the video stream; in response to detection of a change of the static gesture from a preset first gesture to a second gesture, suspending the static gesture detection model and activating a preset dynamic gesture detection model; and performing detection on multiple frames of images that are pre-stored in a storage medium using the dynamic gesture detection model to obtain a dynamic gesture recognition result.
    Type: Grant
    Filed: August 31, 2021
    Date of Patent: April 25, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Chi Shao, Miaochen Guo, Jun Cheng, Jianxin Pang
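The two-stage control flow in this abstract (a static model per frame; on a specific gesture transition, suspend it and run a dynamic model over buffered frames) can be sketched as a small state machine. Both detectors are stubs and the gesture names are hypothetical:

```python
# Sketch of the static-to-dynamic hand-off described in the abstract.

def static_detect(frame):
    return frame["gesture"]          # stub static gesture detection model

def dynamic_detect(frames):
    return "wave"                    # stub dynamic gesture detection model

def recognize(stream, first="palm", second="fist", buffer_size=4):
    buffer, prev = [], None
    for frame in stream:
        buffer = (buffer + [frame])[-buffer_size:]   # pre-stored frames
        g = static_detect(frame)
        if prev == first and g == second:
            # Static model suspended; dynamic model takes the buffer.
            return dynamic_detect(buffer)
        prev = g
    return None

stream = [{"gesture": g} for g in ["palm", "palm", "fist", "fist"]]
print(recognize(stream))
```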
  • Patent number: 11631192
    Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on stairs of any size, including non-standardized stairs, and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the stair climbing of the robot.
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: April 18, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
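The "relative position from depth image plus step outline" step can be illustrated with a toy computation: average the depth over the outline pixels for forward distance, and compare the outline's mean column to the image centre for lateral deviation. Grid and outline values are made up; the patent's actual computation may differ:

```python
# Toy version of deriving relative position from a depth image and the
# extracted outline of the target step.

def relative_position(depth, outline):
    """Mean depth (forward distance) and mean column offset from the
    image centre (lateral deviation) over the outline pixels."""
    width = len(depth[0])
    ds = [depth[r][c] for r, c in outline]
    cols = [c for _, c in outline]
    distance = sum(ds) / len(ds)
    lateral = sum(cols) / len(cols) - (width - 1) / 2
    return distance, lateral

# 3x5 depth grid (metres); outline pixels of the target step's edge.
depth = [[2.0] * 5, [1.5] * 5, [1.2] * 5]
outline = [(1, 1), (1, 2), (1, 3)]
dist, lat = relative_position(depth, outline)
print(dist, lat)   # forward distance, lateral offset
```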
  • Publication number: 20220375106
    Abstract: A method includes: performing target detection on a current image to obtain detection information of a plurality of detected targets; obtaining position prediction information and a number of times of tracking losses of each of a plurality of tracked targets from the tracking information of the tracked target, and determining a first matching threshold for each of the tracked targets according to its number of times of tracking losses; calculating a motion matching degree between each of the tracked targets and each of the detected targets according to the detection information and the position prediction information; for each of the tracked targets, obtaining a motion matching result according to the motion matching degree and the first matching threshold corresponding to the tracked target; and matching the detected targets and the tracked targets according to the motion matching results to obtain a tracking result.
    Type: Application
    Filed: July 18, 2022
    Publication date: November 24, 2022
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Zaiwang Gu, Jianxin Pang
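The adaptive-threshold matching step in this abstract can be sketched as: each tracked target's first matching threshold loosens with its count of tracking losses, and a detection matches when the motion matching degree clears that threshold. The constants and the toy distance-based matching degree are illustrative, not from the application:

```python
# Sketch of per-track adaptive thresholds for motion matching.

def first_threshold(losses, base=0.5, relax=0.05):
    """Lost for longer -> lower threshold -> easier to re-match."""
    return max(base - relax * losses, 0.1)

def motion_match(pred, det):
    """Motion matching degree: inverse of centre distance (toy metric)."""
    d = ((pred[0] - det[0]) ** 2 + (pred[1] - det[1]) ** 2) ** 0.5
    return 1.0 / (1.0 + d)

def match(tracks, detections):
    """tracks: id -> (predicted position, number of tracking losses)."""
    results = {}
    for tid, (pred, losses) in tracks.items():
        thr = first_threshold(losses)
        score, best = max((motion_match(pred, d), i)
                          for i, d in enumerate(detections))
        results[tid] = best if score >= thr else None
    return results

tracks = {"t1": ((10.0, 10.0), 0), "t2": ((50.0, 50.0), 3)}
detections = [(11.0, 10.0), (50.5, 50.0)]
print(match(tracks, detections))
```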
  • Patent number: 11423701
    Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
    Type: Grant
    Filed: December 10, 2020
    Date of Patent: August 23, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
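The encode-then-slide scheme in this abstract can be sketched directly: each frame's static gesture is encoded to a symbol, and a fixed-size window slides over the symbol sequence looking for patterns that name a dynamic gesture. The codes and patterns below are made up for illustration:

```python
# Sketch of sliding-window detection over an encoded gesture sequence.

CODES = {"palm": "P", "fist": "F"}
PATTERNS = {"PPFF": "grab"}      # window contents -> dynamic category

def encode(static_gestures):
    """Encode per-frame static gesture information into a sequence."""
    return "".join(CODES[g] for g in static_gestures)

def slide_detect(sequence, window=4):
    """Slide a preset window over the encoded information sequence."""
    for i in range(len(sequence) - window + 1):
        cat = PATTERNS.get(sequence[i:i + window])
        if cat:
            return cat
    return None

seq = encode(["palm", "palm", "palm", "fist", "fist"])
print(slide_detect(seq))
```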
  • Publication number: 20220189208
    Abstract: A gesture recognition method includes: acquiring a target image containing a gesture to be recognized; inputting the target image to a gesture recognition model that has a first sub-model, a second sub-model, and a third sub-model, where the first sub-model is to determine a gesture category and a gesture center point, the second sub-model is to determine an offset of the gesture center point, and the third sub-model is to determine a length and a width of a bounding box for the gesture to be recognized; acquiring an output result from the gesture recognition model, where the output result includes the gesture category, the gesture center point, the offset of the gesture center point, and the length and the width of the bounding box; and determining the gesture category and a position of the bounding box of the gesture to be recognized according to the output result.
    Type: Application
    Filed: December 31, 2021
    Publication date: June 16, 2022
    Inventors: Chenghao Qian, Miaochen Guo, Jun Cheng, Jianxin Pang
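The decoding step implied by the three sub-model outputs (category plus centre cell, centre offset, box width/height) can be sketched in the style of centre-point detectors. The stride and all values here are illustrative assumptions, not the application's parameters:

```python
# Sketch of turning centre-point outputs into a bounding box.

def decode(cell, offset, size, stride=4):
    """Recover the box from a centre grid cell, its sub-cell offset,
    and the predicted width/height (all in pixels after scaling)."""
    cx = (cell[0] + offset[0]) * stride
    cy = (cell[1] + offset[1]) * stride
    w, h = size
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

# Hypothetical outputs: centre in grid cell (10, 6), offset (0.5, 0.25),
# box 32x48 pixels, for an assumed gesture category "ok".
box = decode((10, 6), (0.5, 0.25), (32.0, 48.0))
print("ok", box)
```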
  • Patent number: 11331806
    Abstract: The present disclosure discloses a robot control method as well as an apparatus, and a robot using the same. The method includes: obtaining a human pose image; obtaining pixel information of key points in the human pose image; obtaining three-dimensional positional information of key points of a human arm according to the pixel information of the preset key points; obtaining a robotic arm kinematics model of a robot; obtaining an angle of each joint in the robotic arm kinematics model according to the three-dimensional positional information of the key points of the human arm and the robotic arm kinematics model; and controlling an arm of the robot to perform a corresponding action according to the angle of each joint. The control method does not require a three-dimensional stereo camera to collect three-dimensional coordinates of a human body, which reduces the cost to a certain extent.
    Type: Grant
    Filed: April 9, 2020
    Date of Patent: May 17, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Zecai Lin, Miaochen Guo, Yizhang Liu, Youjun Xiong, Jianxin Pang
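One step of the arm-mapping idea above can be illustrated for a single joint: recover the elbow angle from the 3D positions of the shoulder, elbow, and wrist key points. A full kinematics model would map every joint; this single-joint sketch uses made-up coordinates:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at key point b, formed by segments b->a and b->c."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Three-dimensional positions of human-arm key points (metres, made up).
shoulder, elbow, wrist = (0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.3, 0.25, 0.0)
print(round(joint_angle(shoulder, elbow, wrist), 1))   # elbow bent square
```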
  • Publication number: 20220067354
    Abstract: A dynamic gesture recognition method includes: performing detection on each frame of image of a video stream using a preset static gesture detection model to obtain a static gesture in each frame of image of the video stream; in response to detection of a change of the static gesture from a preset first gesture to a second gesture, suspending the static gesture detection model and activating a preset dynamic gesture detection model; and performing detection on multiple frames of images that are pre-stored in a storage medium using the dynamic gesture detection model to obtain a dynamic gesture recognition result.
    Type: Application
    Filed: August 31, 2021
    Publication date: March 3, 2022
    Inventors: Chi Shao, Miaochen Guo, Jun Cheng, Jianxin Pang
  • Publication number: 20210334524
    Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
    Type: Application
    Filed: December 10, 2020
    Publication date: October 28, 2021
    Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210197384
    Abstract: The present disclosure discloses a robot control method as well as an apparatus, and a robot using the same. The method includes: obtaining a human pose image; obtaining pixel information of key points in the human pose image; obtaining three-dimensional positional information of key points of a human arm according to the pixel information of the preset key points; obtaining a robotic arm kinematics model of a robot; obtaining an angle of each joint in the robotic arm kinematics model according to the three-dimensional positional information of the key points of the human arm and the robotic arm kinematics model; and controlling an arm of the robot to perform a corresponding action according to the angle of each joint. The control method does not require a three-dimensional stereo camera to collect three-dimensional coordinates of a human body, which reduces the cost to a certain extent.
    Type: Application
    Filed: April 9, 2020
    Publication date: July 1, 2021
    Inventors: Zecai Lin, Miaochen Guo, Yizhang Liu, Youjun Xiong, Jianxin Pang
  • Publication number: 20210200190
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model; and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
    Type: Application
    Filed: December 8, 2020
    Publication date: July 1, 2021
    Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210181747
    Abstract: A robot climbing control method is disclosed. A gravity direction vector in a gravity direction in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
    Type: Application
    Filed: December 7, 2020
    Publication date: June 17, 2021
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
  • Publication number: 20210170580
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing the images collected by an ordinary camera, without the help of a high-precision depth camera.
    Type: Application
    Filed: December 4, 2020
    Publication date: June 10, 2021
    Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
  • Publication number: 20210166416
    Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on stairs of any size, including non-standardized stairs, and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the stair climbing of the robot.
    Type: Application
    Filed: November 30, 2020
    Publication date: June 3, 2021
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong