Patents by Inventor Shuping HU

Shuping HU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240135579
    Abstract: A method for obtaining a feature extraction model, a method for human fall detection and a terminal device are provided. The method for human fall detection includes: inputting a human body image into a feature extraction model for feature extraction to obtain a target image feature; in response to a distance between the target image feature and a pre-stored mean value of standing category image features being greater than or equal to a preset distance threshold, determining that the human body image is a human falling image; and in response to the distance being less than the preset distance threshold, determining that the human body image is a human standing image. The feature extraction model is obtained based on constraint training to aggregate standing category image features and separate falling category image features from the standing category image features.
    Type: Application
    Filed: October 13, 2023
    Publication date: April 25, 2024
    Inventors: Kan Wang, Shuping Hu, Jianxin Pang, Huan Tan
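
A minimal sketch of the decision rule described in publication 20240135579, assuming the extracted feature and the pre-stored standing-category mean are plain vectors and the distance is Euclidean (the abstract does not fix a specific metric); all names are illustrative, not from the filing.

```python
import numpy as np

def classify_fall(target_feature: np.ndarray,
                  standing_mean: np.ndarray,
                  distance_threshold: float) -> str:
    """Label an image feature as 'falling' or 'standing' by its distance
    to the pre-stored mean of standing-category image features."""
    distance = np.linalg.norm(target_feature - standing_mean)
    # Distance >= threshold: the feature lies far from the standing cluster,
    # so the image is treated as a human falling image.
    return "falling" if distance >= distance_threshold else "standing"
```
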
  • Patent number: 11940774
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model, and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
    Type: Grant
    Filed: December 8, 2020
    Date of Patent: March 26, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
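
A hedged sketch of the pipeline claimed in patent 11940774: 2D keypoints detected in an image are lifted to 3D by a pre-trained network, and the result, together with the keypoint pairing, is turned into a control instruction. The lifting model's interface, the keypoint layout, and the instruction format are assumptions for illustration only.

```python
import numpy as np

def imitate_action(keypoints_2d: np.ndarray, pairs: list, lifting_model) -> dict:
    """keypoints_2d: (N, 2) pixel coordinates; pairs: list of (i, j) keypoint index pairs.
    lifting_model is assumed to map flattened 2D keypoints to N x 3 coordinates."""
    keypoints_3d = np.asarray(lifting_model(keypoints_2d.reshape(1, -1))).reshape(-1, 3)
    # One command entry per paired link: the unit direction of the link in 3D space.
    instruction = {}
    for i, (a, b) in enumerate(pairs):
        link = keypoints_3d[b] - keypoints_3d[a]
        instruction[f"link_{i}"] = link / (np.linalg.norm(link) + 1e-8)
    return instruction
```
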
  • Patent number: 11850747
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing the images collected by an ordinary camera without the help of a high-precision depth camera.
    Type: Grant
    Filed: December 4, 2020
    Date of Patent: December 26, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
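
A minimal sketch of the angle computation implied by patent 11850747: given keypoint positions bounding two adjacent linkages, the rotation angle at the shared joint can be recovered from the angle between the two link vectors. The keypoint layout is an assumption; the patented method's exact formulation is not reproduced here.

```python
import numpy as np

def linkage_angle(joint: np.ndarray, end_a: np.ndarray, end_b: np.ndarray) -> float:
    """Rotation angle (radians) at `joint` between links joint->end_a and joint->end_b."""
    u = end_a - joint
    v = end_b - joint
    cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```
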
  • Publication number: 20230386244
    Abstract: A person re-identification method, a storage medium, and a terminal device are provided. In the method, a preset ratio-based triplet loss function is used as a loss function during training. The ratio-based triplet loss function limits a ratio of a positive sample feature distance to a negative sample feature distance to be less than a preset ratio threshold. The positive sample feature distance is a distance between a reference image feature and a positive sample image feature, and the negative sample feature distance is a distance between the reference image feature and a negative sample image feature. Compared with the existing absolute distance-based triplet loss function, in the case of small inter-class differences and large intra-class differences, the ratio-based triplet loss function can effectively improve the stability of model training, and the features extracted by the trained model are more discriminative and robust, thereby improving the accuracy of person re-identification results.
    Type: Application
    Filed: December 8, 2022
    Publication date: November 30, 2023
    Inventors: Shuping Hu, Kan Wang, Huan Tan, Jianxin Pang
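
A possible form of the ratio-based triplet loss described in publication 20230386244, penalizing triplets whose positive-to-negative distance ratio exceeds the preset ratio threshold; the exact formulation (hinge, reduction over a batch) is not given in the abstract and is assumed here.

```python
import numpy as np

def ratio_triplet_loss(anchor, positive, negative, ratio_threshold=0.5, eps=1e-8):
    """Penalize d(anchor, positive) / d(anchor, negative) above ratio_threshold."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    # Hinge on the distance ratio rather than on an absolute distance margin.
    return max(0.0, d_pos / (d_neg + eps) - ratio_threshold)
```
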
  • Patent number: 11644841
    Abstract: A robot climbing control method is disclosed. A gravity direction vector in a gravity direction in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: May 9, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
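
A hedged sketch of the geometry suggested by patent 11644841: with the gravity direction and the stair-edge direction both expressed in the camera frame, one position parameter the robot can use is the yaw offset between its forward axis and the horizontal normal to the edge. The choice of forward axis and this particular parameter are assumptions for illustration.

```python
import numpy as np

def yaw_to_stairs(gravity_cam, edge_cam, forward_cam=(0.0, 0.0, 1.0)) -> float:
    """Yaw offset (radians) between the camera's forward axis and the horizontal
    direction perpendicular to the stair edge (0 = facing the stairs squarely)."""
    g = np.asarray(gravity_cam, dtype=float)
    g /= np.linalg.norm(g)
    e = np.asarray(edge_cam, dtype=float)
    e /= np.linalg.norm(e)
    # Horizontal normal to the stair edge: perpendicular to both gravity and the edge.
    n = np.cross(g, e)
    n /= np.linalg.norm(n) + 1e-8
    # Project the forward axis into the horizontal plane before measuring the angle.
    f = np.asarray(forward_cam, dtype=float)
    f = f - np.dot(f, g) * g
    f /= np.linalg.norm(f) + 1e-8
    return float(np.arccos(np.clip(abs(np.dot(f, n)), 0.0, 1.0)))
```
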
  • Patent number: 11631192
    Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on stairs of any size, including non-standardized stairs, and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the robot's stair climbing.
    Type: Grant
    Filed: November 30, 2020
    Date of Patent: April 18, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
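
A minimal sketch of the measurement step described in patent 11631192, assuming the target-step outline has already been extracted as a pixel mask: depth values sampled on the outline give the forward distance to the step, and a pinhole back-projection of the outline's mean column gives a lateral offset. The camera intrinsics and the specific position parameters are assumptions.

```python
import numpy as np

def relative_position(depth_image: np.ndarray, outline_mask: np.ndarray,
                      fx: float, cx: float):
    """Return (forward distance, lateral offset) of the target step in meters.
    depth_image: per-pixel depth; outline_mask: boolean mask of the step outline."""
    ys, xs = np.nonzero(outline_mask)
    depths = depth_image[ys, xs]
    valid = depths > 0
    distance = float(np.median(depths[valid]))            # robust forward distance
    # Back-project the mean outline column through a pinhole model for lateral offset.
    lateral = float((np.mean(xs[valid]) - cx) * distance / fx)
    return distance, lateral
```
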
  • Patent number: 11560192
    Abstract: The present disclosure provides a stair climbing gait planning method as well as an apparatus and a robot using the same. The method includes: obtaining first visual measurement data through a visual sensor of the robot; converting the first visual measurement data to second visual measurement data; and performing a staged gait planning on the process of the robot climbing the staircase based on the second visual measurement data. Through the method, the visual measurement data is used as a reference to perform the staged gait planning on the process of the robot climbing the staircase, which greatly improves the adaptability of the robot in the complex scene of stair climbing.
    Type: Grant
    Filed: May 27, 2020
    Date of Patent: January 24, 2023
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Jie Bai, Ligang Ge, Hongge Wang, Yizhang Liu, Shuping Hu, Jianxin Pang, Youjun Xiong
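
A hedged sketch of the two ideas in patent 11560192: converting a visual measurement into the robot's planning frame, and switching gait stages based on that converted measurement. The frame transform, stage names, and distance thresholds are illustrative assumptions, not values from the patent.

```python
import numpy as np

def to_planning_frame(point_cam: np.ndarray, rotation: np.ndarray,
                      translation: np.ndarray) -> np.ndarray:
    """Convert a visual measurement from the camera frame to the robot's planning frame."""
    return rotation @ point_cam + translation

def gait_stage(distance_to_stair: float) -> str:
    """Pick a gait stage from the (converted) distance to the first step."""
    if distance_to_stair > 0.6:
        return "approach"   # normal walking toward the staircase
    if distance_to_stair > 0.1:
        return "align"      # adjust foot placement before the first step
    return "climb"          # stair-climbing gait
```
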
  • Publication number: 20220375106
    Abstract: A method includes: performing target detection on a current image to obtain detection information of a plurality of detected targets; obtaining position prediction information of each of a plurality of tracked targets and a number of times of tracking losses of targets from tracking information of each of the tracked targets, and determining a first matching threshold for each of the tracked targets according to the number of times of tracking losses of targets; calculating a motion matching degree between each of the tracked targets and each of the detected targets according to the position detection information and the position prediction information; for each of the tracked targets, obtaining a motion matching result according to the motion matching degree and the first matching threshold corresponding to the tracked target; and matching the detected targets and the tracked targets according to the motion matching results to obtain a tracking result.
    Type: Application
    Filed: July 18, 2022
    Publication date: November 24, 2022
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Zaiwang Gu, Jianxin Pang
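
A minimal sketch of the matching logic outlined in publication 20220375106: a motion matching degree between predicted and detected boxes is computed, the per-track threshold is relaxed with the number of consecutive tracking losses, and tracks are assigned to detections. IoU as the matching degree, the linear relaxation rule, and SciPy's assignment solver are my choices here, not details from the filing.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def match(tracks, detections, base_threshold=0.5, relax=0.05):
    """tracks: dicts with 'predicted_box' and 'lost_count'; detections: list of boxes."""
    if not tracks or not detections:
        return []
    cost = np.array([[1.0 - iou(t["predicted_box"], d) for d in detections] for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    matches = []
    for r, c in zip(rows, cols):
        # Tracks lost for more frames get a looser (lower) matching threshold.
        threshold = max(0.1, base_threshold - relax * tracks[r]["lost_count"])
        if 1.0 - cost[r, c] >= threshold:
            matches.append((r, c))
    return matches
```
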
  • Patent number: 11423701
    Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
    Type: Grant
    Filed: December 10, 2020
    Date of Patent: August 23, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
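
A minimal sketch of the encoding and sliding-window step in patent 11423701: per-frame static gestures are encoded as symbols and a fixed-length window is slid over the sequence to decide the dynamic gesture. The symbol set, window size, and decision rule below are illustrative assumptions.

```python
from collections import Counter

def dynamic_gesture(static_sequence: list, window: int = 10) -> list:
    """static_sequence: per-frame static gesture labels, e.g. ['open', 'open', 'fist', ...].
    Returns one dynamic gesture decision per window position."""
    decisions = []
    for i in range(len(static_sequence) - window + 1):
        chunk = static_sequence[i:i + window]
        # Example rule: an 'open' -> 'fist' transition inside the window means 'grab';
        # otherwise report the dominant static gesture for that window.
        if ("open" in chunk and "fist" in chunk
                and chunk.index("open") < chunk.index("fist")):
            decisions.append("grab")
        else:
            decisions.append(Counter(chunk).most_common(1)[0][0])
    return decisions
```
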
  • Publication number: 20210334524
    Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
    Type: Application
    Filed: December 10, 2020
    Publication date: October 28, 2021
    Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210331753
    Abstract: The present disclosure provides a stair climbing gait planning method as well as an apparatus and a robot using the same. The method includes: obtaining first visual measurement data through a visual sensor of the robot; converting the first visual measurement data to second visual measurement data; and performing a staged gait planning on the process of the robot climbing the staircase based on the second visual measurement data. Through the method, the visual measurement data is used as a reference to perform the staged gait planning on the process of the robot climbing the staircase, which greatly improves the adaptability of the robot in the complex scene of stair climbing.
    Type: Application
    Filed: May 27, 2020
    Publication date: October 28, 2021
    Inventors: Jie Bai, Ligang Ge, Hongge Wang, Yizhang Liu, Shuping Hu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210200190
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model, and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
    Type: Application
    Filed: December 8, 2020
    Publication date: July 1, 2021
    Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
  • Publication number: 20210181747
    Abstract: A robot climbing control method is disclosed. A gravity direction vector in a gravity direction in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
    Type: Application
    Filed: December 7, 2020
    Publication date: June 17, 2021
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
  • Publication number: 20210170580
    Abstract: The present disclosure provides an action imitation method as well as a robot and a computer readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing the images collected by an ordinary camera without the help of a high-precision depth camera.
    Type: Application
    Filed: December 4, 2020
    Publication date: June 10, 2021
    Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
  • Publication number: 20210166416
    Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on stairs of any size, including non-standardized stairs, and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the robot's stair climbing.
    Type: Application
    Filed: November 30, 2020
    Publication date: June 3, 2021
    Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
  • Patent number: 9760022
    Abstract: A fine-motion module for use in a wafer stage of a photolithography tool includes: a base (201); a fine-motion plate (210); a plurality of vertical motors (203), fixed between the base and the fine-motion plate; a plurality of gravity compensators (202), each having one end fixed on the base and the other end configured to support the fine-motion plate; a plurality of absolute-position sensors (205, 211), configured to measure an absolute position of the fine-motion plate and to adjust pressures in the gravity compensators based on the obtained absolute-position measurements such that the absolute position of the fine-motion plate is changed to a predetermined initial vertical position; and a plurality of relative-position sensors (204, 207), configured to measure a relative position of the fine-motion plate to the base and to control the fine-motion plate based on the obtained relative-position measurements, thereby moving the fine-motion plate to a relative zero position.
    Type: Grant
    Filed: February 12, 2015
    Date of Patent: September 12, 2017
    Assignee: SHANGHAI MICRO ELECTRONICS EQUIPMENT (GROUP) CO., LTD.
    Inventors: Feihong Liao, Yuebin Zhu, Haili Jia, Shuping Hu
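
A hedged sketch of the two-stage positioning sequence described in patent 9760022: absolute-position sensors drive the gravity-compensator pressures until the fine-motion plate reaches its predetermined initial vertical position, after which relative-position sensors servo the plate to the relative zero position. The sensor and actuator interfaces below are placeholders, not the patented hardware's API.

```python
def position_fine_motion_plate(abs_sensors, rel_sensors, compensators, motors,
                               target_z: float, tolerance: float = 1e-6):
    """Stage 1: adjust compensator pressures from absolute-position readings.
    Stage 2: drive the vertical motors from relative-position readings to zero."""
    # Stage 1: bring the plate to the predetermined initial vertical position.
    while abs(abs_sensors.read_z() - target_z) > tolerance:
        error = target_z - abs_sensors.read_z()
        compensators.adjust_pressure(error)   # placeholder actuator call
    # Stage 2: servo the plate to the relative zero position with respect to the base.
    while abs(rel_sensors.read_z()) > tolerance:
        motors.drive(-rel_sensors.read_z())   # placeholder motor command
```
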
  • Publication number: 20170017168
    Abstract: A fine-motion module for use in a wafer stage of a photolithography tool includes: a base (201); a fine-motion plate (210); a plurality of vertical motors (203), fixed between the base and the fine-motion plate; a plurality of gravity compensators (202), each having one end fixed on the base and the other end configured to support the fine-motion plate; a plurality of absolute-position sensors (205, 211), configured to measure an absolute position of the fine-motion plate and to adjust pressures in the gravity compensators based on the obtained absolute-position measurements such that the absolute position of the fine-motion plate is changed to a predetermined initial vertical position; and a plurality of relative-position sensors (204, 207), configured to measure a relative position of the fine-motion plate to the base and to control the fine-motion plate based on the obtained relative-position measurements, thereby moving the fine-motion plate to a relative zero position.
    Type: Application
    Filed: February 12, 2015
    Publication date: January 19, 2017
    Applicant: Shanghai Micro Electronics Equipment Co., Ltd.
    Inventors: Feihong Liao, Yuebin Zhu, Haili Jia, Shuping Hu