Patents by Inventor Shuping HU
Shuping HU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240290096
Abstract: A method for extracting video frame features includes: obtaining initial features of each video frame in a video sequence; calculating global channel attention information of the video sequence based on the initial features of each video frame in the video sequence; calculating local channel attention information of a target video frame according to the initial features of the target video frame, wherein the target video frame is one of the video frames in the video sequence; and performing channel attention mechanism processing on the initial features of the target video frame according to the global channel attention information and the local channel attention information to obtain optimized features of the target video frame.
Type: Application
Filed: January 25, 2024
Publication date: August 29, 2024
Inventors: Kan Wang, Shuping Hu, Jianxin Pang, Huan Tan
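For readers unfamiliar with channel attention, the sketch below illustrates the general idea of combining a sequence-level (global) gate with a frame-level (local) gate, roughly as the abstract describes. It is a minimal numpy illustration, not the patented method: the function name, the sigmoid gates, and the averaging fusion are assumptions made for illustration, whereas the actual attention mappings would presumably be learned.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def refine_frame_features(frame_feats, target_idx):
    """Re-weight one frame's features with global + local channel attention.

    frame_feats: (T, C) array, one initial C-dim feature vector per video frame.
    target_idx: index of the target frame within the sequence.
    Illustrative sketch only: both gates here are plain sigmoid functions.
    """
    # Global channel attention: derived from the whole sequence.
    global_attn = sigmoid(frame_feats.mean(axis=0))    # (C,)
    # Local channel attention: derived from the target frame only.
    local_attn = sigmoid(frame_feats[target_idx])      # (C,)
    # Fuse the two gates and apply them channel-wise to the target frame.
    fused_attn = 0.5 * (global_attn + local_attn)      # (C,)
    return frame_feats[target_idx] * fused_attn        # (C,)

# Toy usage: 8 frames with 16-dimensional initial features.
feats = np.random.randn(8, 16)
optimized = refine_frame_features(feats, target_idx=3)
print(optimized.shape)  # (16,)
```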
-
Publication number: 20240135579
Abstract: A method for obtaining a feature extraction model, a method for human fall detection, and a terminal device are provided. The method for human fall detection includes: inputting a human body image into a feature extraction model for feature extraction to obtain a target image feature; in response to a distance between the target image feature and a pre-stored mean value of standing-category image features being greater than or equal to a preset distance threshold, determining that the human body image is a human falling image; and in response to the distance being less than the preset distance threshold, determining that the human body image is a human standing image. The feature extraction model is obtained through constraint training that aggregates standing-category image features and separates falling-category image features from the standing-category image features.
Type: Application
Filed: October 13, 2023
Publication date: April 25, 2024
Inventors: Kan Wang, Shuping Hu, Jianxin Pang, Huan Tan
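The fall/standing decision described here reduces to a distance threshold against a stored mean of standing-category features. Below is a minimal sketch of that rule, assuming Euclidean distance; the function name is a hypothetical helper, and in the real system the feature would first come from the trained feature extraction model.

```python
import numpy as np

def classify_fall(image_feature, standing_mean, threshold):
    """Label an image feature as 'fall' or 'standing' by its distance to the
    pre-stored mean of standing-category features.

    image_feature, standing_mean: 1-D feature vectors (here assumed to come
    from the feature extraction model); threshold: preset distance threshold.
    """
    distance = np.linalg.norm(image_feature - standing_mean)
    return "fall" if distance >= threshold else "standing"

# Toy usage with random features.
feat = np.random.randn(128)
mean_standing = np.zeros(128)
print(classify_fall(feat, mean_standing, threshold=10.0))
```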
-
Patent number: 11940774
Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model; and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
Type: Grant
Filed: December 8, 2020
Date of Patent: March 26, 2024
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
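The core data flow is: 2D keypoints in, 3D keypoints out via a trained lifting network, then a control instruction. The sketch below only shows the lifting step, standing in for the pre-trained first neural network model with a single linear map; the weights `W`, `b` and the function name are hypothetical placeholders for illustration.

```python
import numpy as np

def lift_2d_to_3d(keypoints_2d, weight, bias):
    """Map flattened 2D keypoint coordinates to 3D space coordinates.

    A stand-in for the pre-trained lifting network described in the abstract:
    here it is a single linear layer with given (hypothetical) parameters.
    keypoints_2d: (K, 2) array of K key points; returns a (K, 3) array.
    """
    k = keypoints_2d.shape[0]
    flat = keypoints_2d.reshape(-1)     # (2K,)
    out = weight @ flat + bias          # (3K,)
    return out.reshape(k, 3)

# Toy usage: 17 key points, random "network" parameters.
K = 17
kp2d = np.random.rand(K, 2)
W = np.random.randn(3 * K, 2 * K) * 0.01
b = np.zeros(3 * K)
kp3d = lift_2d_to_3d(kp2d, W, b)
print(kp3d.shape)  # (17, 3)
```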
-
Patent number: 11850747
Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing images collected by an ordinary camera, without the help of a high-precision depth camera.
Type: Grant
Filed: December 4, 2020
Date of Patent: December 26, 2023
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
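Once keypoints are available, a linkage's rotational angle can be derived from the geometry of adjacent keypoints. The sketch below shows one plausible angle computation; the abstract does not specify the exact procedure, so the function and the shoulder/elbow/wrist example are illustrative assumptions.

```python
import numpy as np

def linkage_angle(joint_a, joint_b, joint_c):
    """Angle (radians) at joint_b formed by linkages a-b and b-c.

    Each joint is a 2-D keypoint coordinate from a keypoint detector; this is
    one plausible way to derive a "rotational angle of a linkage".
    """
    v1 = np.asarray(joint_a, float) - np.asarray(joint_b, float)
    v2 = np.asarray(joint_c, float) - np.asarray(joint_b, float)
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Toy usage: shoulder, elbow, wrist keypoints in image coordinates.
print(np.degrees(linkage_angle([0, 0], [1, 0], [1, 1])))  # ~90 degrees
```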
-
Publication number: 20230386244
Abstract: A person re-identification method, a storage medium, and a terminal device are provided. In the method, a preset ratio-based triplet loss function is used as the loss function during training. The ratio-based triplet loss function limits the ratio of a positive sample feature distance to a negative sample feature distance to be less than a preset ratio threshold. The positive sample feature distance is the distance between a reference image feature and a positive sample image feature, and the negative sample feature distance is the distance between the reference image feature and a negative sample image feature. Compared with the existing absolute distance-based triplet loss function, in the case of small inter-class differences and large intra-class differences, the ratio-based triplet loss function can effectively improve the stability of model training, and the features extracted by the trained model are more discriminative and robust, thereby improving the accuracy of person re-identification results.
Type: Application
Filed: December 8, 2022
Publication date: November 30, 2023
Inventors: Shuping Hu, Kan Wang, Huan Tan, Jianxin Pang
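As a rough illustration of the constraint described here, a ratio-based triplet loss can be written so that the positive/negative distance ratio is pushed below a preset threshold. Below is a single-triplet numpy sketch, assuming Euclidean distances and a hypothetical ratio threshold of 0.8; the actual loss form and threshold are not given in the abstract.

```python
import numpy as np

def ratio_triplet_loss(anchor, positive, negative, ratio_threshold=0.8):
    """Ratio-based triplet loss sketch: penalize triplets where the ratio of
    the positive-pair distance to the negative-pair distance exceeds a
    preset ratio threshold.

    Written for a single triplet of embeddings; real training code would
    operate on batches of features from the re-identification network.
    """
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative) + 1e-8
    return max(0.0, d_pos / d_neg - ratio_threshold)

# Toy usage with random 256-dimensional embeddings.
a, p, n = (np.random.randn(256) for _ in range(3))
print(ratio_triplet_loss(a, p, n))
```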
-
Patent number: 11644841
Abstract: A robot climbing control method is disclosed. A gravity direction vector in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
Type: Grant
Filed: December 7, 2020
Date of Patent: May 9, 2023
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
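Given a gravity vector and a stair-edge vector in the camera frame, one position parameter that can be derived is the robot's yaw offset relative to the edge. The sketch below shows one plausible computation, projecting both the edge and the camera's x-axis into the horizontal plane; it is an illustrative assumption, not the patented formula.

```python
import numpy as np

def stair_alignment(gravity_dir, edge_dir):
    """Estimate the yaw offset (radians) between the camera's x-axis and a
    stair edge, using the gravity direction to define the horizontal plane.

    gravity_dir, edge_dir: 3-D direction vectors in the camera frame.
    One plausible interpretation of the "position parameter"; illustrative only.
    """
    g = np.asarray(gravity_dir, float)
    g = g / np.linalg.norm(g)
    e = np.asarray(edge_dir, float)
    # Remove the vertical (gravity-aligned) component of the edge direction.
    e_h = e - np.dot(e, g) * g
    e_h = e_h / (np.linalg.norm(e_h) + 1e-8)
    # Project the camera x-axis into the same horizontal plane.
    x_cam = np.array([1.0, 0.0, 0.0])
    x_h = x_cam - np.dot(x_cam, g) * g
    x_h = x_h / (np.linalg.norm(x_h) + 1e-8)
    cos_yaw = np.clip(np.dot(e_h, x_h), -1.0, 1.0)
    return float(np.arccos(cos_yaw))

# Toy usage: gravity along -y of the camera, edge roughly along x.
print(np.degrees(stair_alignment([0, -1, 0], [0.95, 0.05, 0.3])))
```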
-
Patent number: 11631192
Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on non-standardized stairs of any size and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the robot's stair climbing.
Type: Grant
Filed: November 30, 2020
Date of Patent: April 18, 2023
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
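With the step outline from the RGB image and an aligned depth map, a relative position can be estimated from depth statistics inside the outline. A simplified sketch, assuming a pinhole camera with hypothetical intrinsics `fx`, `cx` and using the median depth as the forward distance; the patented computation may differ.

```python
import numpy as np

def step_relative_position(depth_image, outline_mask, fx, cx):
    """Rough forward distance and lateral offset of the robot from a target step.

    depth_image: (H, W) depth map in meters; outline_mask: boolean (H, W) mask
    of the target step region extracted from the RGB image; fx, cx: camera
    intrinsics. Simplified illustration, not the patented method.
    """
    depths = depth_image[outline_mask]
    distance = float(np.median(depths))          # forward distance to the step
    _, xs = np.nonzero(outline_mask)
    u_center = xs.mean()                         # horizontal centroid (pixels)
    lateral = (u_center - cx) * distance / fx    # pinhole back-projection
    return distance, float(lateral)

# Toy usage: a 2 m step occupying the image center.
H, W = 480, 640
depth = np.full((H, W), 2.0)
mask = np.zeros((H, W), dtype=bool)
mask[200:280, 250:390] = True
print(step_relative_position(depth, mask, fx=525.0, cx=320.0))
```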
-
Patent number: 11560192
Abstract: The present disclosure provides a stair climbing gait planning method and an apparatus and a robot using the same. The method includes: obtaining first visual measurement data through a visual sensor of the robot; converting the first visual measurement data to second visual measurement data; and performing a staged gait planning on the process of the robot climbing the staircase based on the second visual measurement data. Through the method, the visual measurement data is used as a reference to perform the staged gait planning on the process of the robot climbing the staircase, which greatly improves the adaptability of the robot in the complex scene of stair climbing.
Type: Grant
Filed: May 27, 2020
Date of Patent: January 24, 2023
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Jie Bai, Ligang Ge, Hongge Wang, Yizhang Liu, Shuping Hu, Jianxin Pang, Youjun Xiong
-
Publication number: 20220375106
Abstract: A method includes: performing target detection on a current image to obtain detection information of a plurality of detected targets; obtaining, from the tracking information of each of a plurality of tracked targets, position prediction information and the number of tracking losses of the tracked target, and determining a first matching threshold for each of the tracked targets according to its number of tracking losses; calculating a motion matching degree between each of the tracked targets and each of the detected targets according to the position detection information and the position prediction information; for each of the tracked targets, obtaining a motion matching result according to the motion matching degree and the first matching threshold corresponding to the tracked target; and matching the detected targets and the tracked targets according to the motion matching results to obtain a tracking result.
Type: Application
Filed: July 18, 2022
Publication date: November 24, 2022
Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Zaiwang Gu, Jianxin Pang
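The key idea is that each tracked target gets its own matching threshold derived from how many times it has been lost, and a match is accepted only when the motion matching degree clears that per-track gate. The sketch below uses IoU as a stand-in for the motion matching degree and, as one hypothetical choice, relaxes the gate as losses accumulate; the constants and the greedy assignment are illustrative assumptions, not the patented scheme.

```python
def first_matching_threshold(base_threshold, loss_count, relax_step=0.05, floor=0.1):
    """Per-track matching threshold derived from the number of tracking losses.

    The abstract only says the threshold depends on the loss count; here the
    gate is lowered the longer a target has gone unmatched, down to a floor.
    """
    return max(floor, base_threshold - relax_step * loss_count)

def motion_match(pred_boxes, det_boxes, loss_counts, base_threshold=0.3):
    """Greedy IoU-gated matching between tracked (predicted) and detected boxes.

    Boxes are (x1, y1, x2, y2). IoU stands in for the "motion matching
    degree"; the real method may use a different measure.
    """
    def iou(a, b):
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        return inter / (area(a) + area(b) - inter + 1e-8)

    matches, used = [], set()
    for t, (pred, losses) in enumerate(zip(pred_boxes, loss_counts)):
        thr = first_matching_threshold(base_threshold, losses)
        best, best_iou = None, 0.0
        for d, det in enumerate(det_boxes):
            score = iou(pred, det)
            if d not in used and score > best_iou:
                best, best_iou = d, score
        # Accept the match only if it clears this track's own threshold.
        if best is not None and best_iou >= thr:
            matches.append((t, best))
            used.add(best)
    return matches

# Toy usage: two tracks, two detections.
preds = [(10, 10, 50, 50), (100, 100, 140, 140)]
dets = [(12, 11, 52, 49), (300, 300, 340, 340)]
print(motion_match(preds, dets, loss_counts=[0, 3]))  # [(0, 0)]
```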
-
Patent number: 11423701
Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
Type: Grant
Filed: December 10, 2020
Date of Patent: August 23, 2022
Assignee: UBTECH ROBOTICS CORP LTD
Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
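Conceptually, per-frame static gesture labels are encoded into a sequence, and a fixed-size window slides over that sequence to spot transition patterns that define dynamic gestures. A minimal sketch of that flow, where the label ids, window size, and the "open palm then fist means grab" rule are purely illustrative assumptions.

```python
from collections import Counter

def encode_static_gestures(frame_labels):
    """Turn per-frame static gesture labels into an encoded sequence.

    frame_labels: list of static gesture ids, one per video frame
    (e.g. 0 = open palm, 1 = fist); a stand-in for the per-frame hand
    recognizer mentioned in the abstract.
    """
    return list(frame_labels)

def detect_dynamic_gesture(encoded_seq, window_size=5):
    """Slide a fixed window over the encoded sequence and emit a dynamic
    gesture whenever the window shows a transition pattern.

    The mapping below (open palm -> fist read as "grab") is illustrative;
    the patented rules are not specified in the abstract.
    """
    detections = []
    for start in range(len(encoded_seq) - window_size + 1):
        window = encoded_seq[start:start + window_size]
        first_half = Counter(window[: window_size // 2])
        second_half = Counter(window[window_size // 2:])
        a = first_half.most_common(1)[0][0]
        b = second_half.most_common(1)[0][0]
        if (a, b) == (0, 1):
            detections.append(("grab", start))
    return detections

# Toy usage: palm held open, then closed into a fist.
seq = encode_static_gestures([0, 0, 0, 1, 1, 1, 1])
print(detect_dynamic_gesture(seq))
```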
-
Publication number: 20210331753
Abstract: The present disclosure provides a stair climbing gait planning method and an apparatus and a robot using the same. The method includes: obtaining first visual measurement data through a visual sensor of the robot; converting the first visual measurement data to second visual measurement data; and performing a staged gait planning on the process of the robot climbing the staircase based on the second visual measurement data. Through the method, the visual measurement data is used as a reference to perform the staged gait planning on the process of the robot climbing the staircase, which greatly improves the adaptability of the robot in the complex scene of stair climbing.
Type: Application
Filed: May 27, 2020
Publication date: October 28, 2021
Inventors: Jie Bai, Ligang Ge, Hongge Wang, Yizhang Liu, Shuping Hu, Jianxin Pang, Youjun Xiong
-
Publication number: 20210334524
Abstract: The present disclosure provides a gesture recognition method as well as a terminal device and a computer-readable storage medium using the same. The method includes: obtaining a video stream collected by an image recording device in real time; performing a hand recognition on the video stream to determine static gesture information of a recognized hand in each video frame of the video stream; encoding the static gesture information in the video frames of the video stream in sequence to obtain an encoded information sequence of the recognized hands; and performing a slide detection on the encoded information sequence using a preset sliding window to determine a dynamic gesture category of each recognized hand. In this manner, static gesture recognition and dynamic gesture recognition are effectively integrated in the same process. The dynamic gesture recognition is realized through the slide detection of the sliding window without complex network calculations.
Type: Application
Filed: December 10, 2020
Publication date: October 28, 2021
Inventors: Miaochen Guo, Jingtao Zhang, Shuping Hu, Dong Wang, Zaiwang Gu, Jianxin Pang, Youjun Xiong
-
Publication number: 20210200190
Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting at least a two-dimensional image of a to-be-imitated object; obtaining two-dimensional coordinates of each key point of the to-be-imitated object in the two-dimensional image and a pairing relationship between the key points of the to-be-imitated object; converting the two-dimensional coordinates of the key points of the to-be-imitated object in the two-dimensional image into space three-dimensional coordinates corresponding to the key points of the to-be-imitated object through a pre-trained first neural network model; and generating an action control instruction of a robot based on the space three-dimensional coordinates corresponding to the key points of the to-be-imitated object and the pairing relationship between the key points, where the action control instruction is for controlling the robot to imitate an action of the to-be-imitated object.
Type: Application
Filed: December 8, 2020
Publication date: July 1, 2021
Inventors: Miaochen Guo, Jingtao Zhang, Dong Wang, Shuping Hu, Jianxin Pang, Youjun Xiong
-
Publication number: 20210181747
Abstract: A robot climbing control method is disclosed. A gravity direction vector in a camera coordinate system of a robot is obtained. A stair edge of stairs in a scene image is obtained and an edge direction vector of the stair edge in the camera coordinate system is determined. A position parameter of the robot relative to the stairs is determined according to the gravity direction vector and the edge direction vector. Poses of the robot are adjusted according to the position parameter to control the robot to climb the stairs.
Type: Application
Filed: December 7, 2020
Publication date: June 17, 2021
Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
-
Publication number: 20210170580
Abstract: The present disclosure provides an action imitation method as well as a robot and a computer-readable storage medium using the same. The method includes: collecting a plurality of action images of a to-be-imitated object; processing the action images through a pre-trained convolutional neural network to obtain a position coordinate set of position coordinates of a plurality of key points of each of the action images; calculating a rotational angle of each of the linkages of the to-be-imitated object based on the position coordinate sets of the action images; and controlling a robot to move according to the rotational angle of each of the linkages of the to-be-imitated object. In the above-mentioned manner, the rotational angle of each linkage of the to-be-imitated object can be obtained by just analyzing and processing images collected by an ordinary camera, without the help of a high-precision depth camera.
Type: Application
Filed: December 4, 2020
Publication date: June 10, 2021
Inventors: Miaochen Guo, Jun Cheng, Jingtao Zhang, Shuping Hu, Dong Wang, Jianxin Pang, Youjun Xiong
-
Publication number: 20210166416
Abstract: A robot climbing control method is disclosed. The method obtains an RGB color image and a depth image of stairs, extracts an outline of a target object of a target step on the stairs from the RGB color image, determines relative position information of the robot and the target step according to the depth image and the outline of the target object, and controls the robot to climb the target step according to the relative position information. The embodiment of the present disclosure allows the robot to effectively adjust its posture and forward direction on non-standardized stairs of any size and avoids deviation of the walking direction, thereby improving the effectiveness and safety of the robot's stair climbing.
Type: Application
Filed: November 30, 2020
Publication date: June 3, 2021
Inventors: Shuping Hu, Jun Cheng, Jingtao Zhang, Miaochen Guo, Dong Wang, Jianxin Pang, Youjun Xiong
-
Patent number: 9760022
Abstract: A fine-motion module for use in a wafer stage of a photolithography tool includes: a base (201); a fine-motion plate (210); a plurality of vertical motors (203), fixed between the base and the fine-motion plate; a plurality of gravity compensators (202), each having one end fixed on the base and the other end configured to support the fine-motion plate; a plurality of absolute-position sensors (205, 211), configured to measure an absolute position of the fine-motion plate and to adjust pressures in the gravity compensators based on the obtained absolute-position measurements such that the absolute position of the fine-motion plate is changed to a predetermined initial vertical position; and a plurality of relative-position sensors (204, 207), configured to measure a relative position of the fine-motion plate to the base and to control the fine-motion plate based on the obtained relative-position measurements, thereby moving the fine-motion plate to a relative zero position.
Type: Grant
Filed: February 12, 2015
Date of Patent: September 12, 2017
Assignee: SHANGHAI MICRO ELECTRONICS EQUIPMENT (GROUP) CO., LTD.
Inventors: Feihong Liao, Yuebin Zhu, Haili Jia, Shuping Hu
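The abstract describes a two-stage positioning scheme: first the gravity-compensator pressure is adjusted from absolute-position readings until the plate sits at a preset initial vertical position, then the vertical motors servo the plate to the relative zero position using relative-position readings. The sketch below mimics that sequence against a trivially simulated plate; every interface name, gain, and the simulated plant itself are hypothetical stand-ins, not the module's real control law.

```python
def settle_fine_motion_plate(read_abs_pos, read_rel_pos, set_pressure, drive_motors,
                             target_abs=0.0, gain_p=0.5, tol=1e-4, max_iters=200):
    """Two-stage positioning sketch for the fine-motion plate.

    Stage 1: adjust gravity-compensator pressure from absolute-position
    readings until the plate reaches the initial vertical position.
    Stage 2: drive the vertical motors from relative-position readings
    toward the relative zero position. All callables and gains are
    hypothetical placeholders for the real hardware interfaces.
    """
    pressure = 0.0
    for _ in range(max_iters):                      # stage 1: pressure servo
        err = target_abs - read_abs_pos()
        if abs(err) < tol:
            break
        pressure += gain_p * err
        set_pressure(pressure)
    for _ in range(max_iters):                      # stage 2: motor servo to zero
        rel = read_rel_pos()
        if abs(rel) < tol:
            break
        drive_motors(-gain_p * rel)

class FakePlate:
    """Trivial simulated plate: position follows pressure and motor commands."""
    def __init__(self):
        self.abs_pos = -0.01   # meters below the desired initial position
        self.rel_pos = 0.002   # meters away from the relative zero position
    def read_abs(self):
        return self.abs_pos
    def read_rel(self):
        return self.rel_pos
    def set_pressure(self, p):
        self.abs_pos = -0.01 + p   # pressure lifts the plate one-to-one
    def drive(self, cmd):
        self.rel_pos += cmd

# Toy usage: settle the simulated plate and print the residual errors.
plate = FakePlate()
settle_fine_motion_plate(plate.read_abs, plate.read_rel, plate.set_pressure, plate.drive)
print(round(plate.abs_pos, 5), round(plate.rel_pos, 5))
```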
-
Publication number: 20170017168
Abstract: A fine-motion module for use in a wafer stage of a photolithography tool includes: a base (201); a fine-motion plate (210); a plurality of vertical motors (203), fixed between the base and the fine-motion plate; a plurality of gravity compensators (202), each having one end fixed on the base and the other end configured to support the fine-motion plate; a plurality of absolute-position sensors (205, 211), configured to measure an absolute position of the fine-motion plate and to adjust pressures in the gravity compensators based on the obtained absolute-position measurements such that the absolute position of the fine-motion plate is changed to a predetermined initial vertical position; and a plurality of relative-position sensors (204, 207), configured to measure a relative position of the fine-motion plate to the base and to control the fine-motion plate based on the obtained relative-position measurements, thereby moving the fine-motion plate to a relative zero position.
Type: Application
Filed: February 12, 2015
Publication date: January 19, 2017
Applicant: Shanghai Micro Electronics Equipment Co., Ltd.
Inventors: Feihong Liao, Yuebin Zhu, Haili Jia, Shuping Hu