Patents by Inventor Youjun Xiong

Youjun Xiong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11372414
    Abstract: A robotic motion control method provided by the present disclosure includes: obtaining, through a positioning sensor, a position and orientation of the starting point where the robot is currently located, and obtaining a position and orientation of a preset target point to which the robot is to be moved; determining an arc path and a straight path of the robot according to the position and orientation of the starting point, the position and orientation of the preset target point, and a preset arc radius; and moving the robot to the preset target point according to the determined arc path and straight path. Because the robot's movement consists only of simple pure circular motion and pure linear motion, the precision of the robot's motion control is improved and the robot can reach the target position reliably.
    Type: Grant
    Filed: March 12, 2020
    Date of Patent: June 28, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Xiangbin Huang, Musen Zhang, Wenzhi Xu, Youjun Xiong
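The entry above reduces the robot's path to a circular arc followed by a straight segment. The sketch below shows one common way such an arc-then-straight plan can be computed in 2D; it is illustrative only, and the function name, turn-direction handling, and tangent construction are assumptions rather than details taken from the patent.

```python
import math

def plan_arc_then_line(start_xy, start_yaw, target_xy, radius, left_turn=True):
    """Illustrative arc-then-straight planner: follow a circular arc of the
    preset radius until the heading points at the target, then drive straight."""
    x0, y0 = start_xy
    side = 1.0 if left_turn else -1.0
    # Center of the turning circle, perpendicular to the current heading.
    cx = x0 - side * radius * math.sin(start_yaw)
    cy = y0 + side * radius * math.cos(start_yaw)
    dx, dy = target_xy[0] - cx, target_xy[1] - cy
    d = math.hypot(dx, dy)
    if d <= radius:
        raise ValueError("target lies inside the turning circle; no tangent exists")
    phi = math.atan2(dy, dx)            # direction from circle center to target
    alpha = math.acos(radius / d)       # offset from that direction to the tangent point
    beta_exit = phi - side * alpha      # tangent point angle on the circle
    beta_start = math.atan2(y0 - cy, x0 - cx)
    sweep = (side * (beta_exit - beta_start)) % (2.0 * math.pi)   # arc angle travelled
    tangent_point = (cx + radius * math.cos(beta_exit),
                     cy + radius * math.sin(beta_exit))
    return (cx, cy), sweep, tangent_point   # arc center, arc sweep, start of straight segment
```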
  • Patent number: 11373443
    Abstract: The present disclosure provides a method and an apparatus for face recognition and a computer readable storage medium. The method includes: inputting a to-be-recognized blurry face image into a generator of a trained generative adversarial network to obtain a to-be-recognized clear face image; inputting the to-be-recognized clear face image into a feature extraction network to obtain a facial feature of the to-be-recognized clear face image; matching the facial feature of the to-be-recognized clear face image with each user facial feature in a preset facial feature database to determine the user facial feature best matching the to-be-recognized clear face image as a target user facial feature; and determining a user associated with the target user facial feature as a recognition result. Through this solution, the accuracy of the recognition of blurry faces can be improved.
    Type: Grant
    Filed: November 27, 2020
    Date of Patent: June 28, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Yue Wang, Jun Cheng, Yepeng Liu, Yusheng Zeng, Jianxin Pang, Youjun Xiong
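A minimal sketch of the recognition pipeline described in this abstract, assuming `generator` and `feature_extractor` are pre-trained callables and `feature_db` maps user IDs to feature vectors; all names are placeholders, and the cosine-similarity matching rule is a common choice that the abstract does not specify.

```python
import numpy as np

def recognize(blurry_face, generator, feature_extractor, feature_db):
    """Deblur with a GAN generator, extract a facial feature, then match it
    against a preset facial feature database (illustrative sketch only)."""
    clear_face = generator(blurry_face)          # GAN generator yields a clear face image
    feat = np.asarray(feature_extractor(clear_face), dtype=float)
    feat = feat / np.linalg.norm(feat)
    best_user, best_score = None, -1.0
    for user, db_feat in feature_db.items():
        db_feat = np.asarray(db_feat, dtype=float)
        score = float(np.dot(feat, db_feat / np.linalg.norm(db_feat)))
        if score > best_score:                   # keep the best-matching user feature
            best_user, best_score = user, score
    return best_user, best_score                 # recognition result and its similarity
```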
  • Publication number: 20220194500
    Abstract: A stepping down trajectory planning method as well as a robot using the same and a computer readable storage medium are provided. The method includes: dividing a stepping down process of the robot into a plurality of planned stages; adjusting a start position of a swing leg of the robot according to an ankle-to-heel distance, where the ankle-to-heel distance is a horizontal distance between an ankle joint of the swing leg of the robot and a heel of the swing leg of the robot; determining an initial state and an end state of the swing leg in each of the planned stages according to the start position; and obtaining a planned trajectory of the swing leg by performing a curve fitting on the swing leg in each of the planned stages based on the initial state and the end state.
    Type: Application
    Filed: December 27, 2021
    Publication date: June 23, 2022
    Inventors: Hongge Wang, Ligang Ge, Yizhang Liu, Jie Bai, Chunyu Chen, Xingxing Ma, Jiangchen Zhou, Youjun Xiong
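The abstract leaves the curve-fitting form open; a cubic polynomial per planned stage, fitted to the initial and end state (position and velocity) of the swing leg, is one common choice and is what the sketch below assumes. Variable names and the cubic form are assumptions, not details from the application.

```python
import numpy as np

def fit_stage(p0, v0, p1, v1, T):
    """Cubic coefficients for one planned stage from its boundary states:
    p(t) = a0 + a1*t + a2*t^2 + a3*t^3 with p(0)=p0, p'(0)=v0, p(T)=p1, p'(T)=v1."""
    A = np.array([[1.0, 0.0, 0.0,    0.0],
                  [0.0, 1.0, 0.0,    0.0],
                  [1.0, T,   T**2,   T**3],
                  [0.0, 1.0, 2.0*T,  3.0*T**2]])
    return np.linalg.solve(A, np.array([p0, v0, p1, v1], dtype=float))

def swing_trajectory(stages):
    """stages: list of (p0, v0, p1, v1, duration); returns coefficients per stage."""
    return [fit_stage(*s) for s in stages]
```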
  • Publication number: 20220193899
    Abstract: A pose control method for a robot includes: estimating a first set of joint angular velocities of all joints of the robot according to a balance control algorithm; estimating a second set of joint angular velocities of all joints of the robot according to a momentum planning algorithm; estimating a third set of joint angular velocities of all joints of the robot according to a pose return-to-zero algorithm; and performing pose control on the robot according to the first set of joint angular velocities, the second set of joint angular velocities, and the third set of joint angular velocities.
    Type: Application
    Filed: September 29, 2021
    Publication date: June 23, 2022
    Inventors: Hongge Wang, Chunyu Chen, Yizhang Liu, Ligang Ge, Jie Bai, Xingxing Ma, Jiangchen Zhou, Youjun Xiong
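The abstract states that all three estimated sets of joint angular velocities are used for pose control but does not say how they are combined. The sketch below assumes a simple weighted sum followed by integration of the joint angles; the weights and names are illustrative only.

```python
import numpy as np

def pose_control_step(q, dq_balance, dq_momentum, dq_return_to_zero, dt,
                      weights=(1.0, 1.0, 1.0)):
    """Combine the three joint-velocity estimates into one command and
    integrate the joint angles over one control step (assumed combination)."""
    w1, w2, w3 = weights
    dq = (w1 * np.asarray(dq_balance)
          + w2 * np.asarray(dq_momentum)
          + w3 * np.asarray(dq_return_to_zero))
    return np.asarray(q, dtype=float) + dq * dt   # next joint-angle command
```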
  • Publication number: 20220193902
    Abstract: A total centroid state estimation method as well as a humanoid robot and a computer readable storage medium using the same are provided. The method includes: obtaining a motion state of each real joint of the humanoid robot and a motion state of its floating base, where the floating base is equivalent to a plurality of sequentially connected virtual joints; calculating, in sequence, a joint position, a centroid position, and a rotation matrix of each link in the world coordinate system using the chain rule of homogeneous multiplication, according to the position of the joint corresponding to the link, so as to solve a Jacobian matrix of the centroid of the link; solving a total centroid Jacobian matrix based on the Jacobian matrix of the centroid of each link and the total mass; and calculating the total centroid velocity based on the total centroid Jacobian matrix and other parameters.
    Type: Application
    Filed: September 25, 2021
    Publication date: June 23, 2022
    Inventors: Xiaozhu Ju, Yuesong Wang, Mingguo Zhao, Youjun Xiong
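The final two steps of the abstract combine per-link centroid Jacobians into a total centroid Jacobian and a total centroid velocity. A standard mass-weighted formulation consistent with that description is sketched below; array shapes and names are assumptions.

```python
import numpy as np

def total_centroid_velocity(link_masses, link_centroid_jacobians, qdot):
    """Total centroid Jacobian as the mass-weighted average of per-link
    centroid Jacobians, then v_com = J_com @ qdot (illustrative sketch)."""
    M = float(sum(link_masses))                      # total mass of the robot
    J_com = sum(m * np.asarray(J) for m, J in
                zip(link_masses, link_centroid_jacobians)) / M
    return J_com, J_com @ np.asarray(qdot)           # total centroid Jacobian and velocity
```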
  • Patent number: 11367456
    Abstract: The present disclosure provides a streaming voice conversion method as well as an apparatus and a computer readable storage medium using the same. The method includes: obtaining to-be-converted voice data; partitioning the to-be-converted voice data, in the order in which the data is obtained, into a plurality of to-be-converted partition voices, where each to-be-converted partition voice carries a partition mark; performing a voice conversion on each of the to-be-converted partition voices to obtain a converted partition voice, where the converted partition voice carries a partition mark; performing a partition restoration on each of the converted partition voices to obtain a restored partition voice, where the restored partition voice carries a partition mark; and outputting each of the restored partition voices according to the partition mark carried by the restored partition voice. In this manner, the response time is shortened, and the conversion speed is improved.
    Type: Grant
    Filed: December 3, 2020
    Date of Patent: June 21, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Jiebin Xie, Ruotong Wang, Dongyan Huang, Zhichao Tang, Yang Liu, Youjun Xiong
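A minimal streaming loop following the steps in this abstract: partition the incoming data in arrival order, convert and restore each partition, and emit results tagged with their partition mark. The `convert` and `restore` callables and the fixed partition size are assumptions for illustration, not the patented implementation.

```python
def stream_convert(voice_stream, partition_size, convert, restore):
    """Yield (partition_mark, restored_partition_voice) in data-obtaining order."""
    buffer, mark = [], 0
    for sample in voice_stream:
        buffer.append(sample)
        if len(buffer) == partition_size:
            converted = convert(buffer, mark)        # converted partition carries the mark
            yield mark, restore(converted, mark)     # restored partition output by mark
            buffer, mark = [], mark + 1
    if buffer:                                       # flush the final partial partition
        yield mark, restore(convert(buffer, mark), mark)
```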
  • Publication number: 20220184808
    Abstract: A motion trajectory planning method for a robotic manipulator having a visual inspection system includes: in response to a command instruction, obtaining environmental data collected by the visual inspection system; determining an initial DS model motion trajectory of the robotic manipulator according to the command instruction, the environmental data, and a preset teaching motion DS model library, wherein the teaching motion DS model library includes at least one DS model motion trajectory generated based on human teaching activities; and correcting the initial DS model motion trajectory to obtain a desired motion trajectory of the robotic manipulator, at least based on a result of determining whether, among first objects included in the environmental data, there is an obstacle whose pose is on the initial DS model motion trajectory.
    Type: Application
    Filed: December 31, 2021
    Publication date: June 16, 2022
    Inventors: Dake Zheng, Yizhang Liu, Jianxin Pang, Huan Tan, Youjun Xiong
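A flow-level sketch of the decision logic in this abstract: pick a taught DS-model trajectory for the command, then correct it if an observed object's pose lies on that trajectory. Every helper name here (`generate`, `is_on_trajectory`, `correct_for_obstacle`, `environment.objects`) is a placeholder, not a module from the application.

```python
def plan_manipulator_trajectory(command, environment, ds_model_library,
                                is_on_trajectory, correct_for_obstacle):
    """Return a desired trajectory: initial DS-model trajectory, corrected
    if an obstacle's pose lies on it (illustrative structure only)."""
    trajectory = ds_model_library[command].generate(environment)   # initial DS trajectory
    for obj in environment.objects:                                # objects from vision data
        if is_on_trajectory(obj.pose, trajectory):                 # obstacle on the path?
            trajectory = correct_for_obstacle(trajectory, obj)     # correct the trajectory
    return trajectory
```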
  • Publication number: 20220189454
    Abstract: A computer-implemented method for speech synthesis, a computer device, and a non-transitory computer readable storage medium are provided. The method includes: obtaining a speech text to be synthesized; obtaining a Mel spectrum corresponding to the speech text to be synthesized according to the speech text to be synthesized; inputting the Mel spectrum into a complex neural network, and obtaining a complex spectrum corresponding to the speech text to be synthesized, wherein the complex spectrum comprises real component information and imaginary component information; and obtaining a synthetic speech corresponding to the speech text to be synthesized, according to the complex spectrum. The method can efficiently and simply complete speech synthesis.
    Type: Application
    Filed: December 10, 2020
    Publication date: June 16, 2022
    Inventors: Dongyan Huang, Leyuan Sheng, Youjun Xiong
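The synthesis pipeline in this abstract maps text to a Mel spectrum, the Mel spectrum to a complex spectrum (real and imaginary parts), and the complex spectrum to a waveform. The sketch below mirrors that flow with the trained networks and the inverse STFT passed in as assumed callables.

```python
import numpy as np

def synthesize(text, acoustic_model, complex_network, istft):
    """Text -> Mel spectrum -> complex spectrum -> speech waveform
    (illustrative pipeline; the three callables are stand-ins)."""
    mel = acoustic_model(text)                        # Mel spectrum for the speech text
    real_part, imag_part = complex_network(mel)       # complex spectrum components
    complex_spec = np.asarray(real_part) + 1j * np.asarray(imag_part)
    return istft(complex_spec)                        # synthesized speech
```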
  • Publication number: 20220184807
    Abstract: The present disclosure provides a gait control method, device, apparatus, and storage medium for humanoid robots. The method includes: obtaining a first vector from a virtual centroid of the humanoid robot to an ankle joint of its left leg at a current moment and a second vector from the virtual centroid to an ankle joint of its right leg at the current moment, and obtaining an original planning value of the virtual centroid of the humanoid robot at the current moment; determining a target virtual centroid height of the humanoid robot after the virtual centroid is reduced at the current moment according to the first vector, the second vector, the original planning value of the virtual centroid, and a preset virtual centroid height reduction algorithm; and controlling the humanoid robot to walk on straight knees according to the target virtual centroid height.
    Type: Application
    Filed: December 30, 2020
    Publication date: June 16, 2022
    Inventors: Jie Bai, Chunyu Chen, Ligang Ge, Yizhang Liu, Youjun Xiong
  • Patent number: 11353887
    Abstract: The present disclosure provides a robot centroid position adjustment method as well as an apparatus and a robot using the same. The method includes: obtaining initial values; obtaining a waist velocity adjustment value; calculating a current value of the centroid position; and determining whether the current value of the centroid position is equal to the planning value of the centroid position; if the current value of the centroid position is not equal to the planning value of the centroid position, obtaining the current value of the centroid position to take as the initial value of the centroid position and returning to the step of obtaining the waist velocity adjustment value until the current value of the centroid position is equal to the planning value of the centroid position. In such a manner, the balance ability of the robot can be improved.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: June 7, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Youjun Xiong, Yuesong Wang, Mingguo Zhao
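The loop structure in this abstract can be sketched as follows: keep obtaining a waist velocity adjustment and updating the centroid position until the current value reaches the planning value. The two callables, the tolerance, and the iteration cap are assumptions added for a runnable illustration.

```python
import numpy as np

def adjust_centroid(initial_value, planned_value, waist_velocity_adjustment,
                    apply_adjustment, tol=1e-4, max_iters=1000):
    """Iterate waist-velocity adjustments until the centroid position
    matches its planning value (illustrative sketch)."""
    current = np.asarray(initial_value, dtype=float)
    planned = np.asarray(planned_value, dtype=float)
    for _ in range(max_iters):
        if np.linalg.norm(current - planned) < tol:   # "equal" within a tolerance
            break
        dv = waist_velocity_adjustment(current, planned)
        current = apply_adjustment(current, dv)       # new value becomes the initial value
    return current
```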
  • Patent number: 11346648
    Abstract: The present invention provides a rotation angle detection method and device thereof. The method includes calculating an estimated value of a rotation angle of a motor shaft during rotation according to a second rotation angle; determining an actual range of the rotation angle according to the estimated value of the rotation angle and a detection error of the second angle sensor; determining optional values of the rotation angle based on a relative relationship between a first rotation angle and the estimated value; determining an actual rotation angle of the motor shaft, based on a value falling within the actual range of the rotation angle among the optional values, and determining an actual rotation angle of the output shaft according to the actual rotation angle of the motor shaft. The present invention can improve the measurement accuracy of the rotation angles of the output shaft of the rotating mechanism.
    Type: Grant
    Filed: December 29, 2018
    Date of Patent: May 31, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Xi Bai, Wenhua Fan, Malin Wang, Youjun Xiong
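One illustrative reading of this abstract, for a motor shaft geared to an output shaft with two angle sensors: the second (output-shaft) sensor scaled by the gear ratio gives a coarse estimate and its error band, and the first (motor-shaft) sensor reading plus a multiple of 360 degrees picks out the value inside that band. The gear-ratio interpretation and all names are assumptions, not details from the patent.

```python
import math

def resolve_motor_angle(first_angle, second_angle, gear_ratio, second_error):
    """Return (motor_shaft_angle, output_shaft_angle) in degrees
    (illustrative reading of the two-sensor disambiguation)."""
    estimate = second_angle * gear_ratio            # estimated motor-shaft rotation angle
    lo = estimate - second_error * gear_ratio       # actual range given the sensor error
    hi = estimate + second_error * gear_ratio
    k_lo = math.ceil((lo - first_angle) / 360.0)    # optional values: first_angle + 360*k
    k_hi = math.floor((hi - first_angle) / 360.0)
    candidates = [first_angle + 360.0 * k for k in range(k_lo, k_hi + 1)]
    motor_angle = candidates[0] if candidates else estimate   # ideally exactly one candidate
    return motor_angle, motor_angle / gear_ratio
```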
  • Publication number: 20220152827
    Abstract: A biped robot gait control method as well as a robot and a computer readable storage medium are provided. During the movement, the system obtains a current supporting pose of a current supporting leg of the biped robot, and calculates a relative pose between the supporting legs based on the current supporting pose and a preset ideal supporting pose of a next step. The system further calculates modified gait parameters of the next step based on the relative pose between the two supporting legs and a joint distance between left and right ankle joints in an initial state of the biped robot when standing. Finally, the system controls the next supporting leg to move according to the modified gait parameters.
    Type: Application
    Filed: May 6, 2021
    Publication date: May 19, 2022
    Inventors: Xingxing Ma, Chunyu Chen, Ligang Ge, Hongge Wang, Mingqiang Huang, Jiangchen Zhou, Yizhang Liu, Zheng Xie, Youjun Xiong
  • Publication number: 20220156534
    Abstract: A target object detection model is provided. The target object detection model includes a YOLOv3-Tiny sub-model. Through the target object detection model, low-level information in the YOLOv3-Tiny sub-model can be merged with high-level information therein, so that the low-level and high-level information are fused. Since the low-level information can be further used, the comprehensiveness of target detection is effectively improved, as is the detection of small targets.
    Type: Application
    Filed: July 30, 2021
    Publication date: May 19, 2022
    Inventors: Yonghui Cai, Jun Cheng, Jianxin Pang, Youjun Xiong
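A common way to realize the low-level/high-level fusion the abstract describes is to upsample the high-level feature map and concatenate it with the low-level one along the channel axis; the sketch below shows that operation only, with assumed (C, H, W) shapes. The exact fusion in the application may differ.

```python
import numpy as np

def fuse_features(low_level, high_level):
    """Concatenate a low-level feature map with a 2x-upsampled high-level map.
    low_level: (C1, H, W); high_level: (C2, H/2, W/2)."""
    up = high_level.repeat(2, axis=1).repeat(2, axis=2)   # nearest-neighbor upsample
    return np.concatenate([low_level, up], axis=0)        # fused (C1+C2, H, W) features
```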
  • Patent number: 11331806
    Abstract: The present disclosure discloses a robot control method as well as an apparatus, and a robot using the same. The method includes: obtaining a human pose image; obtaining pixel information of preset key points in the human pose image; obtaining three-dimensional positional information of key points of a human arm according to the pixel information of the preset key points; obtaining a robotic arm kinematics model of a robot; obtaining an angle of each joint in the robotic arm kinematics model according to the three-dimensional positional information of the key points of the human arm and the robotic arm kinematics model; and controlling an arm of the robot to perform a corresponding action according to the angle of each joint. The control method does not require a three-dimensional stereo camera to collect three-dimensional coordinates of a human body, which reduces the cost to a certain extent.
    Type: Grant
    Filed: April 9, 2020
    Date of Patent: May 17, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Zecai Lin, Miaochen Guo, Yizhang Liu, Youjun Xiong, Jianxin Pang
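As a small example of turning 3D arm key points into a joint angle, the elbow flexion angle is the angle between the upper-arm and forearm vectors. The full method maps all key points through the robotic arm kinematics model; the sketch below covers only this geometric core, under assumed key-point names.

```python
import numpy as np

def elbow_angle(shoulder, elbow, wrist):
    """Elbow flexion angle (radians) from 3D key-point positions."""
    upper = np.asarray(shoulder, dtype=float) - np.asarray(elbow, dtype=float)
    fore = np.asarray(wrist, dtype=float) - np.asarray(elbow, dtype=float)
    cosang = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))   # angle between the two segments
```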
  • Patent number: 11331802
    Abstract: A method for controlling an arm of a robot to imitate a human arm, includes: acquiring first pose information of key points of a human arm to be imitated; converting the first pose information into second pose information of key points of an arm of a robot; determining an angle value of each joint of the arm according to inverse kinematics of the arm based on the second pose information; and controlling the arm to move according to the angle values.
    Type: Grant
    Filed: January 5, 2020
    Date of Patent: May 17, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Zecai Lin, Zhaohui An, Zheng Xie, Yizhang Liu, Youjun Xiong, Jianxin Pang
  • Publication number: 20220143841
    Abstract: A robotic hand includes a baseplate, a finger having multiple phalanges that are rotatably coupled to one another, a first of the phalanges having a first end rotatably coupled to the baseplate and a second end and a second of the phalanges rotatably coupled to the second end about an axis of rotation, an actuating mechanism mounted on the baseplate, the actuating mechanism configured to actuate rotation of the plurality of phalanges, and a tendon having opposite ends that are respectively attached to the second of the phalanges and the baseplate. The second of the phalanges has an engagement portion arranged around the axis of rotation, and the tendon is wrapped around a portion of the engagement portion to generate a force acting on the second end of the first of the phalanges, causing the first of the phalanges to rotate from a flexed state to an extended state.
    Type: Application
    Filed: November 12, 2020
    Publication date: May 12, 2022
    Inventors: Won Suk You, Chengkun Zhang, Huan Tan, Youjun Xiong
  • Patent number: 11325264
    Abstract: A robotic hand includes a baseplate, a finger having multiple phalanges that are rotatably coupled to one another, a first of the phalanges having a first end rotatably coupled to the baseplate and a second end and a second of the phalanges rotatably coupled to the second end about an axis of rotation, an actuating mechanism mounted on the baseplate, the actuating mechanism configured to actuate rotation of the plurality of phalanges, and a tendon having opposite ends that are respectively attached to the second of the phalanges and the baseplate. The second of the phalanges has an engagement portion arranged around the axis of rotation, and the tendon is wrapped around a portion of the engagement portion to generate a force acting on the second end of the first of the phalanges, causing the first of the phalanges to rotate from a flexed state to an extended state.
    Type: Grant
    Filed: November 12, 2020
    Date of Patent: May 10, 2022
    Assignees: UBTECH NORTH AMERICA RESEARCH AND DEVELOPMENT CENTER CORP, UBTECH ROBOTICS CORP LTD
    Inventors: Won Suk You, Chengkun Zhang, Huan Tan, Youjun Xiong
  • Patent number: 11325247
    Abstract: The present disclosure provides a robotic arm control method as well as an apparatus and a terminal device using the same. The method includes: obtaining a current joint angle of each of M joints of the robotic arm; obtaining a reference included angle based on the current joint angle of each of the M joints of the robotic arm; determining an expected included angle corresponding to the robotic arm within a target angle range based on the reference included angle and a preset evaluation function related to the included angle; and controlling the robotic arm based on target joint angles of the M joints.
    Type: Grant
    Filed: March 12, 2020
    Date of Patent: May 10, 2022
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Zecai Lin, Zhaohui An, Yizhang Liu, Meihui Zhang, Youjun Xiong, Jianxin Pang
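The abstract selects an expected included angle inside a target range using a preset evaluation function; a simple grid search over that range, shown below, is one way to illustrate the selection step. The `evaluate` callable and the grid resolution are assumptions.

```python
import numpy as np

def choose_expected_angle(reference_angle, target_range, evaluate, steps=100):
    """Pick the included angle in [lo, hi] that scores best under the
    evaluation function (illustrative grid search)."""
    lo, hi = target_range
    candidates = np.linspace(lo, hi, steps)
    scores = [evaluate(a, reference_angle) for a in candidates]   # evaluate each candidate
    return float(candidates[int(np.argmax(scores))])              # expected included angle
```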
  • Publication number: 20220139027
    Abstract: A scene data obtaining method as well as a model training method and a computer readable storage medium using the same are provided. The method includes: building a virtual simulation scene corresponding to an actual scene, where the virtual simulation scene is three-dimensional; determining a view frustum corresponding to preset view angles in the virtual simulation scene; collecting one or more two-dimensional images in the virtual simulation scene and ground truth object data associated with the one or more two-dimensional images using the view frustum corresponding to the preset view angles; and using all the two-dimensional images and the ground truth object data associated with the one or more two-dimensional images as scene data corresponding to the actual scene. In this manner, the data collection does not require manual annotation, and the obtained data can be used for training deep learning-based perceptual models.
    Type: Application
    Filed: March 30, 2021
    Publication date: May 5, 2022
    Inventors: Xi Luo, Mingguo Zhao, Youjun Xiong
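The collection loop in this abstract can be sketched as follows: for each preset view angle, place a view frustum in the 3D simulation scene, render a 2D image, and pair it with automatically generated ground-truth object data, so no manual annotation is needed. The simulator hooks (`camera_frustum`, `render`, `get_ground_truth`) are assumed names for illustration.

```python
def collect_scene_data(sim_scene, view_angles, render, get_ground_truth):
    """Return (image, ground_truth) pairs collected from the virtual scene."""
    samples = []
    for angle in view_angles:
        frustum = sim_scene.camera_frustum(angle)        # frustum for this preset view angle
        image = render(sim_scene, frustum)               # 2D image seen through the frustum
        labels = get_ground_truth(sim_scene, frustum)    # objects associated with the image
        samples.append((image, labels))
    return samples                                       # scene data for training perceptual models
```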
  • Patent number: 11320088
    Abstract: A display stand includes a frame, an actuated rotary mechanism coupled to the frame, a counterweight coupled to the actuated rotary mechanism and movable in a vertical direction during rotation of the display about the axis of rotation, and an elevation mechanism. The actuated rotary mechanism includes a display holder that is configured to mount a display to the frame and rotate the display about an axis of rotation. The counterweight and the display are located at opposite sides of a vertical plane that passes through the axis of rotation such that a combined center of mass of the display, the display holder, and the counterweight lies on the vertical plane. The frame is coupled to the elevation mechanism, and the elevation mechanism is configured to move the frame up and down.
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: May 3, 2022
    Assignees: UBTECH NORTH AMERICA RESEARCH AND DEVELOPMENT CENTER CORP, UBTECH ROBOTICS CORP LTD
    Inventors: Houzhu Ding, Chengkun Zhang, Huan Tan, Youjun Xiong