Patents by Inventor Huan Tan

Huan Tan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12384037
    Abstract: A control system includes a task manager that can determine capability requirements to perform a task on a target object. The task has an associated series of sub-tasks, with the sub-tasks having one or more capability requirements. The task manager selects and assigns a first sequence of sub-tasks to a first robotic machine that has a first set of capabilities and operates according to a first mode of operation. The task manager selects and assigns a second sequence of sub-tasks to a second robotic machine that has a second set of capabilities and operates according to a second mode of operation. The task manager selects the first robotic machine based on the first set of capabilities and the first mode of operation, and selects the second robotic machine based on the second set of capabilities and the second mode of operation.
    Type: Grant
    Filed: April 22, 2022
    Date of Patent: August 12, 2025
    Assignee: Transportation IP Holdings, LLC
    Inventors: Huan Tan, John Michael Lizzi, Charles Burton Theurer, Balajee Kannan, Romano Patrick, Mark Bachman, Michael VanderLinden, Mark Bradshaw Kraeling, Norman Wellings
  • Patent number: 12364780
    Abstract: A navigation method for a disinfection robot includes: obtaining an initial map within a field of view of the robot; generating a disinfection grid having multiple disinfection squares on the initial map, and determining multiple disinfecting consideration points in the disinfection squares; determining a disinfection order of the disinfecting targets corresponding to the disinfection squares according to distances between the robot, after it completes disinfection of one of the targets, and the to-be-disinfected disinfection squares; determining disinfecting planning points in the disinfection squares according to distances between the robot's position and the disinfecting consideration points in the to-be-disinfected disinfection squares, combined with priority levels of those consideration points; and performing disinfection on the targets corresponding to the disinfection squares according to the disinfection order.
    Type: Grant
    Filed: March 13, 2023
    Date of Patent: July 22, 2025
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Zhanjia Bi, Youjun Xiong, Huan Tan
  • Publication number: 20250224235
    Abstract: Path planning for a mobile machine in large-scale navigation is disclosed. A path for moving the mobile machine is planned by: determining a start map node in a map graph based on a start point in the path and a goal map node in the map graph based on a goal point in the path; determining whether the start map node and the goal map node correspond to the same submap; and if so, planning the path between the start point and the goal point using a real-time path planning method; otherwise, obtaining the path between the start point and the goal point by merging a node path between the start map node and the goal map node, a first real-time path between the start point and a first stop point, and a second real-time path between the goal point and a last stop point.
    Type: Application
    Filed: January 9, 2024
    Publication date: July 10, 2025
    Inventors: Huaguang Du, Zhipeng Liu, Fangyun Zhao, Chengkun Zhang, Huan Tan
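The two-level strategy in the abstract above can be sketched as follows. This is a minimal illustration, not the patented method: it assumes a simple adjacency-list map graph, a `submap_of` lookup, and a caller-supplied `local_plan` real-time planner, all of which are hypothetical stand-ins.

```python
from collections import deque

def plan_path(map_graph, submap_of, start_node, goal_node,
              local_plan, start_pt, goal_pt):
    # Same submap: a single real-time plan between the two points suffices.
    if submap_of[start_node] == submap_of[goal_node]:
        return local_plan(start_pt, goal_pt)
    # Otherwise find a node path through the map graph (plain BFS here),
    # then stitch real-time segments onto its two ends.
    prev = {start_node: None}
    queue = deque([start_node])
    while queue:
        node = queue.popleft()
        if node == goal_node:
            break
        for neighbor in map_graph[node]:
            if neighbor not in prev:
                prev[neighbor] = node
                queue.append(neighbor)
    if goal_node not in prev:
        return None  # no route through the map graph
    node_path = []
    node = goal_node
    while node is not None:
        node_path.append(node)
        node = prev[node]
    node_path.reverse()
    first_stop, last_stop = node_path[0], node_path[-1]
    # Merge: start segment + node path + goal segment.
    return (local_plan(start_pt, first_stop)
            + node_path
            + local_plan(last_stop, goal_pt))
```

With a trivial straight-line `local_plan` the merged result is simply the concatenation of the three pieces; a real planner would return dense waypoints for the two end segments.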
  • Publication number: 20250218210
    Abstract: An object tracking method, and a terminal device and a computer-readable storage medium using the same are provided. The method includes: obtaining first feature information of a target human body in a first image and a first detection frame of a head area of the target human body; obtaining second feature information of each human object in a second image and a second detection frame of a head area of the human object by performing a first image detection on the second image; and recognizing the target human body from the human object in the second image according to a first similarity between the first feature information and the second feature information and a second similarity between the first detection frame and the second detection frame. The above-mentioned method can effectively improve the accuracy of object matching, thereby enhancing the reliability of the results of multi-object tracking.
    Type: Application
    Filed: November 29, 2024
    Publication date: July 3, 2025
    Inventors: Shuping Hu, Kan Wang, Pei Dong, Jianxin Pang, Huan Tan
  • Publication number: 20250214251
    Abstract: A palletizing method includes: detecting, by one or more depth cameras, a stacking pattern on a pallet; generating a height map matrix based on the stacking pattern, wherein the value at each element of the height map matrix indicates the height of the stacking pattern at the position on the pallet corresponding to that element; and, for each of one or more boxes to be placed: traversing the elements in the height map matrix to obtain a mask matrix generated based on hypothetical placements in which a target vertex of the box rests on the position corresponding to each element; determining reward function values corresponding to the elements in the mask matrix with an element value of 1; and determining a box number and a box placement posture corresponding to the largest of the reward function values.
    Type: Application
    Filed: September 26, 2024
    Publication date: July 3, 2025
    Inventors: Wenzhou Li, Chunyu Chen, Mingqi Gao, Pengyu Zou, Pei Dong, Huan Tan
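The height-map scoring loop described above can be illustrated with a small sketch. The feasibility check (flat footprint, within a height limit) stands in for the mask matrix, and the reward function here (prefer low, back-left placements) is a hypothetical choice, not the one claimed in the patent.

```python
import numpy as np

def best_placement(height_map, box_w, box_l, box_h, max_h):
    """Return the (row, col) of the box's target vertex with the best reward."""
    rows, cols = height_map.shape
    best, best_pos = -np.inf, None
    for r in range(rows - box_l + 1):
        for c in range(cols - box_w + 1):
            footprint = height_map[r:r + box_l, c:c + box_w]
            base = footprint.max()
            # mask value 1 only if the footprint is flat and the box fits
            # under the height limit; otherwise the position is skipped.
            if (footprint == base).all() and base + box_h <= max_h:
                # Hypothetical reward: lower base height is better, with a
                # small bias toward the back-left corner of the pallet.
                reward = -(base + 0.01 * (r + c))
                if reward > best:
                    best, best_pos = reward, (r, c)
    return best_pos
```

After a box is placed, the footprint region of the height map would be raised by the box height before scoring the next box.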
  • Publication number: 20250209639
    Abstract: An object tracking method, and a terminal device and a computer-readable storage medium using the same are provided. The method includes: obtaining a first filtered image by filtering out the moving object in the i-th image frame, where the moving object is an object in the i-th image frame that has a positional change relative to the object in the (i-1)-th image frame, and i is an integer larger than 1; determining, based on the first filtered image, a pixel mapping relationship between the (i-1)-th image frame and the i-th image frame; and tracking, according to the pixel mapping relationship, the moving object. Through the above-mentioned method, the reliability of the trajectory matching results can be improved, thereby improving the reliability of object tracking.
    Type: Application
    Filed: December 5, 2024
    Publication date: June 26, 2025
    Inventors: Shuping Hu, Kan Wang, Pei Dong, Jianxin Pang, Huan Tan
  • Publication number: 20250209798
    Abstract: A method and an apparatus for training target detection models and an electronic device are provided. The method includes: predicting unlabeled training data through a teacher model and a student model to obtain a first prediction result output by the teacher model and a second prediction result output by the student model; determining a target pseudo-label category to which the first prediction result belongs according to the confidence in the first prediction result; calculating a current pseudo-label loss based on the first prediction result, the second prediction result, and the pseudo-label loss function corresponding to the target pseudo-label category; and updating the student model according to the current pseudo-label loss and updating the teacher model based on the updated student model, and returning to predicting the unlabeled training data through the teacher model and the student model until a preset training end condition is met.
    Type: Application
    Filed: July 24, 2024
    Publication date: June 26, 2025
    Inventors: Pengyu Zou, Shuping Hu, Kan Wang, Pei Dong, Jianxin Pang, Huan Tan
  • Patent number: 12314051
    Abstract: A method for navigating a robot through a limited space includes: obtaining a planned first path for the robot to reach a target position; calculating curvature and/or normal vectors of points in the first path; according to the curvature and a preset curvature threshold, and/or according to the normal vectors and a preset normal vector change threshold, obtaining a first straight path for the robot outside the limited space by curve fitting; determining an intersection of a centerline of the limited space and the first straight path, and determining a second straight path for the robot to move through the limited space according to the centerline and the intersection; and generating a second path for the robot to move through the limited space based on the first straight path and the second straight path, and navigating the robot to move through the limited space according to the second path.
    Type: Grant
    Filed: December 12, 2022
    Date of Patent: May 27, 2025
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Dunhao Liu, Youjun Xiong, Huan Tan
  • Patent number: 12179364
    Abstract: A motion trajectory planning method for a robotic manipulator having a visual inspection system, includes: in response to a command instruction, obtaining environmental data collected by the visual inspection system; determining an initial DS model motion trajectory of the robotic manipulator according to the command instruction, the environmental data, and a preset teaching motion DS model library, wherein the teaching motion DS model library includes at least one DS model motion trajectory generated based on human teaching activities; and at least based on a result of determining whether there is an obstacle, whose pose is on the initial DS model motion trajectory, in a first object included in the environmental data, correcting the initial DS model motion trajectory to obtain a desired motion trajectory of the robotic manipulator.
    Type: Grant
    Filed: December 31, 2021
    Date of Patent: December 31, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Dake Zheng, Yizhang Liu, Jianxin Pang, Huan Tan, Youjun Xiong
  • Patent number: 12172325
    Abstract: A collision detection method, a storage medium, and a robot are provided. The method includes: calculating an external torque of a first joint of the robot based on a preset generalized momentum-based disturbance observer; calculating an external torque of a second joint of the robot based on a preset long short-term memory network; calculating an external torque of a third joint of the robot based on the external torque of the first joint and the external torque of the second joint; and determining whether the robot has collided with an external environment or not based on the external torque of the third joint and a preset collision threshold. In the present disclosure, the component of the model error in the joint external torque calculated by the disturbance observer is eliminated to obtain the accurate contact torque, thereby improving the accuracy of the collision detection.
    Type: Grant
    Filed: December 6, 2022
    Date of Patent: December 24, 2024
    Assignee: UBTECH ROBOTICS CORP LTD
    Inventors: Meihui Zhang, Yizhang Liu, Youjun Xiong, Huan Tan
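The fusion step in the abstract above — subtracting the learned model-error component from the observer's torque estimate and thresholding the residual — can be sketched as below. The long short-term memory network is replaced here by precomputed model-error estimates, a deliberate simplification for illustration.

```python
def detect_collision(tau_observer, tau_model_error, threshold):
    """Threshold the contact torque left after removing the model error.

    tau_observer: per-joint external torque from the disturbance observer.
    tau_model_error: per-joint model-error torque (LSTM output in the patent;
        here a hypothetical precomputed list).
    """
    # The residual approximates the true contact torque at each joint.
    tau_contact = [obs - err for obs, err in zip(tau_observer, tau_model_error)]
    # A collision is flagged if any joint's residual exceeds the threshold.
    collided = any(abs(t) > threshold for t in tau_contact)
    return collided, tau_contact
```

Subtracting the model-error term is what lets the threshold be set tightly: without it, the observer's estimate mixes genuine contact torque with modeling error.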
  • Publication number: 20240416177
    Abstract: A user exercise detection method applicable in a robot includes: obtaining first measurement data from at least one inertial measurement unit (IMU) sensor that is arranged at a designated body part of the user, and detecting a posture of the user relative to the robot based on the first measurement data; obtaining second measurement data from the at least one IMU sensor, and determining whether an exercise of the user corresponding to the posture is detected according to a preset threshold parameter and the second measurement data; in response to detection of the exercise, obtaining exercise data when the user performs the exercise multiple times through the at least one IMU sensor; and adjusting the threshold parameter according to the exercise data.
    Type: Application
    Filed: June 13, 2023
    Publication date: December 19, 2024
    Inventors: Sheryl Suet Ying Chau, Robert Malate, Chengkun Zhang, Huan Tan
  • Patent number: 12099371
    Abstract: Embodiments of the disclosure provide methods and systems for continuous regulation of a nonholonomic mobile robot. An exemplary method may include identifying a current pose of the nonholonomic mobile robot in a world frame, where the current pose is represented by a first set of values defining a first set of states of the nonholonomic mobile robot in the world frame; receiving a final goal pose of the nonholonomic mobile robot, where the final goal pose is represented by a second set of values defining a second set of states of nonholonomic mobile robot in the world frame; determining a moving path for moving the nonholonomic mobile robot from the current pose to the final goal pose; and controlling the nonholonomic mobile robot to move from the current pose to the final goal pose according to the moving path, where the nonholonomic mobile robot moves to the final goal pose by converging the nonholonomic mobile robot from the first set of states to the second set of states simultaneously.
    Type: Grant
    Filed: June 4, 2021
    Date of Patent: September 24, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: Dejun Guo, Huan Tan
  • Patent number: 12093054
    Abstract: A moving target following method, which is executed by one or more processors of a robot that includes a camera and a sensor electrically coupled to the one or more processors, includes: performing a body detection to a body of a target based on images acquired by the camera to obtain a body detection result; performing a leg detection to legs of the target based on data acquired by the sensor to obtain a leg detection result; and fusing the body detection result and the leg detection result to obtain a fusion result, and controlling the robot to follow the target based on the fusion result.
    Type: Grant
    Filed: April 25, 2021
    Date of Patent: September 17, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: Dejun Guo, Ting-Shuo Chou, Yang Shen, Huan Tan
  • Publication number: 20240290097
    Abstract: A method for extracting video features may include: obtaining a target video sequence that comprises a number of video frames; performing video frame feature extraction on the target video sequence to obtain video frame features of each of the video frames; performing feature weight calculation on each of the video frame features to obtain the feature weight of each of the video frame features; wherein the feature weight of each of the video frame features is determined by the video frame features of all of the video frames in the target video sequence; and performing feature weighting on each of the video frame features according to the feature weight of each of the video frame features to obtain video features of the target video sequence.
    Type: Application
    Filed: January 30, 2024
    Publication date: August 29, 2024
    Inventors: Kan Wang, Jianxin Pang, Huan Tan
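The weighting scheme in the abstract above — each frame's weight determined by the features of all frames in the sequence — can be sketched as a softmax pooling step. The scoring rule here (similarity to the mean feature) is a hypothetical choice used only to show the global dependence; the patented method may score frames differently.

```python
import numpy as np

def pool_video_features(frame_feats):
    """Weighted pooling of per-frame features into one video feature.

    frame_feats: array of shape (T, D), one D-dim feature per frame.
    """
    # Each frame's score depends on every frame via the sequence mean.
    mean = frame_feats.mean(axis=0)
    scores = frame_feats @ mean
    # Softmax turns scores into weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted sum of frame features gives the video-level feature.
    return (weights[:, None] * frame_feats).sum(axis=0)
```

Because the weights are normalized, frames whose features agree with the rest of the sequence dominate the pooled feature, while outlier frames are down-weighted.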
  • Publication number: 20240290096
    Abstract: A method for extracting video frame features includes: obtaining a number of initial features of each video frame in a video sequence; calculating global channel attention information of the video sequence based on the initial features of each video frame in the video sequence; calculating local channel attention information of a target video frame according to initial features of a target video frame; wherein the target video frame is one of the video frames in the video sequence; and performing channel attention mechanism processing on the initial features of the target video frame according to the global channel attention information and the local channel attention information to obtain optimized features of the target video frame.
    Type: Application
    Filed: January 25, 2024
    Publication date: August 29, 2024
    Inventors: Kan Wang, Shuping Hu, Jianxin Pang, Huan Tan
  • Patent number: 12051263
    Abstract: Human lying posture detection is disclosed. A human lying on a bed is detected by obtaining an image through a depth camera, detecting objects in the image and marking the objects in the image using 2D bounding boxes by deep learning, determining the human being in a lying posture in response to a width and a height of the 2D bounding box of the human meeting a predetermined condition, detecting one or more skin areas in the image and generating skin area 2D bounding boxes to mark each of the one or more skin areas using a skin detection algorithm, and determining the human being in the lying posture in response to the skin area 2D bounding boxes and the 2D bounding box of the bed meeting a predetermined positional relationship.
    Type: Grant
    Filed: June 30, 2021
    Date of Patent: July 30, 2024
    Assignees: FUTRONICS (NA) CORPORATION, UBTECH ROBOTICS CORP LTD
    Inventors: Chuqiao Dong, Dan Shao, Zhen Xiu, Dejun Guo, Huan Tan
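The two cues in the abstract above — the aspect ratio of the person's box and the positional relationship between the skin boxes and the bed box — can be combined as in the sketch below. The ratio threshold and the overlap test are hypothetical stand-ins for the patent's "predetermined condition" and "predetermined positional relationship".

```python
def boxes_overlap(a, b):
    # Axis-aligned boxes given as (x1, y1, x2, y2); True if they intersect.
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def is_lying(person_box, bed_box, skin_boxes, ratio_thresh=1.5):
    """Flag a lying posture from 2D detection boxes (hypothetical thresholds)."""
    x1, y1, x2, y2 = person_box
    # A wide, short person box suggests a horizontal (lying) posture.
    wide = (x2 - x1) / (y2 - y1) > ratio_thresh
    # All detected skin areas should fall on the bed.
    skin_on_bed = all(boxes_overlap(s, bed_box) for s in skin_boxes)
    return wide and skin_on_bed
```

Requiring both cues reduces false positives: a wide box alone could be a seated person viewed from the side, and skin on the bed alone could be a hand resting on it.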
  • Patent number: 12046231
    Abstract: A method for facilitating a multiparty conversation is disclosed. An electronic device using the method may facilitate a multiparty conversation by identifying participants of a conversation, localizing relative positions of the participants, detecting speeches of the conversation, matching one of the participants to each of the detected speeches according to the relative positions of the participants, counting participations of the matched participant in the conversation, identifying a passive subject from all the participants according to the participations of all the participants in the conversation, finding a topic of the conversation between the participants, and engaging the passive subject by addressing the passive subject and speaking a sentence related to the topic.
    Type: Grant
    Filed: August 5, 2021
    Date of Patent: July 23, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: David Ayllón Álvarez, Adam David King, Zhen Xiu, Huan Tan
  • Patent number: 12030191
    Abstract: A vision-guided picking and placing method for a mobile robot that has a manipulator having a hand and a camera, includes: receiving a command instruction that instructs the mobile robot to grasp a target item among at least one object; controlling the mobile robot to move to a determined location, controlling the manipulator to reach for the at least one object, and capturing one or more images of the at least one object using the camera; extracting visual feature data from the one or more images, matching the extracted visual feature data to preset feature data of the target item to identify the target item, and determining a grasping position and a grasping vector of the target item; and controlling the manipulator and the hand to grasp the target item according to the grasping position and the grasping vector, and placing the target item to a target position.
    Type: Grant
    Filed: October 28, 2021
    Date of Patent: July 9, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: Dan Shao, Yang Shen, Fei Long, Jiexin Cai, Huan Tan
  • Patent number: 12032377
    Abstract: Navigation of a mobility aid robot having a camera and gripping part(s) disposed toward different directions is disclosed. The mobility aid robot is navigated to approach a user by identifying a posture of the user through the camera, determining a mode of the robot according to a type of the specified task to be performed on the user and the identified posture of the user, controlling the robot to move according to a planned trajectory corresponding to the determined mode of the robot, and turning the robot upon reaching the desired pose such that the gripping part faces the user, in response to the determined mode of the robot corresponding to the specified task of an assisting type and the user at one of a standing posture and a sitting posture.
    Type: Grant
    Filed: May 25, 2021
    Date of Patent: July 9, 2024
    Assignee: UBKANG (QINGDAO) TECHNOLOGY CO., LTD.
    Inventors: Dejun Guo, Aravind Sreejith, Chuqiao Dong, Dan Shao, Zhen Xiu, Huan Tan
  • Patent number: D1038846
    Type: Grant
    Filed: July 14, 2023
    Date of Patent: August 13, 2024
    Assignees: UBTECH ROBOTICS CORP LTD, FUTRONICS (NA) CORPORATION
    Inventors: Sichao Zhong, Qianshan Li, Chengkun Zhang, Kun Xie, Huan Tan