Patents by Inventor Kaimeng Wang

Kaimeng Wang has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240109181
    Abstract: A technique for robotic grasp teaching by human demonstration. A human demonstrates a grasp on a workpiece while a camera provides images of the demonstration, which are analyzed to identify a hand pose relative to the workpiece. The hand pose is converted to a plane representing the two fingers of a gripper. The hand plane is used to determine a grasp region on the workpiece which corresponds to the human demonstration. The grasp region and the hand pose are used in an optimization computation which is run repeatedly with randomization to generate multiple grasps approximating the demonstration, where each of the optimized grasps is a stable, high-quality grasp with gripper-workpiece surface contact. The best of the generated grasps is then selected and added to a grasp database. The human demonstration may be repeated on different locations of the workpiece to provide multiple different grasps in the database.
    Type: Application
    Filed: September 23, 2022
    Publication date: April 4, 2024
    Inventors: Kaimeng Wang, Yongxiang Fan
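
A minimal, hypothetical sketch of the "optimize repeatedly with randomization, then keep the best" flow described in the abstract above. The quality metric, the perturbation model, and the function names (grasp_quality, refine_demonstrated_grasp) are illustrative assumptions, not the patented formulation.

```python
# Illustrative only: randomized refinement of a demonstrated grasp. Candidates are
# sampled around the demonstrated gripper pose, scored by a toy quality term plus
# closeness to the demonstration, and the best-scoring grasp is kept.
import numpy as np

def grasp_quality(position, axis, surface_points, surface_normals):
    """Toy quality metric: prefer grasps near the surface whose closing axis
    opposes the local surface normal (rough proxy for stable surface contact)."""
    d = np.linalg.norm(surface_points - position, axis=1)
    nearest = int(np.argmin(d))
    alignment = abs(np.dot(axis, surface_normals[nearest]))
    return alignment - 0.5 * d[nearest]

def refine_demonstrated_grasp(demo_position, demo_axis, surface_points,
                              surface_normals, n_samples=200, sigma=0.01, seed=None):
    rng = np.random.default_rng(seed)
    best_grasp, best_score = None, -np.inf
    for _ in range(n_samples):
        # Randomize around the human demonstration.
        position = demo_position + rng.normal(0.0, sigma, size=3)
        axis = demo_axis + rng.normal(0.0, 0.1, size=3)
        axis /= np.linalg.norm(axis)
        # Penalize drifting too far from what the human showed.
        score = (grasp_quality(position, axis, surface_points, surface_normals)
                 - np.linalg.norm(position - demo_position))
        if score > best_score:
            best_grasp, best_score = (position, axis), score
    return best_grasp, best_score
```
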
  • Patent number: 11813749
    Abstract: A method for teaching a robot to perform an operation based on human demonstration with images from a camera. The method includes a teaching phase where a 2D or 3D camera detects a human hand grasping and moving a workpiece, and images of the hand and workpiece are analyzed to determine a robot gripper pose and positions which equate to the pose and positions of the hand and corresponding pose and positions of the workpiece. Robot programming commands are then generated from the computed gripper pose and position relative to the workpiece pose and position. In a replay phase, the camera identifies workpiece pose and position, and the programming commands cause the robot to move the gripper to pick, move and place the workpiece as demonstrated. A teleoperation mode is also disclosed, where camera images of a human hand are used to control movement of the robot in real time.
    Type: Grant
    Filed: April 8, 2020
    Date of Patent: November 14, 2023
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Tetsuaki Kato
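
A hypothetical sketch, not FANUC's implementation, of how a gripper pose might be derived from detected hand keypoints and then replayed relative to a newly detected workpiece pose, following the teach/replay structure in the abstract above. The keypoint choice (thumb tip, index tip, wrist) and the function names are assumptions.

```python
# Illustrative only: gripper pose from hand keypoints, plus replay against the
# workpiece pose seen at execution time.
import numpy as np

def gripper_pose_from_hand(thumb_tip, index_tip, wrist):
    """Build a 4x4 gripper pose: origin between the fingertips, closing axis along
    thumb->index, approach axis pointing away from the wrist."""
    origin = 0.5 * (thumb_tip + index_tip)
    close_axis = index_tip - thumb_tip
    close_axis /= np.linalg.norm(close_axis)
    approach = origin - wrist
    approach -= close_axis * np.dot(approach, close_axis)   # orthogonalize
    approach /= np.linalg.norm(approach)
    normal = np.cross(approach, close_axis)                  # right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = close_axis, normal, approach, origin
    return T

def replay_pose(T_cam_gripper_demo, T_cam_workpiece_demo, T_cam_workpiece_now):
    """Keep the demonstrated gripper-to-workpiece relation, applied to the
    workpiece pose detected at replay time."""
    T_wp_gripper = np.linalg.inv(T_cam_workpiece_demo) @ T_cam_gripper_demo
    return T_cam_workpiece_now @ T_wp_gripper
```
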
  • Publication number: 20230294291
    Abstract: A method for line matching during image-based visual servoing control of a robot performing a workpiece installation. The method uses a target image from human demonstration and a current image from the robotic execution phase. A plurality of lines are identified in the target and current images, and an initial pairing of target-current lines is defined based on distance and angle. An optimization computation determines the image transforms which minimize a cost function formulated to include both direction and distance between target lines and current lines using 2D data in the camera image plane, together with constraint equations which relate the lines in the image plane to the 3D workpiece pose. The rotational and translational transforms which minimize the cost function are used to update the line pair matching, and the best line pairs are used to compute a difference signal for controlling robot motion during visual servoing.
    Type: Application
    Filed: March 15, 2022
    Publication date: September 21, 2023
    Inventors: Kaimeng Wang, Yongxiang Fan
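
A minimal, hypothetical sketch of the initial line-pairing step: 2D line segments from the target (demonstration) image and the current image are matched by a combined angle-plus-distance cost, which is the kind of pairing the abstract's optimization then refines. The full visual-servoing cost and the 3D constraint equations are not shown, and all names here (line_features, pair_lines) are assumptions.

```python
# Illustrative only: greedy one-to-one pairing of target and current line segments.
import numpy as np

def line_features(seg):
    """seg: (2, 2) array of endpoints. Returns (unit direction, midpoint)."""
    seg = np.asarray(seg, float)
    d = seg[1] - seg[0]
    return d / np.linalg.norm(d), seg.mean(axis=0)

def pair_lines(target_segs, current_segs, w_angle=1.0, w_dist=0.01):
    """Pair lines by a cost mixing direction mismatch (radians) and midpoint
    distance (pixels); lower cost means a more plausible target-current pair."""
    cost = np.zeros((len(target_segs), len(current_segs)))
    for i, t in enumerate(target_segs):
        dt, mt = line_features(t)
        for j, c in enumerate(current_segs):
            dc, mc = line_features(c)
            angle = np.arccos(np.clip(abs(np.dot(dt, dc)), 0.0, 1.0))
            cost[i, j] = w_angle * angle + w_dist * np.linalg.norm(mt - mc)
    pairs, used = [], set()
    for i in np.argsort(cost.min(axis=1)):            # most confident target lines first
        if len(used) == cost.shape[1]:                # no current lines left to assign
            break
        j = int(np.argmin([cost[i, k] if k not in used else np.inf
                           for k in range(cost.shape[1])]))
        pairs.append((int(i), j))
        used.add(j)
    return pairs
```
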
  • Patent number: 11712797
    Abstract: A method for dual hand detection in robot teaching from human demonstration. A camera image of the demonstrator's hands and workpieces is provided to a first neural network which determines the identity of the left and right hand of the human demonstrator from the image, and also provides cropped sub-images of the identified hands. The first neural network is trained using images in which the left and right hands are pre-identified. The cropped sub-images are then provided to a second neural network which detects the pose of both the left and right hand from the images, where the sub-image for the left hand is horizontally flipped before and after the hand pose detection if the second neural network is trained with right-hand images. The hand pose data is converted to robot gripper pose data and used for teaching a robot to perform an operation through human demonstration.
    Type: Grant
    Filed: September 11, 2020
    Date of Patent: August 1, 2023
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Tetsuaki Kato
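
A hypothetical sketch of the left-hand mirroring trick described in the abstract: if the keypoint network was trained only on right hands, the left-hand crop is flipped horizontally before inference and the predicted keypoints are flipped back afterwards. `detect_hands` and `pose_net` stand in for the two networks and are not real APIs.

```python
# Illustrative only: mirror left-hand crops around a right-hand-trained pose network.
import numpy as np

def keypoints_for_hand(crop, is_left, pose_net):
    """crop: (H, W, 3) image of one hand; returns (N, 2) keypoints in crop pixels."""
    img = np.flip(crop, axis=1) if is_left else crop        # mirror left hands
    keypoints = np.asarray(pose_net(img), dtype=float)       # (N, 2) x, y predictions
    if is_left:                                              # un-mirror the x coordinate
        keypoints[:, 0] = crop.shape[1] - 1 - keypoints[:, 0]
    return keypoints

def dual_hand_poses(image, detect_hands, pose_net):
    """detect_hands(image) -> list of (crop, is_left) from the first network;
    the second network (pose_net) is assumed right-hand-trained."""
    return [keypoints_for_hand(crop, is_left, pose_net)
            for crop, is_left in detect_hands(image)]
```
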
  • Publication number: 20230173660
    Abstract: A method for teaching and controlling a robot to perform an operation based on human demonstration with images from a camera. The method includes a demonstration phase where a camera detects a human hand grasping and moving a workpiece to define a rough trajectory of the robotic movement of the workpiece. Line features or other geometric features on the workpiece collected during the demonstration phase are used in an image-based visual servoing (IBVS) approach which refines a final placement position of the workpiece, where the IBVS control takes over the workpiece placement during the final approach by the robot. Moving object detection is used for automatically localizing both object and hand position in 2D image space, and then identifying line features on the workpiece by removing line features belonging to the hand using hand keypoint detection.
    Type: Application
    Filed: December 6, 2021
    Publication date: June 8, 2023
    Inventors: Kaimeng Wang, Tetsuaki Kato
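
A hypothetical sketch of the feature-cleanup step: line segments detected on the moving region are kept only if they lie far from every detected hand keypoint, so the lines that drive the image-based visual servoing belong to the workpiece rather than the demonstrator's hand. The upstream detectors and the distance threshold are assumptions.

```python
# Illustrative only: drop line segments near hand keypoints, then form an IBVS error.
import numpy as np

def workpiece_lines(segments, hand_keypoints, min_px=25.0):
    """segments: list of (2, 2) endpoint arrays; hand_keypoints: (K, 2) pixels.
    Drops any segment whose midpoint lies within min_px of a hand keypoint."""
    hand = np.asarray(hand_keypoints, float).reshape(-1, 2)
    keep = []
    for seg in segments:
        mid = np.asarray(seg, float).mean(axis=0)
        if hand.size == 0 or np.min(np.linalg.norm(hand - mid, axis=1)) > min_px:
            keep.append(seg)
    return keep

def ibvs_error(target_midpoints, current_midpoints):
    """Stacked image-space feature error used as the difference signal; a gain
    times this error would command the final-approach correction."""
    return (np.asarray(target_midpoints, float)
            - np.asarray(current_midpoints, float)).ravel()
```
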
  • Publication number: 20230120598
    Abstract: A method for teaching a robot to perform an operation based on human demonstration using force and vision sensors. The method uses a vision sensor to detect the position and pose of the human's hand, and optionally of a workpiece, during teaching of an operation such as pick, move and place. The force sensor, located either beneath the workpiece or on a tool, is used to detect force information. Data from the vision and force sensors, along with other optional inputs, are used to teach both the motions and the state change logic for the operation being taught. Several techniques are disclosed for determining state change logic, such as the transition from approaching to grasping. Techniques for improving motion programming by removing extraneous motions of the hand are also disclosed. Robot programming commands are then generated from the hand position and orientation data, along with the state transitions.
    Type: Application
    Filed: October 15, 2021
    Publication date: April 20, 2023
    Inventors: Kaimeng Wang, Tetsuaki Kato
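
A hypothetical sketch of the state-change logic: the taught skill is replayed as a small state machine, and the transition from "approach" to "grasp" (and later to "place") is triggered by the force signal crossing thresholds, as the abstract describes for contact events. The state names and thresholds are illustrative.

```python
# Illustrative only: force-driven state transitions for a pick/move/place skill.
APPROACH, GRASP, MOVE, PLACE, DONE = "approach", "grasp", "move", "place", "done"

def next_state(state, force_z, near_goal, contact_force=2.0, release_force=0.5):
    """force_z: measured normal force [N]; near_goal: True when the tracked gripper
    pose is within tolerance of the taught placement pose."""
    if state == APPROACH and force_z > contact_force:   # touched the workpiece
        return GRASP
    if state == GRASP:                                  # fingers closed, start moving
        return MOVE
    if state == MOVE and near_goal:
        return PLACE
    if state == PLACE and force_z < release_force:      # load transferred, release
        return DONE
    return state
```
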
  • Publication number: 20220080580
    Abstract: A method for dual hand detection in robot teaching from human demonstration. A camera image of the demonstrator's hands and workpieces is provided to a first neural network which determines the identity of the left and right hand of the human demonstrator from the image, and also provides cropped sub-images of the identified hands. The first neural network is trained using images in which the left and right hands are pre-identified. The cropped sub-images are then provided to a second neural network which detects the pose of both the left and right hand from the images, where the sub-image for the left hand is horizontally flipped before and after the hand pose detection if the second neural network is trained with right-hand images. The hand pose data is converted to robot gripper pose data and used for teaching a robot to perform an operation through human demonstration.
    Type: Application
    Filed: September 11, 2020
    Publication date: March 17, 2022
    Inventors: Kaimeng Wang, Tetsuaki Kato
  • Publication number: 20220080581
    Abstract: A method for dual arm robot teaching from dual hand detection in human demonstration. A camera image of the demonstrator's hands and workpieces is provided to a first neural network which determines the identity of the left and right hand from the image, and also provides cropped sub-images of the identified hands. The cropped sub-images are provided to a second neural network which detects the poses of both the left and right hand from the images. The dual hand pose data for an entire operation is converted to robot gripper pose data and used for teaching two robot arms to perform the operation on the workpieces, where each hand's motion is assigned to one robot arm. Edge detection from camera images may be used to refine robot motions in order to improve part localization for tasks requiring precision, such as inserting a part into an aperture.
    Type: Application
    Filed: October 15, 2021
    Publication date: March 17, 2022
    Inventors: Kaimeng Wang, Tetsuaki Kato
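
A hypothetical sketch of the hand-to-arm assignment in the abstract above: each demonstrated hand track is converted to gripper waypoints and routed to one robot arm, left hand to one arm and right hand to the other. `hand_to_gripper_pose` stands in for the pose conversion and is not a real API.

```python
# Illustrative only: split a dual-hand demonstration into one trajectory per robot arm.
def split_dual_arm_program(hand_tracks, hand_to_gripper_pose):
    """hand_tracks: list of dicts like {"is_left": bool, "poses": [hand_pose, ...]}.
    Returns {arm_index: [gripper_pose, ...]} with one trajectory per arm."""
    program = {0: [], 1: []}
    for track in hand_tracks:
        arm = 0 if track["is_left"] else 1              # left hand -> arm 0, right -> arm 1
        program[arm] = [hand_to_gripper_pose(p) for p in track["poses"]]
    return program
```
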
  • Patent number: 11207788
    Abstract: A hand control apparatus including: an extracting unit that extracts, from a storage unit storing and associating the shapes of plural types of objects with grip patterns, the grip pattern of the object whose shape is closest to the shape acquired by a shape acquiring unit; a position and posture calculating unit that calculates a gripping position and posture of the hand in accordance with the extracted grip pattern; a hand driving unit that causes the hand to grip the object based on the calculated gripping position and posture; a determining unit that determines whether the gripped state of the object is appropriate based on information acquired by at least one of the shape acquiring unit, a force sensor, and a tactile sensor; and a gripped state correcting unit that corrects at least one of the gripping position and the posture when the gripped state of the object is determined to be inappropriate.
    Type: Grant
    Filed: February 4, 2019
    Date of Patent: December 28, 2021
    Assignee: FANUC CORPORATION
    Inventors: Wenjie Chen, Tetsuaki Kato, Kaimeng Wang
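
A hypothetical sketch of the extraction step: the acquired object shape is reduced to a small descriptor, compared against the stored shapes, and the grip pattern associated with the closest stored shape is returned, mirroring the extracting-unit / storage-unit pairing in the abstract. The descriptor and data layout are assumptions.

```python
# Illustrative only: nearest-shape lookup of a stored grip pattern.
import numpy as np

def shape_descriptor(points):
    """points: (N, 3) surface points. A crude descriptor: bounding-box extents."""
    points = np.asarray(points, float)
    return points.max(axis=0) - points.min(axis=0)

def closest_grip_pattern(object_points, shape_store):
    """shape_store: list of (descriptor (3,), grip_pattern) pairs built offline."""
    desc = shape_descriptor(object_points)
    dists = [np.linalg.norm(desc - stored_desc) for stored_desc, _ in shape_store]
    _, pattern = shape_store[int(np.argmin(dists))]
    return pattern
```
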
  • Publication number: 20210316449
    Abstract: A method for teaching a robot to perform an operation based on human demonstration with images from a camera. The method includes a teaching phase where a 2D or 3D camera detects a human hand grasping and moving a workpiece, and images of the hand and workpiece are analyzed to determine a robot gripper pose and positions which equate to the pose and positions of the hand and corresponding pose and positions of the workpiece. Robot programming commands are then generated from the computed gripper pose and position relative to the workpiece pose and position. In a replay phase, the camera identifies workpiece pose and position, and the programming commands cause the robot to move the gripper to pick, move and place the workpiece as demonstrated. A teleoperation mode is also disclosed, where camera images of a human hand are used to control movement of the robot in real time.
    Type: Application
    Filed: April 8, 2020
    Publication date: October 14, 2021
    Inventors: Kaimeng Wang, Tetsuaki Kato
  • Patent number: 11130236
    Abstract: A robot movement teaching apparatus including a movement path extraction unit configured to process time-varying images of a first workpiece and fingers or arms of a human working on the first workpiece, and thereby extract a movement path of the fingers or arms of the human; a mapping generation unit configured to generate a transform function for transformation from the first workpiece to a second workpiece worked on by a robot, based on feature points of the first workpiece and feature points of the second workpiece; and a movement path generation unit configured to generate a movement path of the robot based on the movement path of the fingers or arms of the human extracted by the movement path extraction unit and based on the transform function generated by the mapping generation unit.
    Type: Grant
    Filed: March 6, 2019
    Date of Patent: September 28, 2021
    Assignee: FANUC CORPORATION
    Inventors: Wenjie Chen, Kaimeng Wang, Tetsuaki Kato
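
A hypothetical sketch of the mapping step: a rigid transform is fitted (Kabsch-style least squares) from feature points on the demonstrated workpiece to the matching feature points on the robot's workpiece, and the extracted hand path is pushed through that transform to obtain the robot path. The patent's transform function may be more general; this sketch assumes corresponding points and a rigid relation.

```python
# Illustrative only: fit a rigid transform between workpieces and remap the hand path.
import numpy as np

def fit_rigid_transform(src_pts, dst_pts):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t."""
    src, dst = np.asarray(src_pts, float), np.asarray(dst_pts, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                 # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def map_demonstrated_path(path_pts, src_features, dst_features):
    """Apply the fitted transform to every point of the extracted hand path."""
    R, t = fit_rigid_transform(src_features, dst_features)
    return (np.asarray(path_pts, float) @ R.T) + t
```
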
  • Patent number: 11000949
    Abstract: A control device includes a learning control part in which a difference is calculated between a target position and the actual position of a controlled portion detected by a sensor, and the operation-speed change rate is increased or reduced several times, within the maximum value set for increasing or reducing the operation speed of a robot mechanism unit and within the allowable vibration conditions at the portion to be controlled. Meanwhile, learning is repeated to calculate an updated compensation amount for suppressing vibrations at each operation-speed change rate, based on the difference and the previously calculated compensation amount, and the convergent compensation amount and convergent operation-speed change rate are stored after the compensation amount and the operation-speed change rate converge.
    Type: Grant
    Filed: February 21, 2018
    Date of Patent: May 11, 2021
    Assignee: FANUC CORPORATION
    Inventors: Shinichi Washizu, Hajime Suzuki, Kaimeng Wang
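
A heavily simplified, hypothetical sketch of the learning loop: after each trial the compensation is updated from the measured position error (a basic iterative-learning update), and the operation-speed change rate is nudged up while the measured vibration stays inside its allowance, with both values kept once they stop changing. `run_trial` stands in for executing the robot motion and returning (error, vibration) samples; the update law and limits are assumptions, not the patented algorithm.

```python
# Illustrative only: joint learning of a compensation amount and a speed change rate.
import numpy as np

def learn_compensation(run_trial, n_steps, max_rate=1.3, vib_limit=0.5,
                       gain=0.5, rate_step=0.05, max_iters=50, tol=1e-4):
    comp = np.zeros(n_steps)        # per-sample compensation amount
    rate = 1.0                      # operation-speed change rate (1.0 = taught speed)
    for _ in range(max_iters):
        error, vibration = run_trial(comp, rate)
        new_comp = comp + gain * np.asarray(error, float)       # suppress tracked error
        if np.max(np.abs(vibration)) < vib_limit and rate < max_rate:
            rate = min(max_rate, rate + rate_step)               # speed up while allowed
        converged = np.max(np.abs(new_comp - comp)) < tol
        comp = new_comp
        if converged:
            break
    return comp, rate               # convergent compensation and speed change rate
```
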
  • Patent number: 10994422
    Abstract: A robot system that performs desired processing on a processing target object using a processing tool. The robot system includes a robot having an arm tip that holds the processing tool, a position detector that detects the position of the arm tip, and a robot controller that controls the operation of the robot based on a position command and the position feedback detected by the position detector. The robot controller includes an adjustment operation creating unit that, during adjustment of the operation parameters for controlling the operation of the robot, acquires the application and operation area of the robot and automatically creates an adjustment operation corresponding to the acquired application and operation area, and a parameter adjustment unit that automatically adjusts the operation parameters during execution of the created adjustment operation so that the performance required for the application is satisfied.
    Type: Grant
    Filed: November 27, 2018
    Date of Patent: May 4, 2021
    Assignee: FANUC CORPORATION
    Inventors: Hajime Suzuki, Shuusuke Watanabe, Kaimeng Wang
  • Patent number: 10814485
    Abstract: A device that can prevent a decrease in the efficiency of a manufacturing line. The device includes a shape acquisition section for acquiring a shape of a workpiece; a motion pattern acquisition section for acquiring basic motion patterns including a reference workpiece shape, a reference working position in the reference workpiece shape, and a type of operation carried out at the reference working position; a similarity determination section for determining whether the shape of the workpiece is similar to the reference workpiece shape; a position determination section for determining, based on the shape of the workpiece and the reference workpiece shape, the working position on the workpiece that corresponds to the reference working position; and a motion-path generation section for generating a motion path by changing the reference working position to the determined working position.
    Type: Grant
    Filed: April 6, 2018
    Date of Patent: October 27, 2020
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Wenjie Chen, Kouichirou Hayashi
  • Patent number: 10737384
    Abstract: A robot system includes a light source, an image capture device, a robot mechanism unit having a target site of position control where the light source is provided, and a robot controller that controls the position of the robot mechanism unit based on a position command, a position feedback, and a position compensation value. The robot controller includes a path acquisition unit that makes the image capture device capture an image of light from the light source continuously during the predetermined operation to acquire a path of the light source from the image capture device, a positional error estimation unit that estimates positional error of the path of the light source from the position command based on the acquired path of the light source and the position command, and a compensation value generation unit that generates the position compensation value based on the estimated positional error.
    Type: Grant
    Filed: August 13, 2018
    Date of Patent: August 11, 2020
    Assignee: FANUC CORPORATION
    Inventors: Nobuaki Yamaoka, Hajime Suzuki, Kaimeng Wang
  • Patent number: 10646995
    Abstract: A control device repeats learning of: calculating an allowable condition for speed variations during a processing operation based on an allowable condition for processing error; setting an operating speed change rate used to increase or reduce the operating speed of a robot mechanism unit using the calculated allowable condition for speed variations; and, while increasing or reducing the operating speed change rate over a plurality of repetitions within a range not exceeding the maximum value of the operating speed change rate and within the range of an allowable condition for vibrations occurring in the control target, calculating a new correction amount based on the difference between the position of the control target detected by a sensor and the target position, and on the previously calculated correction amount.
    Type: Grant
    Filed: June 26, 2018
    Date of Patent: May 12, 2020
    Assignee: FANUC CORPORATION
    Inventors: Shinichi Washizu, Hajime Suzuki, Kaimeng Wang
  • Patent number: 10618164
    Abstract: A robot system is provided with a robot control device that includes an operation control unit and a learning control unit. The learning control unit performs learning control in which a vibration correction amount for correcting a vibration generated at a control target portion of the robot is calculated and then applied to the next operation command. The learning control unit includes a plurality of learning control parts for calculating the vibration correction amount and a selection unit that selects one of the plurality of learning control parts on the basis of operation information of the robot when the robot is operated by the operation program that is the target of the learning control.
    Type: Grant
    Filed: February 9, 2018
    Date of Patent: April 14, 2020
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Satoshi Inagaki, Wenjie Chen
  • Patent number: 10589429
    Abstract: Provided is a machine system including a machine including a movable part; a control device; a sensor detecting information about the movable part during a predetermined operation of the machine; a transmitting unit wirelessly transmitting the detected information during the predetermined operation; a receiving unit receiving the wirelessly transmitted information; a storage unit storing the received information; a detection unit detecting a loss in the received information; a command unit causing the machine to repeat the predetermined operation in a case where a loss in the information is detected; a determination unit determining whether or not every lost part of the information detected first is contained in the information detected during the repeated operation; and a complementing unit ending the repeated operation in a case where every lost part is determined to be contained and complementing the information detected first with the information detected during the repeated operation.
    Type: Grant
    Filed: January 26, 2018
    Date of Patent: March 17, 2020
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Nobuaki Yamaoka, Hajime Suzuki
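
A hypothetical sketch of the complementing logic: samples are keyed by index, missing indices in the first capture are detected, the operation is repeated until every missing index appears in a repeat, and the first capture is then filled in from that repeat, as described in the abstract. `run_operation` stands in for one wirelessly captured run and returns {sample_index: value} with possible gaps.

```python
# Illustrative only: detect lost samples, repeat the operation, and complement them.
def capture_with_complement(run_operation, n_samples, max_repeats=5):
    first = run_operation()                               # {index: value}, may have gaps
    lost = set(range(n_samples)) - set(first)
    for _ in range(max_repeats):
        if not lost:
            break
        repeat = run_operation()
        if lost <= set(repeat):                           # every lost part is contained
            first.update({i: repeat[i] for i in lost})    # complement the first capture
            lost = set()
    return first, lost                                    # lost is empty on success
```
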
  • Publication number: 20200023518
    Abstract: A robot system includes: a feature point position detection unit configured to detect, at a constant cycle, the position of a feature point of an obstacle that moves or deforms within the motion range of a robot; a movement path calculation unit configured to calculate a movement path of the robot before the robot moves; a mapping function derivation unit configured to derive a mapping function based on the positions of the feature point detected at successive time intervals; and a path adjustment unit configured to dynamically adjust the movement path of the robot using the derived mapping function.
    Type: Application
    Filed: July 5, 2019
    Publication date: January 23, 2020
    Inventors: Kaimeng Wang, Wenjie Chen, Tetsuaki Kato
  • Patent number: 10520912
    Abstract: A robot controller having a function that simplifies learning, and a robot control method. The robot controller includes: a learning section configured to carry out learning by detecting a deviation between a commanded trajectory, representing the position of the robot generated according to command values, and an operation trajectory, representing the actual position to which the robot has moved, and to generate a corrected program by adjusting the commanded trajectory; a saving section configured to save the corrected program; and a relearning section configured to carry out relearning on a relearning location, the relearning location being a part of the operation trajectory designated by an operator.
    Type: Grant
    Filed: November 17, 2017
    Date of Patent: December 31, 2019
    Assignee: FANUC CORPORATION
    Inventors: Kaimeng Wang, Shuusuke Watanabe
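
A hypothetical sketch of the correction and relearning steps: the deviation between the commanded and measured trajectories is folded back into the command, and a later relearning pass repeats that update only on the operator-designated slice of the trajectory, leaving the rest of the corrected program untouched. The simple additive update and the slice-based designation are assumptions.

```python
# Illustrative only: trajectory correction from measured deviation, plus partial relearning.
import numpy as np

def learn_correction(commanded, actual):
    """commanded, actual: (N, D) trajectories. Returns a corrected command that
    pre-compensates the observed deviation."""
    commanded = np.asarray(commanded, float)
    deviation = commanded - np.asarray(actual, float)
    return commanded + deviation

def relearn_segment(corrected, actual, start, stop):
    """Re-run the learning update only on samples start:stop (the relearning
    location designated by the operator)."""
    corrected = np.asarray(corrected, float).copy()
    actual = np.asarray(actual, float)
    corrected[start:stop] = learn_correction(corrected[start:stop], actual[start:stop])
    return corrected
```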