Patents by Inventor Yan-Yi DU

Yan-Yi DU has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11247340
Abstract: This disclosure is related to a non-contact tool center point calibration method for a robot arm, and the method comprises: obtaining a coordinate transformation relationship between a flange surface of the robot arm and cameras by a hand-eye calibration algorithm; constructing a space coordinate system by a stereoscopic reconstruction method; actuating a replaceable member fixed to the flange surface to sequentially present postures within the union field of view of the cameras, recording feature coordinates of the replaceable member in the space coordinate system, and recording the flange surface coordinates under those postures in the space coordinate system; obtaining a transformation relationship between a tool center point and the flange surface; and updating the transformation relationship into a control program of the robot arm. Moreover, the disclosure further discloses a calibration device performing the calibration method and a robot arm system having the calibration function.
    Type: Grant
    Filed: December 26, 2018
    Date of Patent: February 15, 2022
    Assignee: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Sheng Chieh Hsu, Hao Hsiang Yang, Shu Huang, Yan Yi Du
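    The core estimation step this abstract describes can be illustrated with a small least-squares sketch. This is a hedged illustration, not the patented method: the posture data is synthetic, and the function and variable names (estimate_tcp_offset, flange_poses, tip_points) are assumptions. Given the flange pose (R_i, o_i) recorded at each posture and the stereo-reconstructed tool-tip coordinate p_i in the same space coordinate system, a constant tool offset t in the flange frame satisfies p_i = R_i t + o_i, which stacks into one linear system:

    ```python
    import numpy as np

    def estimate_tcp_offset(flange_poses, tip_points):
        """Solve p_i = R_i @ t + o_i for the constant tool offset t
        (tool center point expressed in the flange frame) by stacking
        one 3x3 linear block per recorded posture and using least squares."""
        A = np.vstack([R for R, _ in flange_poses])
        b = np.concatenate([p - o for (_, o), p in zip(flange_poses, tip_points)])
        t, *_ = np.linalg.lstsq(A, b, rcond=None)
        return t

    def rot(axis, deg):
        """Rotation matrix about a coordinate axis, used here only to
        generate synthetic flange postures."""
        c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
        m = {"x": [[1, 0, 0], [0, c, -s], [0, s, c]],
             "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
             "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}
        return np.array(m[axis])

    # Synthetic check: three distinct postures recover a known offset.
    true_t = np.array([0.01, -0.02, 0.15])              # metres, flange frame
    poses = [(rot("x", 30), np.array([0.4, 0.0, 0.3])),
             (rot("y", 45), np.array([0.3, 0.1, 0.35])),
             (rot("z", 60), np.array([0.5, -0.1, 0.25]))]
    tips = [R @ true_t + o for R, o in poses]
    est = estimate_tcp_offset(poses, tips)
    print(est)
    ```

    At least two postures with distinct orientations are needed for the stacked matrix to have full column rank; using several postures, as the abstract describes, overdetermines the system and averages out measurement noise.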
  • Patent number: 10737387
Abstract: A robot arm calibration device is provided, which includes a light emitter, a light sensing module, a cooperative motion controller and a processing module. The light emitter is disposed on at least one robot arm to emit a light beam. The light sensing module is disposed on at least one other robot arm to receive the light beam, which is converted into a plurality of image data. The cooperative motion controller is configured to drive the light emitter and light sensing module on at least two robot arms to a corrected position and a position to be corrected, respectively. The processing module receives the image data and the motion parameters of the at least two robot arms to calculate an error value between the corrected position and the position to be corrected, and analyzes the image data to output a corrected motion parameter for modifying the motion command.
    Type: Grant
    Filed: March 26, 2018
    Date of Patent: August 11, 2020
    Assignee: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Yan-Yi Du, Cheng-Chuan Chao, Shu Huang, Hung-Hsiu Yu
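    One simple way such an error value could be derived from the sensed image is a centroid-offset calculation. This is a minimal sketch under assumptions, not the device's actual processing module: the parameter names (target_px, mm_per_px) and the single-pixel test spot are invented for illustration.

    ```python
    import numpy as np

    def spot_centroid(image):
        """Intensity-weighted centroid (row, col) of the sensed beam spot."""
        rows, cols = np.indices(image.shape)
        total = image.sum()
        return np.array([(image * rows).sum() / total,
                         (image * cols).sum() / total])

    def position_error(image, target_px, mm_per_px):
        """Error between the corrected position (where the beam should land,
        target_px) and the position to be corrected (where it actually lands),
        converted from pixels to millimetres via the sensor scale."""
        return (np.asarray(target_px, dtype=float) - spot_centroid(image)) * mm_per_px

    # Synthetic sensor frame: a single bright pixel 4 px right of centre.
    frame = np.zeros((32, 32))
    frame[16, 20] = 1.0
    err = position_error(frame, target_px=(16, 16), mm_per_px=0.05)
    print(err)   # row error 0.0 mm, col error -0.2 mm
    ```

    In a real device this per-frame error would be fed back, together with the arms' motion parameters, to produce the corrected motion parameter the abstract mentions.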
  • Publication number: 20200198145
Abstract: This disclosure is related to a non-contact tool center point calibration method for a robot arm, and the method comprises: obtaining a coordinate transformation relationship between a flange surface of the robot arm and cameras by a hand-eye calibration algorithm; constructing a space coordinate system by a stereoscopic reconstruction method; actuating a replaceable member fixed to the flange surface to sequentially present postures within the union field of view of the cameras, recording feature coordinates of the replaceable member in the space coordinate system, and recording the flange surface coordinates under those postures in the space coordinate system; obtaining a transformation relationship between a tool center point and the flange surface; and updating the transformation relationship into a control program of the robot arm. Moreover, the disclosure further discloses a calibration device performing the calibration method and a robot arm system having the calibration function.
    Type: Application
    Filed: December 26, 2018
    Publication date: June 25, 2020
    Applicant: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
    Inventors: Sheng Chieh HSU, Hao Hsiang YANG, Shu HUANG, Yan Yi DU
  • Publication number: 20190168385
Abstract: A robot arm calibration device is provided, which includes a light emitter, a light sensing module, a cooperative motion controller and a processing module. The light emitter is disposed on at least one robot arm to emit a light beam. The light sensing module is disposed on at least one other robot arm to receive the light beam, which is converted into a plurality of image data. The cooperative motion controller is configured to drive the light emitter and light sensing module on at least two robot arms to a corrected position and a position to be corrected, respectively. The processing module receives the image data and the motion parameters of the at least two robot arms to calculate an error value between the corrected position and the position to be corrected, and analyzes the image data to output a corrected motion parameter for modifying the motion command.
    Type: Application
    Filed: March 26, 2018
    Publication date: June 6, 2019
    Inventors: Yan-Yi DU, Cheng-Chuan CHAO, Shu HUANG, Hung-Hsiu YU