Patents by Inventor Li Yang Ku

Li Yang Ku has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230321821
    Abstract: A method for object grasping, including: determining features of a scene; determining candidate grasp locations; determining a set of candidate grasp proposals for the candidate grasp locations; optionally modifying a candidate grasp proposal of the set; determining grasp scores associated with the candidate grasp proposals; selecting a set of final grasp proposals based on the grasp scores; and executing a grasp proposal from the set of final grasp proposals.
    Type: Application
    Filed: December 29, 2022
    Publication date: October 12, 2023
    Inventors: Li Yang Ku, Michael Stark, Ahmad Humayun, Nan Rong, Bhaskara Mannar Marthi
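    A minimal Python sketch of the pipeline this abstract describes, for orientation only: every name, stub, and threshold below is a hypothetical stand-in, since the patent does not disclose a concrete implementation.

      from dataclasses import dataclass

      @dataclass
      class GraspProposal:
          location: tuple        # candidate grasp point in the scene
          angle: float           # gripper rotation about the approach axis
          score: float = 0.0     # grasp quality, filled in by the scorer

      def extract_features(scene):
          # Step 1: determine features of the scene (stubbed).
          return scene

      def candidate_locations(features):
          # Step 2: determine candidate grasp locations (stubbed).
          return [(0.10, 0.20), (0.35, 0.15)]

      def candidate_proposals(features, locations):
          # Step 3: determine a set of candidate grasp proposals per location.
          return [GraspProposal(loc, angle)
                  for loc in locations
                  for angle in (0.0, 1.57)]

      def refine(proposal):
          # Step 4 (optional): modify a candidate proposal, e.g. snap its angle.
          proposal.angle = round(proposal.angle, 1)
          return proposal

      def score(features, proposal):
          # Step 5: determine a grasp score (illustrative heuristic only).
          return -abs(proposal.angle - 1.6)

      def plan_and_execute(scene, num_final=3):
          features = extract_features(scene)
          proposals = [refine(p) for p in
                       candidate_proposals(features, candidate_locations(features))]
          for p in proposals:
              p.score = score(features, p)
          # Step 6: select final proposals by score; step 7: execute the best one.
          final = sorted(proposals, key=lambda p: p.score, reverse=True)[:num_final]
          return final[0]  # a real system would hand this to the robot controller

      print(plan_and_execute(scene=None))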
  • Publication number: 20230256601
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating grasps for objects within a container. One of the methods includes determining a set of grasp proposals with associated grasping windows, wherein each grasp proposal has a different respective position within a workspace. A respective set of waypoints is determined for each grasp proposal, each set comprising a pre-grasp pose and a grasp pose within the workspace. A final grasp proposal is then selected and used to control an end effector of a robot to grasp an object in the workspace, based on a calculated grasp trajectory associated with the selected grasp proposal.
    Type: Application
    Filed: February 17, 2023
    Publication date: August 17, 2023
    Inventors: Li Yang Ku, Farzad Niroui, Nan Rong, Michael Stark, Bhaskara Mannar Marthi
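    The waypoint construction in this abstract lends itself to a short sketch. The fixed vertical approach offset and all names below are assumptions; the patent claims the structure (a pre-grasp pose plus a grasp pose per proposal), not this code.

      from dataclasses import dataclass

      @dataclass
      class Pose:
          x: float
          y: float
          z: float

      @dataclass
      class GraspPlan:
          pre_grasp: Pose    # approach waypoint above the grasp pose
          grasp: Pose        # pose at which the gripper closes

      def waypoints_for(proposal, approach_offset=0.10):
          # Build the per-proposal waypoint set: a pre-grasp pose and a grasp pose.
          pre = Pose(proposal.x, proposal.y, proposal.z + approach_offset)
          return GraspPlan(pre_grasp=pre, grasp=proposal)

      def trajectory(plan):
          # The calculated grasp trajectory visits the pre-grasp pose, then the grasp pose.
          return [plan.pre_grasp, plan.grasp]

      # Example: waypoints for one grasp proposal inside a container.
      plan = waypoints_for(Pose(0.4, -0.1, 0.05))
      print(trajectory(plan))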
  • Patent number: 11541534
    Abstract: A method for object grasping, including: determining features of a scene; determining candidate grasp locations; determining a set of candidate grasp proposals for the candidate grasp locations; optionally modifying a candidate grasp proposal of the set; determining grasp scores associated with the candidate grasp proposals; selecting a set of final grasp proposals based on the grasp scores; and executing a grasp proposal from the set of final grasp proposals.
    Type: Grant
    Filed: July 14, 2021
    Date of Patent: January 3, 2023
    Assignee: Intrinsic Innovation LLC
    Inventors: Li Yang Ku, Michael Stark, Ahmad Humayun, Nan Rong, Bhaskara Marthi
  • Publication number: 20220152819
    Abstract: A method for object grasping can include: generating a set of keypoints for one or more detected objects in a scene; subdividing the set of keypoints into subsets, each corresponding to a subregion of a detected object; determining a graspability score for each subregion; determining a grasp location for each subregion; selecting a candidate grasp location; and optionally grasping an object using the candidate grasp location.
    Type: Application
    Filed: February 1, 2022
    Publication date: May 19, 2022
    Inventors: Li Yang Ku, Nan Rong, Bhaskara Mannar Marthi
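    As a rough illustration of the keypoint subdivision this abstract describes, the sketch below splits one object's keypoints into subregions and scores them; the geometry-based scoring rule is a hypothetical stand-in for the patented method.

      import statistics

      def subdivide(keypoints, num_regions=2):
          # Split a detected object's keypoints into subsets, one per subregion.
          pts = sorted(keypoints)
          step = max(1, len(pts) // num_regions)
          return [pts[i:i + step] for i in range(0, len(pts), step)]

      def graspability(subset):
          # Score a subregion; here, tighter keypoint clusters score higher.
          xs = [p[0] for p in subset]
          return 1.0 / (1.0 + statistics.pstdev(xs)) if len(xs) > 1 else 0.0

      def grasp_location(subset):
          # Pick a grasp location for the subregion, e.g. its centroid.
          n = len(subset)
          return (sum(p[0] for p in subset) / n, sum(p[1] for p in subset) / n)

      # Keypoints for one detected object (stubbed; a detector would supply these).
      keypoints = [(0.10, 0.20), (0.12, 0.22), (0.40, 0.20), (0.55, 0.25)]
      regions = subdivide(keypoints)
      best = max(regions, key=graspability)   # best-scoring subregion
      print(grasp_location(best))             # the candidate grasp location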
  • Patent number: 11273552
    Abstract: A method for object grasping can include: generating a set of keypoints for one or more detected objects in a scene; subdividing the set of keypoints into subsets, each corresponding to a subregion of a detected object; determining a graspability score for each subregion; determining a grasp location for each subregion; selecting a candidate grasp location; and optionally grasping an object using the candidate grasp location.
    Type: Grant
    Filed: July 14, 2021
    Date of Patent: March 15, 2022
    Assignee: Vicarious FPC, Inc.
    Inventors: Li Yang Ku, Nan Rong, Bhaskara Mannar Marthi
  • Publication number: 20220016765
    Abstract: A method for object grasping, including: determining features of a scene; determining candidate grasp locations; determining a set of candidate grasp proposals for the candidate grasp locations; optionally modifying a candidate grasp proposal of the set; determining grasp scores associated with the candidate grasp proposals; selecting a set of final grasp proposals based on the grasp scores; and executing a grasp proposal from the set of final grasp proposals.
    Type: Application
    Filed: July 14, 2021
    Publication date: January 20, 2022
    Inventors: Li Yang Ku, Michael Stark, Ahmad Humayun, Nan Rong, Bhaskara Marthi
  • Publication number: 20220016767
    Abstract: A method for object grasping can include: generating a set of keypoints for one or more detected objects in a scene; subdividing the set of keypoints into subsets, each corresponding to a subregion of a detected object; determining a graspability score for each subregion; determining a grasp location for each subregion; selecting a candidate grasp location; and optionally grasping an object using the candidate grasp location.
    Type: Application
    Filed: July 14, 2021
    Publication date: January 20, 2022
    Inventors: Li Yang Ku, Nan Rong, Bhaskara Mannar Marthi
  • Patent number: 9844881
    Abstract: A machine vision system for a controllable robotic device proximal to a workspace includes an image acquisition sensor arranged to periodically capture vision signal inputs, each including an image of a field of view covering the workspace. A controller is operatively coupled to the robotic device and includes a non-transitory memory component containing an executable vision perception routine. The vision perception routine includes a focus loop control routine that dynamically tracks a focus object in the workspace and a background loop control routine that monitors the background of the workspace. The focus loop control routine executes asynchronously and in parallel with the background loop control routine to determine a combined resultant including the focus object and the background, based upon the periodically captured vision signal inputs. The controller is operative to control the robotic device to manipulate the focus object based upon the focus loop control routine.
    Type: Grant
    Filed: June 22, 2015
    Date of Patent: December 19, 2017
    Assignee: GM Global Technology Operations LLC
    Inventors: David W. Payton, Kyungnam Kim, Zhichao Chen, Ryan M. Uhlenbrock, Li Yang Ku
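    The two-loop structure in this abstract maps naturally onto two threads running at different rates. The sketch below is an assumption about one possible realization; the patent claims the parallel focus/background structure, not a threading model.

      import threading, time

      state = {"focus": None, "background": None}
      lock = threading.Lock()

      def capture():
          # Stand-in for the image acquisition sensor's periodic vision input.
          return time.time()

      def focus_loop(stop):
          # Fast loop: dynamically tracks the focus object in the workspace.
          while not stop.is_set():
              frame = capture()
              with lock:
                  state["focus"] = frame      # updated focus-object estimate
              time.sleep(0.02)                # ~50 Hz, illustrative

      def background_loop(stop):
          # Slower, asynchronous loop: monitors the workspace background.
          while not stop.is_set():
              frame = capture()
              with lock:
                  state["background"] = frame  # updated background model
              time.sleep(0.2)                  # ~5 Hz, illustrative

      stop = threading.Event()
      for loop in (focus_loop, background_loop):
          threading.Thread(target=loop, args=(stop,), daemon=True).start()

      time.sleep(0.5)
      with lock:
          combined = dict(state)  # combined resultant: focus object + background
      stop.set()
      print(combined)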
  • Publication number: 20160368148
    Abstract: A machine vision system for a controllable robotic device proximal to a workspace includes an image acquisition sensor arranged to periodically capture vision signal inputs, each including an image of a field of view covering the workspace. A controller is operatively coupled to the robotic device and includes a non-transitory memory component containing an executable vision perception routine. The vision perception routine includes a focus loop control routine that dynamically tracks a focus object in the workspace and a background loop control routine that monitors the background of the workspace. The focus loop control routine executes asynchronously and in parallel with the background loop control routine to determine a combined resultant including the focus object and the background, based upon the periodically captured vision signal inputs. The controller is operative to control the robotic device to manipulate the focus object based upon the focus loop control routine.
    Type: Application
    Filed: June 22, 2015
    Publication date: December 22, 2016
    Applicant: GM GLOBAL TECHNOLOGY OPERATIONS LLC
    Inventors: David W. Payton, Kyungnam Kim, Zhichao Chen, Ryan M. Uhlenbrock, Li Yang Ku
  • Patent number: 9403273
    Abstract: A method of training a robot to autonomously execute a robotic task includes moving an end effector through multiple states of a predetermined robotic task to demonstrate the task to the robot in a set of n training demonstrations. The method includes measuring training data, including at least linear force and torque, via a force-torque sensor while moving the end effector through the multiple states. Key features are extracted from the training data, which is segmented into a time sequence of control primitives. Transitions between adjacent segments of the time sequence are identified. During autonomous execution of the same task, a controller detects the transitions and automatically switches between control modes. A robotic system includes a robot, a force-torque sensor, and a controller programmed to execute the method.
    Type: Grant
    Filed: May 23, 2014
    Date of Patent: August 2, 2016
    Assignee: GM Global Technology Operations LLC
    Inventors: David W. Payton, Ryan M. Uhlenbrock, Li Yang Ku
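    The segmentation step in this abstract can be illustrated in a few lines. The threshold rule below is a hypothetical stand-in for the patent's key-feature extraction; it simply labels force samples and finds the transitions at which a controller would switch control modes.

      def segment(forces, threshold=5.0):
          # Label each force sample as a 'contact' or 'free' primitive, then
          # find the transitions between adjacent segments.
          labels = ["contact" if abs(f) > threshold else "free" for f in forces]
          transitions = [i for i in range(1, len(labels))
                         if labels[i] != labels[i - 1]]
          return labels, transitions

      # Measured force magnitudes from one demonstration (stubbed values).
      forces = [0.2, 0.3, 6.1, 7.0, 6.5, 0.4, 0.1]
      labels, transitions = segment(forces)
      print(transitions)  # sample indices where control modes would switch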
  • Patent number: 9387589
    Abstract: A robotic system includes a robot, sensors that measure status information including the position and orientation of the robot and of an object within the workspace, and a controller. The controller, which visually debugs the operation of the robot, includes a simulator module, an action planning module, a marker generator module, and a graphical user interface (GUI). The simulator module receives the status information and, in response to marker commands, generates visual markers as graphical depictions of the object and robot. The action planning module selects the next action of the robot. The marker generator module generates and outputs the marker commands to the simulator module in response to the selected next action. The GUI receives and displays the visual markers, the selected future action, and input commands. Via the action planning module, the position and/or orientation of the visual markers are modified in real time to change the operation of the robot.
    Type: Grant
    Filed: February 25, 2014
    Date of Patent: July 12, 2016
    Assignee: GM Global Technology Operations LLC
    Inventors: Leandro G. Barajas, David W. Payton, Li Yang Ku, Ryan M. Uhlenbrock, Darren Earl
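    The module flow in this abstract (planner to marker generator to simulator to GUI) can be sketched as below. All class and function names are hypothetical; the patent describes modules and their data flow, not this API.

      from dataclasses import dataclass

      @dataclass
      class MarkerCommand:
          name: str
          position: tuple     # (x, y, z); editing this changes the robot's operation
          orientation: float

      def plan_next_action():
          # Action planning module: select the robot's next action (stubbed).
          return {"action": "pick", "target": (0.3, 0.1, 0.05)}

      def marker_commands_for(action):
          # Marker generator module: emit marker commands for the selected action.
          return [MarkerCommand("goal", action["target"], 0.0)]

      def render(markers):
          # Simulator module: produce graphical depictions for the GUI (stubbed).
          for m in markers:
              print(f"marker {m.name} at {m.position}, yaw {m.orientation}")

      # In the patented system the GUI would let an operator drag these markers,
      # feeding modified poses back to the planner to change the robot's motion.
      render(marker_commands_for(plan_next_action()))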
  • Publication number: 20150336268
    Abstract: A method of training a robot to autonomously execute a robotic task includes moving an end effector through multiple states of a predetermined robotic task to demonstrate the task to the robot in a set of n training demonstrations. The method includes measuring training data, including at least linear force and torque, via a force-torque sensor while moving the end effector through the multiple states. Key features are extracted from the training data, which is segmented into a time sequence of control primitives. Transitions between adjacent segments of the time sequence are identified. During autonomous execution of the same task, a controller detects the transitions and automatically switches between control modes. A robotic system includes a robot, a force-torque sensor, and a controller programmed to execute the method.
    Type: Application
    Filed: May 23, 2014
    Publication date: November 26, 2015
    Applicant: GM GLOBAL TECHNOLOGY OPERATIONS LLC
    Inventors: David W. Payton, Ryan M. Uhlenbrock, Li Yang Ku
  • Publication number: 20150239127
    Abstract: A robotic system includes a robot, sensors that measure status information including the position and orientation of the robot and of an object within the workspace, and a controller. The controller, which visually debugs the operation of the robot, includes a simulator module, an action planning module, a marker generator module, and a graphical user interface (GUI). The simulator module receives the status information and, in response to marker commands, generates visual markers as graphical depictions of the object and robot. The action planning module selects the next action of the robot. The marker generator module generates and outputs the marker commands to the simulator module in response to the selected next action. The GUI receives and displays the visual markers, the selected future action, and input commands. Via the action planning module, the position and/or orientation of the visual markers are modified in real time to change the operation of the robot.
    Type: Application
    Filed: February 25, 2014
    Publication date: August 27, 2015
    Applicant: GM GLOBAL TECHNOLOGY OPERATIONS LLC
    Inventors: Leandro G. Barajas, David W. Payton, Li Yang Ku, Ryan M. Uhlenbrock, Darren Earl