System and Method for Teaching a Robot to Mimic Precise Human Motions for Creation of Coffee and Beverages
A method for recording human motion performing a task with a tool and training a robot to perform the task, including the steps of performing tracker calibration without the robot present, performing teaching tool calibration with the robot present and placed at a predetermined location, and performing training without the robot present.
The present application is a United States National Stage (§ 371) application of PCT/US21/33727 filed May 21, 2021, which claims the benefit of U.S. Provisional Patent Application Ser. No. 63/028,109, filed on May 21, 2020, the contents of which are hereby incorporated by reference in their entirety.
FIELD OF THE INVENTION
The present invention relates to the field of teaching robots to mimic human motions through a system that can capture and track detailed human motion while carrying out a task, and precisely and accurately play back that exact motion on robotic arm manipulators.
BACKGROUND
Traditional methods of teaching robots to achieve certain tasks involve guiding or teleoperating the robot and programming it to repeat the exact motion. Recently, there has been progress in a method called Programming by Demonstration, or Imitation Learning, described in Billard, A., Calinon, S., Dillmann, R., & Schaal, S. (2008), "Survey: Robot Programming by Demonstration," Springer Handbook of Robotics, 59, that is capable of teaching a robot to perform a task through repeated demonstrations, generalizing the motion and reproducing it on the robot. This method is appealing because it allows the robot to perform the task without requiring the exact initial conditions, and the motion can be reproduced in different situations and contexts. For example, a picking task can be taught to the robot so that it can pick up items of different sizes and shapes. However, Imitation Learning has difficulty capturing the detailed human motion required to perform intricate tasks, such as working a portafilter with a specialty coffee machine, or pouring milk from a milk pitcher into a cup of coffee to create latte art.
BRIEF SUMMARY
The present disclosure is directed to a modular robotic coffee barista station that is configured to prepare espresso drinks, cold brew, iced coffee, and drip coffee using commercial coffee equipment. The present disclosure also focuses on capturing precise human motion in order to aid a robot in completing food and beverage making tasks.
In accordance with another aspect of the present disclosure, a robotic coffee preparation and serving station is provided that includes a six-axis robot arm controlled by executable software on a processor, a robot end-effector that can be controlled by the processor, and a motion capture system that can capture motion with respect to a tool (such as a milk pitcher) with high precision.
In accordance with another aspect of the present disclosure, a method of guiding a robot with an end-effector to perform an exact motion with a tool (such as a milk pitcher) according to precisely captured human motion is provided.
In accordance with another aspect of the present disclosure, a robotic coffee preparation and serving station is provided that includes a six-axis robot arm controlled by executable software on a processor, a robot end-effector that can be controlled by the processor, a motion capture system that can capture motion with respect to a tool (such as a milk pitcher) with high precision, and additional hardware (such as a tilting platform to hold a cup) that can be controlled by the processor.
In accordance with another aspect of the present disclosure, a method is provided of guiding a robot with an end-effector to perform an exact motion with a tool (such as a milk pitcher) according to precisely captured human motion, synchronized with additional hardware (such as a tilting platform to hold a cup).
The foregoing and other features and advantages of the present disclosure will be more readily appreciated as the same become better understood from the following detailed description when taken in conjunction with the accompanying drawings, wherein:
This invention describes a system that allows one to capture complex and intricate human motions, such that captured human motion can be reproduced by a robotic manipulator arm precisely and accurately. The system may include two major steps, calibration and learning from human demonstration, as shown in the accompanying drawings.
As shown in the accompanying drawings, the system includes motion capture devices and a robotic manipulator arm 2 capable of executing a trajectory of joint angle movements. Motion capture receiver 7 captures signals from motion tracking devices 8 and 9 and records motion trajectories in the motion capture coordinate system. Various motion capture systems may be used, including, but not limited to, VICON, HTC VIVE, MOCAP, and OptiTrack. The motion capture system must be able to provide sufficient accuracy (typically below 1 millimeter). This invention may also be used with various robotic arms. The robotic arm should be able to repeat motions with sufficient accuracy (typically below 1 millimeter), and should have encoders in, on, or near its joints so that the joint angles of its current state can be measured with sufficient accuracy (typically below 1 degree).
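By way of illustration, the coordinate-frame computations described below can be modeled with 4x4 homogeneous transforms. The following minimal Python sketch is an assumption of this description rather than part of the claimed system, and it defines two helpers reused in the later sketches:

```python
# Illustrative helpers for rigid-body poses as 4x4 homogeneous transforms.
# These are assumptions for the sketches below; the disclosure does not
# mandate any particular representation or library.
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix and a
    3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def invert_transform(T: np.ndarray) -> np.ndarray:
    """Invert a rigid transform: inv([R | t]) = [R.T | -R.T @ t]."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv
```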
In one aspect, the system of the present application employs the following methods to compute the transformation between the motion capture coordinate system and the robot's coordinate system, as illustrated in the accompanying drawings.
In the Tracker Calibration Process (P1), shown in the drawings, the robot is not present. A first tracking device is placed at a known location, and a second tracking device is placed in close proximity to the teaching tool 1, which rests at a known tool position and orientation. The motion capture receiver 7 receives location signals from both tracking devices, computes the relative position of the second tracking device from the two signals, and then computes the position and orientation of the second tracking device from that relative position and the known location of the first tracking device.
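A minimal sketch of how P1 could be computed under the homogeneous-transform convention above; the function name and argument layout are hypothetical, with the two measured poses assumed to come from motion capture receiver 7:

```python
def calibrate_tracker(T_world_ref: np.ndarray,
                      T_mocap_ref: np.ndarray,
                      T_mocap_tool: np.ndarray) -> np.ndarray:
    """P1 (illustrative): recover the tool tracker's pose in the world frame.

    T_world_ref  -- known world pose of the reference tracking device
    T_mocap_ref  -- reference tracker pose reported in the capture frame
    T_mocap_tool -- tool tracker pose reported in the capture frame
    """
    # Relative pose of the tool tracker with respect to the reference tracker
    # (the motion-capture frame cancels out).
    T_ref_tool = invert_transform(T_mocap_ref) @ T_mocap_tool
    # Chain with the known reference pose to obtain the world pose.
    return T_world_ref @ T_ref_tool
```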
In the Teach Tool Calibration Process (P2), shown in the drawings, the robot is present and placed at a predetermined location. The teaching tool 1 is placed at the same tool position and orientation, and the robot end-effector 3 grasps a portion of the tool. The position and orientation of the end-effector 3 are computed from the robot's forward kinematics, and the relative position and orientation difference between the end-effector 3 and the tool is computed from the pose of the second tracking device and the pose of the end-effector.
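Under the same assumptions, P2 reduces to composing the end-effector pose from forward kinematics with the tracker pose obtained in P1; again an illustrative sketch, not the claimed implementation:

```python
def calibrate_teach_tool(T_world_ee: np.ndarray,
                         T_world_tracker: np.ndarray) -> np.ndarray:
    """P2 (illustrative): fixed offset from end-effector 3 to tracking device 9,
    measured while the end-effector grasps the teaching tool.

    T_world_ee      -- end-effector pose from the robot's forward kinematics
    T_world_tracker -- tracker pose from the calibrated capture system (P1)
    """
    return invert_transform(T_world_ee) @ T_world_tracker
```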
Given the relative transformation between the tracking device 9 and the robot end-effector 3 when the teaching tool 1 is being grasped at grasping tool 15, the process to capture and reproduce precise human motions is as follows. A human demonstrator performs the task while handling the teaching tool 1, and the tracking device records the tool's motion trajectory in the motion capture coordinate system. The recorded trajectory is transformed to the world coordinate system using the calibrated position and orientation of the tool, and then to a motion trajectory of the end-effector using the relative position and orientation difference obtained during calibration.
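The playback step can then be sketched as mapping each recorded tracker pose to the end-effector pose that reproduces it (illustrative; names are hypothetical):

```python
def tracker_trajectory_to_ee(trajectory_world_tracker: list,
                             T_ee_tracker: np.ndarray) -> list:
    """P3 playback (illustrative): convert each recorded tracker pose in the
    world frame into the end-effector pose that reproduces it, using the
    fixed offset T_ee_tracker obtained in P2."""
    T_tracker_ee = invert_transform(T_ee_tracker)
    # T_world_ee = T_world_tracker @ inv(T_ee_tracker)
    return [T_world_tracker @ T_tracker_ee
            for T_world_tracker in trajectory_world_tracker]
```

Each resulting end-effector pose would then be converted to joint angles by the robot's inverse kinematics before being executed on the robotic arm 2.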
In one aspect, either the Tracker Calibration Process (P1) or the Teach Tool Calibration Process (P2) may be performed first, and each need only be performed once per teaching tool. Once those processes are completed, the system may proceed to the Learning from Human Process (P3), shown in the drawings.
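Tying the sketches together, the overall order of the three processes could look like the following, with all inputs as hypothetical placeholders:

```python
def learn_from_human(T_world_ref, T_mocap_ref, T_mocap_tool,
                     T_world_ee_at_grasp, recorded_tracker_poses):
    """Illustrative end-to-end flow: P1 and P2 run once per teaching tool
    (in either order); P3 may then repeat per human demonstration."""
    T_world_tracker = calibrate_tracker(T_world_ref, T_mocap_ref, T_mocap_tool)  # P1
    T_ee_tracker = calibrate_teach_tool(T_world_ee_at_grasp, T_world_tracker)    # P2
    return tracker_trajectory_to_ee(recorded_tracker_poses, T_ee_tracker)        # P3
```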
The system may be used with other customized hardware to teach the robot to perform specific tasks. One such example, shown in the drawings, is a tilting platform 10 that holds a cup 14 and tilts it in synchronization with the motion of the robotic arm 2, for example while milk is poured from the milk pitcher 1 to create latte art.
Traditionally, such complex motion would require two robotic arms, one to hold the cup 14 and one to hold the milk pitcher 1. This invention allows a more cost-effective system that achieves the same level of accuracy and synchronization between two moving platforms, i.e., the tilting platform 10 and the robotic arm 2.
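One way to achieve this synchronization, sketched below under assumed callback names and trajectory formats (real control interfaces are robot- and hardware-specific), is to replay both recorded command channels against a shared clock:

```python
# Hypothetical synchronized replay of the tilting platform 10 and the robotic
# arm 2: both recorded command channels are replayed against a shared clock so
# that the cup tilt stays aligned with the pitcher motion.
import time

def replay_synchronized(arm_traj, platform_traj, send_arm_pose, send_platform_angle):
    """arm_traj / platform_traj: lists of (timestamp_seconds, command) pairs
    recorded against a common time zero during the human demonstration."""
    events = sorted(
        [(t, "arm", cmd) for t, cmd in arm_traj] +
        [(t, "platform", cmd) for t, cmd in platform_traj],
        key=lambda e: e[0],
    )
    start = time.monotonic()
    for t, channel, cmd in events:
        # Wait until the event's timestamp relative to replay start.
        delay = t - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        (send_arm_pose if channel == "arm" else send_platform_angle)(cmd)
```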
As desired, embodiments of the disclosure may include systems with more or fewer components than are illustrated in the drawings. Additionally, certain components of the systems may be combined in various embodiments of the disclosure. The systems described above are provided by way of example only.
The above description presents the best mode contemplated for carrying out the present embodiments, and of the manner and process of practicing them, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which they pertain to practice these embodiments. The present embodiments are, however, susceptible to modifications and alternate constructions from those discussed above that are fully equivalent. Consequently, the present invention is not limited to the particular embodiments disclosed. On the contrary, the present invention covers all modifications and alternate constructions coming within the spirit and scope of the present disclosure. For example, the steps in the processes described herein need not be performed in the same order as they have been presented, and may be performed in any order(s). Further, steps that have been presented as being performed separately may in alternative embodiments be performed concurrently. Likewise, steps that have been presented as being performed concurrently may in alternative embodiments be performed separately.
Claims
1. A method for recording human motion performing a task with a tool and training a robot to perform the task, comprising the steps of:
- a. performing tracker calibration without the robot present, comprising the steps of:
  - i. receiving a first location signal from a first tracking device at a known location;
  - ii. receiving a second location signal from a second tracking device in close proximity to the tool, where the tool is placed at a tool position and orientation at a tool location;
  - iii. computing a relative position of the second tracking device based on the first location signal and the second location signal; and
  - iv. computing a position and orientation of the second tracking device based on the relative position and the known location of the first tracking device;
- b. performing teaching tool calibration with the robot present and placed at a predetermined location, comprising the steps of:
  - i. placing the tool at the tool position and orientation at the tool location;
  - ii. grasping a portion of the tool with an end-effector of the robot;
  - iii. computing a position and orientation of the end-effector based on forward kinematics; and
  - iv. computing a relative position and orientation difference between the end-effector and the tool based on the position and orientation of the second tracking device and the position and orientation of the end-effector;
- c. performing training without the robot present, comprising the steps of:
  - i. placing the tool, with a third tracking device affixed thereto, at a predetermined position and orientation;
  - ii. receiving location information from the third tracking device and recording a motion trajectory of the tool;
  - iii. transforming the recorded motion trajectory of the tool to a world coordinate system using the position and orientation of the tool; and
  - iv. transforming the recorded motion trajectory of the tool to a motion trajectory of the end-effector using the relative position and orientation difference.
2. The method of claim 1, where the second tracking device is the third tracking device.
3. The method of claim 1, wherein the motion trajectory of the tool is based on a human performing the task while handling the tool.
4. The method of claim 1, further comprising receiving and recording input from a human to move a platform capable of holding an open top liquid container.
Type: Application
Filed: May 21, 2021
Publication Date: Sep 5, 2024
Inventors: LIU SHUO (Kirkland, WA), XUCHU DING (Seattle, WA), MENG WANG (Seattle, WA), YUSHAN CHEN (Seattle, WA), WENBO YANG (Seattle, WA)
Application Number: 18/273,912