Systems and methods for providing dynamic robotic control systems
An articulated arm system is disclosed that includes an articulated arm including an end effector, and a robotic arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.
The present application is a continuation of U.S. patent application Ser. No. 15/254,592, filed Sep. 1, 2016, which claims priority to U.S. Provisional Patent Application Ser. No. 62/212,697 filed Sep. 1, 2015 and U.S. Provisional Patent Application Ser. No. 62/221,976 filed Sep. 22, 2015, the disclosures of which are herein incorporated by reference in their entireties.
BACKGROUND
The invention generally relates to robotics, and relates in particular to robotic control systems that are designed to accommodate a wide variety of unexpected conditions and loads.
Most industrial robotic systems operate in a top-down manner, generally as follows: a controller samples a variety of sensors, and then logic on that same controller computes whether or not to take action. The benefit of this logic flow (usually referred to as “polling”) is that all of the control logic is in the same place. The disadvantage is that in practical robotic systems, the signals are often sampled quite slowly. Also, all sensors must be wired to the control cabinet, leading to long and error-prone cable runs.
A specific example of this traditional architecture would generally be implemented by a legacy robot supplier such as ABB Robotics, Inc. of Auburn Hills, Mich., Kuka Roboter GmbH of Germany, Fanuc America Corporation of Rochester Hills, Mich., or one of their top-tier integrators. All of these suppliers generally encourage the same architecture, and have similar form factors. For example, a welding cell used in an automotive facility might have an ABB IRC5 control cabinet, an ABB IRB2600 1.85 m reach 6 degree of freedom robot, a Miller GMAW welding unit wired over an industrial bus (DeviceNet/CANbus) to the IRC5, and an end-of-arm tooling package mounting a GMAW torch (e.g., a Tregaskiss Tough Gun). All programming is done on the IRC5; the end effector has no knowledge of the world, and events such as crashes can only be observed or prevented on the IRC5, which is itself quite limited.
Again, in such systems, however, the signals are often sampled relatively slowly and sensors must generally be wired to the control cabinet. There remains a need, therefore, for a robotic control system that is able to efficiently and reliably provide dynamic control and responsiveness to conditions in the environment of the robot.
SUMMARY
In accordance with an embodiment, the invention provides an articulated arm system that includes an articulated arm including an end effector, and a robotic arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.
In accordance with another embodiment, the invention provides an articulated arm system including an articulated arm including an end effector, and an articulated arm control system including at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a control signal to the main controller.
In accordance with another embodiment, the invention provides a method of providing a control signal to an end effector of an articulated arm. The method includes the steps of providing a main control signal from a main controller to the end effector of the articulated arm, receiving a sensor input signal from at least one sensor positioned proximate the end effector, and at least partially modifying the main control signal responsive to the sensor input signal.
In accordance with a further embodiment, the invention provides a method of providing a control signal to an end effector of an articulated arm. The method includes the steps of providing a main control signal from a main controller to the end effector of the articulated arm, receiving a sensor input signal from a sensor positioned proximate the end effector, and overriding the main control signal responsive to the sensor input signal.
The following description may be further understood with reference to the accompanying drawings in which:
The drawings are shown for illustrative purposes only.
DETAILED DESCRIPTION
In accordance with an embodiment, the invention provides an architecture for robotic end effectors that allows the end effector to alter the state of the robot. In accordance with certain embodiments, the end effector may observe the environment at a very high frequency and compare local sensor data and observations to a set of formulas or trigger events. This allows for robot-agnostic low latency motion primitive routines, such as for example move until suction and move until force, without requiring the full response time of the robotic main controller. A robotic end effector is therefore provided that can alter the state of the robot, and further that may be modified during run time based on a variety of control policies. In accordance with further embodiments, the invention provides a multifaceted gripper design strategy that has been developed for multimodal gripping without tool changers.
A majority of industrial robotic systems execute their programming logic control in one place only—in the robot controller. The robot controller in these systems is often a large legacy controller with an obscure (and sometimes poorly featured) programming language. In contrast, the majority of modern and emerging robotic systems contain logic distributed between a robot controller and several workstation computers running a modern operating system and software stack, such as the Ubuntu operating system as sold by Canonical Ltd. of Isle of Man, the Linux operating system as provided by The Linux Foundation of San Francisco, Calif. and the ROS robotic operating environment as provided by Open Source Robotics Foundation of San Francisco, Calif.
A positive aspect of these architectures is that they provide tremendous, even arbitrary, amounts of computing power that may be directed towards problems like motion planning, localization, computer vision, etc. The downsides of this architecture are primarily that going through high-level middleware such as ROS adds significant latency, and evaluating a control policy in a loop may see round trip times of well over 100 ms.
As a unifying solution for this problem, a gripper control system has been developed with onboard electronics, sensors, and actuators to which high level logic controlling the system uploads a set of ‘triggers’ at runtime. These are control policies, such as stop the robot when a force above X Newtons is observed, or when an object is observed by the depth sensor, slow down the trajectory. The end effector may then evaluate the policy natively at the kHz level, and trigger actions in situations where the gripper should take an action.
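For illustration only, the runtime-uploaded trigger mechanism described above might be sketched as follows; the trigger names, sensor fields, and threshold values are hypothetical, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Trigger:
    """A control policy uploaded to the gripper at runtime."""
    name: str
    condition: Callable[[Dict[str, float]], bool]  # predicate over one sensor sample
    action: str  # request sent toward the robot, e.g. "STOP" or "SLOW"

def evaluate_triggers(sample: Dict[str, float], triggers: List[Trigger]) -> List[str]:
    """One pass of the native (kHz-rate) loop: return the actions whose
    conditions hold for this sensor sample."""
    return [t.action for t in triggers if t.condition(sample)]

# Two policies mirroring the examples in the text.
triggers = [
    Trigger("force_limit", lambda s: s["force_n"] > 25.0, "STOP"),
    Trigger("obstacle_near", lambda s: s["depth_m"] < 0.10, "SLOW"),
]
```

Because the predicates are plain data held on the end effector, the high-level logic can upload a new list at any time without touching the loop itself.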
In accordance with an embodiment, the invention provides an articulated arm control system that includes an articulated arm with an end effector, at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a control signal to the main controller.
This solution conveys several tremendous advantages: First, one may add the advanced behaviors one generates to any robot, as long as the robot complies with a relatively simple API. Second, one may avoid long cable runs for delicate signals, from the end effector to the robot control box (which is often mounted some distance away from a work cell). Third, one may respond to changes in the environment at the speed of a native control loop, often thousands of times faster than going exclusively through high level logic and middleware. Fourth, one may alter these policies at runtime, switching from move until suction to stop on loss of suction, as well as chaining policies.
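A minimal sketch of the fourth advantage, swapping and chaining policies at runtime, might look like the following; the policy names echo the text, while the manager class itself is an illustrative assumption:

```python
class GripperPolicyManager:
    """Illustrative holder for the active trigger set; high-level logic
    swaps or chains policies at runtime while the native loop keeps running."""
    def __init__(self):
        self.active = []

    def set_policy(self, *names):
        self.active = list(names)   # replace the whole policy set

    def chain(self, name):
        self.active.append(name)    # add a policy alongside the current ones

mgr = GripperPolicyManager()
mgr.set_policy("move_until_suction")        # approach phase
mgr.set_policy("stop_on_loss_of_suction")   # suction achieved: swap policies
mgr.chain("slow_on_high_force")             # chain a second policy
```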
In accordance with a further embodiment, the invention provides a method of altering or overriding a control signal from a main controller to an end effector.
The electronics 2, however, are also coupled to input sensors including pressure sensors 50, 52 and 54, a camera 56, force/torque sensors 58, 60, deflection/deformation sensor 62 and flow sensor 63. These sensors are coupled to an on-board controller 64 that determines whether to send an interrupt signal to the main robotic controller, and determines whether to immediately take action by overriding any of the output signals to motors M1-M3 and the vacuum. This is achieved by having the on-board controller 64 be coupled to control junctions 66, 68, 70 and 72 in the control paths of the signals 42, 44, 46 and 48.
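The control-junction behavior, where the on-board controller can override an output signal on its way to a motor or the vacuum, can be illustrated with a small sketch (the function and signal values are hypothetical):

```python
def junction_output(main_signal, override=None):
    """Model of a control junction (e.g., 66, 68, 70, 72): the main
    controller's signal passes through unless the on-board controller
    asserts an override value."""
    return main_signal if override is None else override

motor_cmd = junction_output(0.8)        # normal: main controller drives the motor
halted_cmd = junction_output(0.8, 0.0)  # on-board controller cuts the output
```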
The robot, for example, may be working in very cluttered, dynamic environments. In order to manipulate objects in these conditions, one needs much more sensing than a typical, more structured, open-loop robotic system would need. The grippers are therefore instrumented with absolute pressure sensors, a 3D RGBD camera, a force-torque sensor, and suction cup deflection sensing. By sensing and processing the sensor data directly at the wrist via a microcontroller, hardware interrupts may be set (via digital inputs) immediately (at hundreds or thousands of Hz). The alternative approach of communicating the sensor data back to the main robotic controller for analysis carries much more overhead and would be significantly slower. Processing at the wrist allows one to modify robot motion/execution significantly faster, which in turn allows one to move the robot significantly faster, adapting at speeds not possible otherwise. In these dynamic and unpredictable environments, adapting and recovering quickly is vitally important.
The pressure sensors, for example, may provide binary gripping/not gripping determinations, and threshold comparisons (>grip pressure, <required retract pressure, <drop pressure). The pressure sensors may also map material properties/selected grasps to expected pressure readings and, in real time, modify trajectory execution (speeds, constraints) in order to ensure successful transportation. The pressure sensors may also provide real-time monitoring of upstream pressure (pressure from the source) to ensure that the expected air pressure is available, and modify expected suction measurements from downstream accordingly.
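A sketch of the threshold comparisons described above, under the assumption that a stronger vacuum yields a lower absolute pressure reading; the threshold values are hypothetical, not from the disclosure:

```python
def grip_state(abs_pressure_kpa, grip_kpa=60.0, retract_kpa=50.0, drop_kpa=90.0):
    """Binary gripping plus threshold comparisons from one absolute
    pressure reading. Assumed convention: a stronger vacuum gives a lower
    absolute reading; all threshold values here are illustrative."""
    return {
        "gripping": abs_pressure_kpa < grip_kpa,        # seal formed
        "ok_to_retract": abs_pressure_kpa < retract_kpa, # strong enough to lift
        "drop_imminent": abs_pressure_kpa > drop_kpa,    # seal failing
    }
```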
The camera may be an RGBD camera that provides data for environment registration, for automated localization of expected environment components (conveyor, out shelves, out-bin stack) to remove hand tuning, and for detecting expected and unexpected objects or obstacles in the environment so that trajectory execution may be modified accordingly.
The force-torque sensors may provide impulse interrupts. When an unusual or unexpected force or torque is encountered, trajectory execution can be stopped and recovery begun, whereas the robot would previously have continued its motion in collision with that object, causing damage to the object or the robot. The force-torque sensors may also provide mass/COM estimates, such as model-free mass estimates that may inform trajectory execution to slow down, as one may be dealing with higher mass and inertias at the endpoint, which are more likely to be dropped due to torquing off. Model-based mass estimates may also be used to ensure quality of grasp above the COM, to make sure that the correct item is grasped, that the item is singulated, and that the item is not damaged (unexpected mass).
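The model-free mass estimate can be illustrated as follows; the inverse-mass speed scaling is an illustrative choice, not the disclosed control law:

```python
G = 9.81  # gravitational acceleration, m/s^2

def estimate_mass_kg(fz_newtons):
    """Model-free mass estimate: vertical force measured at the wrist
    force-torque sensor while the object hangs still, divided by g."""
    return fz_newtons / G

def speed_scale(mass_kg, nominal_kg=1.0):
    """Slow trajectory execution for payloads above a nominal mass, since
    higher endpoint inertias are more likely to torque off and drop."""
    return 1.0 if mass_kg <= nominal_kg else nominal_kg / mass_kg
```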
The deflection/deformation sensor may observe suction cup contact with the environment (typically when one wants to interrupt motion) as the bellows are deflected, before the pressure readings have changed and before a noticeable force impulse has appeared. The deflection sensor at its simplest will be used for interrupting motion to avoid robot Force Protective Stops, by providing the earliest measurement of contact. The deflection/deformation sensor may also measure the floppiness of a pick, which allows one in real time to again modify trajectory execution, slowing down or constraining the motions to ensure successful transport, or returning the item to the bin if the floppiness is beyond the threshold at which the item may be safely transported.
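A sketch of the floppiness-based transport decision described above; the floppiness scale and both thresholds are hypothetical:

```python
def transport_decision(floppiness, slow_at=0.3, abort_at=0.7):
    """Map a measured pick floppiness (0 = rigid; scale and thresholds
    are illustrative) to a real-time trajectory decision."""
    if floppiness >= abort_at:
        return "RETURN_TO_BIN"       # beyond the safe-transport threshold
    if floppiness >= slow_at:
        return "SLOW_AND_CONSTRAIN"  # transportable, but slow the motion
    return "NOMINAL"
```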
The flow sensors may detect changes in the amount of airflow as compared to expected air flow values or changes. For example, upon grasping an object, it is expected that the airflow would decrease. Once an object is grasped and is being carried or just held, a sudden increase in air flow may indicate that the grasp has been compromised or that the object has been dropped. The monitoring of weight in combination with air flow may also be employed, particularly when using high flow vacuum systems.
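The flow-based grasp monitoring might be sketched as follows; the relative-change threshold and event names are illustrative assumptions:

```python
def flow_event(prev_flow, curr_flow, grasped, rise_thresh=0.25):
    """Classify relative air-flow changes: flow drops when an object is
    grasped; a sudden rise while carrying suggests the grasp is
    compromised or the object was dropped. The 25% threshold is illustrative."""
    if prev_flow <= 0:
        return None
    change = (curr_flow - prev_flow) / prev_flow
    if grasped and change > rise_thresh:
        return "GRASP_COMPROMISED"
    if not grasped and change < -rise_thresh:
        return "OBJECT_ACQUIRED"
    return None
```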
With reference to the drawings, if the system determines that the object should be picked up (step 608), the system will then lift the object (step 616) and then read the sensors (step 618). If the orientation of the end effector needs to be adjusted, the system adjusts the orientation of the end effector (step 620), for example to cause a heavy object to be held in tension (vertically) by the end effector as opposed to a combination of a vertical and horizontal grasp that would cause a shear force to be applied. In another example, the system may choose to hold a lighter object with a combination of a vertical and horizontal grasp to accommodate a high speed rotation movement, so that when the object is being moved, a centrifugal force will be applied in the direction aligned with the grasp of the object. Once the orientation of the end effector is chosen (step 620), the system will choose a trajectory path (step 622), and then begin execution of the trajectory, e.g., the batch program N (step 624).
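The orientation choice in this passage can be sketched as a simple decision rule; the mass threshold and return values are hypothetical:

```python
def choose_grasp_orientation(mass_kg, high_speed_rotation, heavy_kg=1.0):
    """Heavy objects are held purely in vertical tension to avoid shear;
    light objects moved through high-speed rotations get a combined
    vertical/horizontal grasp so centrifugal force aligns with the grasp.
    The mass threshold is illustrative."""
    if mass_kg > heavy_kg:
        return "VERTICAL_TENSION"
    if high_speed_rotation:
        return "COMBINED_VERTICAL_HORIZONTAL"
    return "DEFAULT"
```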
With reference to the drawings, in accordance with another embodiment, the invention provides an articulated arm control system that includes an articulated arm with an end effector, at least one sensor for sensing at least one of the position, movement or acceleration of the articulated arm, a main controller for providing computational control of the articulated arm, and an on-board controller for providing, responsive to the at least one sensor, a motion signal that directly controls at least a portion of the articulated arm.
A unique contribution of the articulated arm is its multiple facets for multimodal gripping, e.g., having multiple grippers packaged on a single end effector in such a way that the robot can use different grippers by orienting the end effector differently. These facets can be used individually or in combination. A more common approach is a tool changer, which swaps a single tool out for a different one on a rack. The multimodal gripping of the present invention reduces cycle time significantly compared to tool changers, and also allows multiple aspects of a single end effector to be combined to pick up unique objects.
The gripper designs in the above embodiments that involve the use of up to three vacuum cups may be designed specifically for picking items below a certain weight, such as 2.2 lbs., out of a clutter of objects, and for grasping and manipulating the bins in which the objects are provided.
The same approach to instrumentation of a vacuum grasping end effector may be applied to any arbitrary configuration of vacuum cups as well. For example, if the robotic system needs to handle boxes such as might be used for shipping, then arbitrary N×M arrangements of the suction cells may be created to handle the weight ranges of such packages.
A 3×3 array may, for example, handle packages of up to 19.8 pounds, and a 6×6 array may handle up to 79.2 pounds. Such scaling of end effector sections may be made arbitrarily large, and of arbitrary shapes (if, for example, the known objects to be handled are of a particular shape as opposed to generally square/rectangular).
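The capacities in these figures are consistent with a 2.2 lb rating per cell (9 × 2.2 = 19.8 and 36 × 2.2 = 79.2), which can be captured in a short sketch:

```python
LB_PER_CELL = 2.2  # per-cell capacity implied by the figures in the text

def array_capacity_lbs(n, m):
    """Capacity of an N x M suction-cell array, scaling linearly with
    the number of cells."""
    return n * m * LB_PER_CELL
```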
It is significant that by extrapolating the standard vacuum cell to arbitrary sizes/shapes, such an instrumented end effector may be designed for any given object or class of objects while retaining all the benefits of the instrumentation described in the above embodiments.
Those skilled in the art will appreciate that numerous variations and modifications may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.
Claims
1. An articulated arm system comprising:
- an articulated arm including a vacuum end effector; and
- an articulated arm control system including: at least one sensor for sensing a state of the vacuum end effector; a main controller remote from the articulated arm and configured to automatically provide at least one main control signal that controls a vacuum provided at the vacuum end effector; and an on-board controller mounted on the articulated arm proximate the vacuum end effector and coupled to the at least one sensor, wherein the on-board controller is configured to automatically provide, responsive to an output of the at least one sensor, a vacuum control signal that modifies the at least one main control signal from the main controller to change an aspect of the vacuum provided at the vacuum end effector.
2. The articulated arm system as claimed in claim 1, wherein the on-board controller and the at least one sensor are mounted at the wrist of the articulated arm.
3. The articulated arm system as claimed in claim 1, wherein said articulated arm control system includes a plurality of sensors.
4. The articulated arm system as claimed in claim 3, wherein said plurality of sensors include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
5. The articulated arm system as claimed in claim 1, wherein the articulated arm control system further includes a control junction that is coupled to the main controller and the on-board controller, and
- wherein the control junction modifies the at least one main control signal provided by the main controller using the vacuum control signal provided by the on-board controller to change the aspect of the vacuum provided at the vacuum end effector.
6. The articulated arm system as claimed in claim 5, wherein said vacuum end effector includes a plurality of end effector grippers, each of which includes a vacuum cup.
7. The articulated arm system as claimed in claim 6, wherein each end effector gripper includes at least one pressure sensor.
8. The articulated arm system as claimed in claim 6, wherein said end effector grippers are provided in an ordered array.
9. An articulated arm system comprising:
- an articulated arm including a vacuum end effector; and
- an articulated arm control system including: at least one sensor for sensing a state of the vacuum end effector; a main controller remote from the articulated arm and configured to automatically provide at least one main control signal that controls a vacuum provided at the vacuum end effector; and an on-board controller mounted on the articulated arm proximate the vacuum end effector and coupled to the at least one sensor, wherein the on-board controller is configured to automatically provide, responsive to an output of the at least one sensor, a vacuum control signal that overrides the at least one main control signal from the main controller to change an aspect of the vacuum provided at the vacuum end effector.
10. The articulated arm system as claimed in claim 9, wherein said articulated arm control system includes a plurality of sensors.
11. The articulated arm system as claimed in claim 10, wherein said plurality of sensors include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
12. The articulated arm system as claimed in claim 9, wherein the articulated arm control system further includes a control junction that is coupled to the main controller and the on-board controller for controlling the vacuum provided at the vacuum end effector, and
- wherein the control junction overrides the at least one main control signal provided by the main controller with the vacuum control signal provided by the on-board controller to change the aspect of the vacuum provided at the vacuum end effector.
13. The articulated arm system as claimed in claim 9, wherein said vacuum end effector includes a plurality of end effector grippers, each of which includes a vacuum cup.
14. The articulated arm system as claimed in claim 13, wherein each end effector gripper includes at least one pressure sensor.
15. The articulated arm system as claimed in claim 13, wherein said end effector grippers are provided in an ordered array.
16. The articulated arm system as claimed in claim 9, wherein the on-board controller and the at least one sensor are mounted at the wrist of the articulated arm.
17. A method of controlling a vacuum end effector of an articulated arm, the method comprising:
- providing a main control signal from a main controller remote from an articulated arm to control a vacuum provided at the vacuum end effector of the articulated arm;
- receiving by an on-board controller at least one output signal from at least one sensor that senses a state of the vacuum end effector, wherein the on-board controller and the at least one sensor are mounted on the articulated arm proximate the vacuum end effector; and
- at least partially modifying the main control signal using a vacuum control signal provided by the on-board controller responsive to the at least one output signal from the at least one sensor to automatically provide a modified main control signal that changes an aspect of the vacuum provided at the vacuum end effector.
18. The method as claimed in claim 17, wherein the main controller and the on-board controller are coupled to a control junction that modifies the at least one main control signal received from the main controller using the vacuum control signal received from the on-board controller.
19. The method as claimed in claim 17, further comprising receiving output signals from a plurality of sensors.
20. The method as claimed in claim 19, wherein said plurality of sensors include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
21. A method of controlling a vacuum end effector of an articulated arm, the method comprising:
- providing a main control signal from a main controller remote from the articulated arm to control a vacuum provided at the vacuum end effector of the articulated arm;
- receiving by an on-board controller at least one output signal from at least one sensor that senses a state of the vacuum end effector, wherein the on-board controller and the at least one sensor are mounted on the articulated arm proximate the vacuum end effector; and
- overriding the main control signal using a vacuum control signal provided by the on-board controller responsive to the at least one output signal from the at least one sensor to provide an overridden main control signal that changes an aspect of the vacuum provided at the vacuum end effector.
22. The method as claimed in claim 21, wherein the main controller and the on-board controller are coupled to a control junction that overrides the at least one main control signal received from the main controller using the vacuum control signal received from the on-board controller.
23. The method as claimed in claim 21, further comprising receiving output signals from a plurality of sensors.
24. The method as claimed in claim 23, wherein said plurality of sensors include any of flow sensors, pressure sensors, cameras, torque sensors and deformation sensors.
25. The method as claimed in claim 17, wherein the on-board controller and the at least one sensor are mounted at the wrist of the articulated arm.
26. The method as claimed in claim 21, wherein the on-board controller and the at least one sensor are mounted at the wrist of the articulated arm.
Type: Grant
Filed: Mar 24, 2020
Date of Patent: Jun 28, 2022
Patent Publication Number: 20200223072
Assignee: Berkshire Grey Operating Company, Inc. (Bedford, MA)
Inventors: Thomas Wagner (Concord, MA), Kevin Ahearn (Fort Mill, SC), Matthew T. Mason (Pittsburgh, PA), Christopher Geyer (Arlington, MA), Thomas Koletschka (Cambridge, MA), Prasanna Velagapudi (Pittsburgh, PA), Michael Dawson-Haggerty (Pittsburgh, PA), Siddhartha Srinivasa (Seattle, WA), Kyle Maroney (North Attleboro, MA), Joseph Romano (Arlington, MA), Daniel Smith (Canonsburg, PA), Gene Temple Price (Cambridge, MA), Thomas Allen (Reading, MA)
Primary Examiner: Ryan Rink
Application Number: 16/828,029
International Classification: B25J 9/16 (20060101); B25J 15/06 (20060101);