Patents by Inventor Michael Siracusa
Michael Siracusa has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9701015
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Grant
Filed: June 24, 2015
Date of Patent: July 11, 2017
Assignee: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
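The abstract above describes an interactive loop: the robot simulates executing its current understanding of the task (rendered with augmented-reality techniques) and the user confirms or corrects it. The sketch below is purely illustrative; every class, function, and parameter name is an assumption, not taken from the patent.

```python
# Hypothetical sketch of the training flow the abstract describes: the robot
# proposes an interpretation of the task, renders a simulated (AR-style)
# preview, and the human trainer either approves it or supplies a correction.

from dataclasses import dataclass


@dataclass
class TaskHypothesis:
    object_label: str   # what the robot thinks it should detect
    action: str         # what the robot thinks it should do with the object


def train_interactively(demonstrations, render_preview, ask_user):
    """Refine the hypothesis until the user approves each simulated execution."""
    hypothesis = TaskHypothesis(object_label="widget", action="pick")
    for demo in demonstrations:
        preview = render_preview(hypothesis, demo)   # AR-style simulation
        feedback = ask_user(preview)                 # "ok" or a corrected hypothesis
        if feedback != "ok":
            hypothesis = feedback                    # adopt the user's correction
    return hypothesis
```

The key design point, as the abstract frames it, is that simulated execution lets the trainer catch a misunderstood task before the robot acts on real objects.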
-
Patent number: 9669544
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Grant
Filed: June 24, 2015
Date of Patent: June 6, 2017
Assignee: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Patent number: 9434072
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Grant
Filed: September 17, 2012
Date of Patent: September 6, 2016
Assignee: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Publication number: 20150290802
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Application
Filed: June 24, 2015
Publication date: October 15, 2015
Applicant: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Publication number: 20150290803
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Application
Filed: June 24, 2015
Publication date: October 15, 2015
Applicant: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Patent number: 9092698
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Grant
Filed: September 17, 2012
Date of Patent: July 28, 2015
Assignee: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Patent number: 8996175
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Grant
Filed: September 17, 2012
Date of Patent: March 31, 2015
Assignee: Rethink Robotics, Inc.
Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
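The abstract above centers on primitive-task prototypes that are instantiated with concrete objects and locations during training. A minimal sketch of that idea, with all names and the dict-based representation being illustrative assumptions rather than the patent's actual design:

```python
# Illustrative sketch: a "prototype" is a template for a primitive task with
# unbound slots; training fills the slots, yielding a concrete task instance
# the robot can execute. Names and structure are assumptions, not from the patent.

import copy

# Template for a "pick" primitive with unbound object and location slots.
PICK_PROTOTYPE = {"task": "pick", "object": None, "location": None}


def instantiate(prototype, **slots):
    """Fill a prototype's unbound slots, producing a concrete task instance."""
    instance = copy.deepcopy(prototype)   # leave the shared template untouched
    for key, value in slots.items():
        if key not in instance:
            raise KeyError(f"prototype has no slot {key!r}")
        instance[key] = value
    return instance


# During training, a demonstration might bind the slots like this:
pick_bolt = instantiate(PICK_PROTOTYPE, object="bolt", location="bin_3")
```

Deep-copying before binding keeps the prototype reusable, so one template can yield many task instances across training sessions.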
-
Patent number: 8965580
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Grant
Filed: September 17, 2012
Date of Patent: February 24, 2015
Assignee: Rethink Robotics, Inc.
Inventors: Rodney Brooks, Christopher J. Buehler, Matthew DiCicco, Gerry Ens, Albert Huang, Michael Siracusa, Matthew M. Williamson
-
Patent number: 8958912
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Grant
Filed: September 17, 2012
Date of Patent: February 17, 2015
Assignee: Rethink Robotics, Inc.
Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Patrick A. Deegan, Matthew DiCicco, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
-
Publication number: 20130345873
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Application
Filed: September 17, 2012
Publication date: December 26, 2013
Applicant: Rethink Robotics, Inc.
Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Patrick A. Deegan, Matthew DiCicco, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
-
Publication number: 20130345875
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Application
Filed: September 17, 2012
Publication date: December 26, 2013
Applicant: Rethink Robotics, Inc.
Inventors: Rodney Brooks, Christopher J. Buehler, Matthew DiCicco, Gerry Ens, Albert Huang, Michael Siracusa, Matthew M. Williamson
-
Publication number: 20130346348
Abstract: Via intuitive interactions with a user, robots may be trained to perform tasks such as visually detecting and identifying physical objects and/or manipulating objects. In some embodiments, training is facilitated by the robot's simulation of task-execution using augmented-reality techniques.
Type: Application
Filed: September 17, 2012
Publication date: December 26, 2013
Applicant: Rethink Robotics, Inc.
Inventors: Christopher J. Buehler, Michael Siracusa
-
Publication number: 20130345874
Abstract: Robots may manipulate objects based on sensor input about the objects and/or the environment in conjunction with data structures representing primitive tasks and, in some embodiments, objects and/or locations associated therewith. The data structures may be created by instantiating respective prototypes during training by a human trainer.
Type: Application
Filed: September 17, 2012
Publication date: December 26, 2013
Applicant: Rethink Robotics, Inc.
Inventors: Bruce Blumberg, Rodney Brooks, Christopher J. Buehler, Noelle Dye, Gerry Ens, Natan Linder, Michael Siracusa, Michael Sussman, Matthew M. Williamson
-
Patent number: 7558809
Abstract: A method classifies segments of a video using an audio signal of the video and a set of classes. Selected classes of the set, those important for a specific highlighting task, are combined as a subset of important classes; the remaining classes of the set are combined as a subset of other classes. The two subsets are used with training audio data to form a task-specific classifier. Then, the audio signal can be classified using the task-specific classifier as either important or other to identify highlights in the video corresponding to the specific highlighting task. The classified audio signal can be used to segment and summarize the video.
Type: Grant
Filed: January 6, 2006
Date of Patent: July 7, 2009
Assignee: Mitsubishi Electric Research Laboratories, Inc.
Inventors: Regunathan Radhakrishnan, Michael Siracusa, Ajay Divakaran
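The scheme the abstract describes has two steps: collapse the full class set into "important" vs. "other" for a given highlighting task, then train a binary classifier on audio features and flag the segments it labels important. A minimal, dependency-free sketch follows; the feature representation and the nearest-centroid classifier are illustrative assumptions, not the patent's actual method.

```python
# Hypothetical sketch of task-specific highlight classification: merge classes
# into important/other, fit a trivial nearest-centroid classifier on audio
# features, then return the indices of segments classified as important.

def make_task_labels(labels, important_classes):
    """Collapse the multi-class label set into 1 (important) / 0 (other)."""
    return [1 if lbl in important_classes else 0 for lbl in labels]


def train_centroid_classifier(features, binary_labels):
    """Fit one centroid per class; classification picks the nearer centroid."""
    centroids = {}
    for cls in (0, 1):
        members = [f for f, y in zip(features, binary_labels) if y == cls]
        dim = len(members[0])
        centroids[cls] = [sum(m[d] for m in members) / len(members)
                          for d in range(dim)]
    return centroids


def classify(centroids, x):
    """Assign x to the class whose centroid is nearest (squared distance)."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda cls: dist2(centroids[cls]))


def find_highlights(centroids, segment_features):
    """Indices of audio segments classified as important (class 1)."""
    return [i for i, f in enumerate(segment_features)
            if classify(centroids, f) == 1]
```

The point of the binarization is that the same labeled audio corpus can serve many highlighting tasks: only the choice of `important_classes` changes per task, after which a fresh task-specific classifier is trained.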
-
Publication number: 20070162924
Abstract: A method classifies segments of a video using an audio signal of the video and a set of classes. Selected classes of the set, those important for a specific highlighting task, are combined as a subset of important classes; the remaining classes of the set are combined as a subset of other classes. The two subsets are used with training audio data to form a task-specific classifier. Then, the audio signal can be classified using the task-specific classifier as either important or other to identify highlights in the video corresponding to the specific highlighting task. The classified audio signal can be used to segment and summarize the video.
Type: Application
Filed: January 6, 2006
Publication date: July 12, 2007
Inventors: Regunathan Radhakrishnan, Michael Siracusa, Ajay Divakaran