Patents by Inventor Steven Lines
Steven Lines has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11872702
Abstract: Embodiments provide functionality to prevent collisions between robots and objects. An example embodiment detects a type and a location of an object based on a camera image of the object, where the image has a reference frame. Motion of the object is then predicted based on at least one of: the detected type of the object, the detected location of the object, and a model of object motion. To continue, a motion plan for the robot is generated that avoids having the robot collide with the object, based on the predicted motion of the object and a transformation between the reference frame of the image and a reference frame of the robot. The robot can be controlled to move in accordance with the motion plan, or a signal can be generated that controls the robot to operate in accordance with the motion plan.
Type: Grant
Filed: September 13, 2019
Date of Patent: January 16, 2024
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines, Mitchell Hebert, Connor Lawson
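The predict-then-plan flow the abstract describes can be sketched in a few lines. This is an illustrative toy, not the patented method: the constant-velocity motion model, the 2-D coordinates, and the clearance threshold are all assumptions made for the example.

```python
import math

def predict_positions(location, velocity, steps, dt=0.1):
    """Constant-velocity motion model: predict object positions over time."""
    return [(location[0] + velocity[0] * t * dt,
             location[1] + velocity[1] * t * dt) for t in range(steps)]

def plan_is_safe(waypoints, predicted, min_clearance=0.5):
    """Reject any plan whose waypoint comes too close to the predicted object."""
    return all(math.dist(wp, obj) >= min_clearance
               for wp, obj in zip(waypoints, predicted))

# Object detected at (1, 0), moving toward the robot's straight-line path.
predicted = predict_positions((1.0, 0.0), (0.0, 2.5), steps=5)
straight = [(x * 0.25, 1.0) for x in range(5)]   # would intersect the object
detour   = [(x * 0.25, 3.0) for x in range(5)]   # keeps clearance
plan = straight if plan_is_safe(straight, predicted) else detour
```

A real system would run this check against a full kinematic model of the arm rather than point waypoints, and would replan as new camera frames arrive.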
-
Patent number: 11648669
Abstract: In an embodiment, a method for handling an order includes determining a plurality of ingredients based on an order, received from a user over a network, for a location having a plurality of robots. The method further includes planning at least one trajectory for at least one robot based on the plurality of ingredients and utensils available at the location, and the proximity of each ingredient and utensil to the at least one robot. Each trajectory can be configured to move one of the plurality of ingredients into a container associated with the order. In an embodiment, the method includes executing the at least one trajectory by the at least one robot to fulfill the order. In an embodiment, the method includes moving the container to a pickup area.
Type: Grant
Filed: September 13, 2019
Date of Patent: May 16, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines, Cody Chu, Anthony Tayoun, Mitchell Hebert
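The proximity-based assignment step can be illustrated with a toy dispatcher. The station records, ingredient names, and distance field are hypothetical; the patent covers far more than this sketch.

```python
def assign_ingredients(order, stations):
    """For each ordered ingredient, pick the closest robot station that stocks it."""
    plan = []
    for ingredient in order:
        nearest = min((s for s in stations if ingredient in s["stock"]),
                      key=lambda s: s["distance"])
        plan.append((ingredient, nearest["name"]))
    return plan

stations = [
    {"name": "robot_a", "distance": 1.0, "stock": {"rice", "beans"}},
    {"name": "robot_b", "distance": 2.0, "stock": {"rice", "salsa"}},
]
plan = assign_ingredients(["rice", "salsa"], stations)
# rice goes to the nearer robot_a; salsa is only stocked at robot_b
```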
-
Patent number: 11628566
Abstract: In an embodiment, a method and system use various sensors to determine the shape of a collection of materials (e.g., foodstuffs). A controller can determine a trajectory that achieves the desired end state, possibly chosen from a set of feasible, collision-free trajectories, and a robot executes that trajectory. The robot, executing that trajectory, scoops, grabs, or otherwise acquires the desired amount of material from the collection of materials at a desired location. The robot then deposits the collected material in the desired receptacle at a specific location and orientation.
Type: Grant
Filed: September 13, 2019
Date of Patent: April 18, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines, Mitchell Hebert
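Two pieces of the described pipeline — picking a scoop location from sensed shape, and selecting a feasible trajectory — can be sketched as follows. The heightmap representation and the collision-check callback are assumptions for illustration only.

```python
def pick_scoop_point(heightmap):
    """Choose the tallest cell of a sensed heightmap as the scoop location."""
    return max(((h, (r, c)) for r, row in enumerate(heightmap)
                for c, h in enumerate(row)))[1]

def first_feasible(candidates, collision_free):
    """Return the first candidate trajectory that passes the collision check."""
    return next((t for t in candidates if collision_free(t)), None)

heightmap = [[0.1, 0.4, 0.2],
             [0.3, 0.9, 0.5]]
target = pick_scoop_point(heightmap)  # row 1, column 1: the tallest pile cell
```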
-
Patent number: 11607810
Abstract: Robots, including robot arms, can interface with other modules to affect the world surrounding the robot. However, designing modules from scratch when human analogues exist is not efficient. In an embodiment, a mechanical tool, converted from human use, to be used by robots includes a monolithic adaptor having two interface components: a first interface component capable of mating with an actuated mechanism on the robot side, and a second interface component capable of clamping to an existing utensil. In this way, utensils intended for humans can be adapted for robots and robotic arms.
Type: Grant
Filed: September 13, 2019
Date of Patent: March 21, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M.S. Johnson, Justin Rooney, Syler Wagner, Steven Lines, Cody Chu, Anthony Tayoun
-
Patent number: 11597085
Abstract: Current technologies allow a robot to acquire a tool only if the tool is in a known, fixed location, such as in a rack. In an embodiment, a method and corresponding system can determine the previously unknown pose of a tool freely placed in an environment. The method can then calculate a trajectory that allows a robot to move from its current position to the tool and attach to the tool. In this way, tools can be located and used by a robot when placed at any location in an environment.
Type: Grant
Filed: September 13, 2019
Date of Patent: March 7, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Justin Rooney, Steven Lines, Anthony Tayoun, Mitchell Hebert
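Once a tool's pose is estimated, moving to it reduces to generating waypoints between the current pose and the tool pose. A minimal sketch, assuming a flat (x, y, yaw) pose and straight-line interpolation — a real planner would interpolate orientation properly and check for collisions:

```python
def trajectory_to_tool(current_pose, tool_pose, n_waypoints=5):
    """Linearly interpolate waypoints from the robot's pose to the estimated tool pose."""
    return [tuple(c + (t - c) * i / (n_waypoints - 1)
                  for c, t in zip(current_pose, tool_pose))
            for i in range(n_waypoints)]

# Pose as (x, y, yaw); in practice the tool pose would come from perception.
path = trajectory_to_tool((0.0, 0.0, 0.0), (1.0, 2.0, 0.5))
```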
-
Patent number: 11597084
Abstract: In an embodiment, a method includes identifying a force and torque for a robot to accomplish a task and identifying the context of a portion of a movement plan indicating motion of the robot to perform the task. Based on the identified force, torque, and context, a context-specific torque is determined for at least one aspect of the robot while the robot executes the portion of the movement plan. In turn, a control signal is generated for the at least one aspect of the robot to operate in accordance with the determined context-specific torque.
Type: Grant
Filed: September 13, 2019
Date of Patent: March 7, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines
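The idea of a context-specific torque can be illustrated as a lookup-and-clamp step. The context names and limit values below are invented for the example; they are not from the patent.

```python
TORQUE_LIMITS = {          # hypothetical per-context joint torque caps, in N*m
    "free_motion": 10.0,
    "near_human": 3.0,
    "scooping": 25.0,
}

def torque_command(requested, context):
    """Clamp a requested joint torque to the limit for the current context.

    Unknown contexts fall back to the most conservative limit.
    """
    limit = TORQUE_LIMITS.get(context, min(TORQUE_LIMITS.values()))
    return max(-limit, min(limit, requested))
```

For instance, the same 15 N*m request passes through unchanged while scooping but is cut to 3 N*m when a person is nearby.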
-
Patent number: 11597087
Abstract: In an embodiment, a method during execution of a motion plan by a robotic arm includes determining a voice command from speech spoken by a user during the execution of the motion plan, determining a modification of the motion plan based on the voice command, and executing the modification of the motion plan by the robotic arm.
Type: Grant
Filed: September 13, 2019
Date of Patent: March 7, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Anthony Tayoun, Steven Lines
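Mapping a recognized voice command to a plan modification can be sketched as a small dispatch over the plan's parameters. The command vocabulary and plan fields here are hypothetical; speech recognition itself is out of scope for the sketch.

```python
def modify_plan(plan, command):
    """Map a recognized voice command to a modification of the motion plan."""
    speed = plan["speed"]
    if command == "stop":
        return {**plan, "speed": 0.0}
    if command == "slower":
        return {**plan, "speed": speed * 0.5}
    if command == "faster":
        return {**plan, "speed": speed * 1.5}
    return plan  # unrecognized commands leave the plan unchanged

plan = {"speed": 1.0, "waypoints": [(0, 0), (1, 1)]}
plan = modify_plan(plan, "slower")  # user says "slower" mid-execution
```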
-
Patent number: 11571814
Abstract: In an embodiment, a method includes determining a given material to manipulate to achieve a goal state. The goal state can be one or more deformable or granular materials in a particular arrangement. The method further includes, for the given material, determining a respective outcome for each of a plurality of candidate actions to manipulate the given material. The determining can be performed with a physics-based model, in one embodiment. The method can further include determining a given action of the candidate actions, where the outcome of the given action reaching the goal state is within at least one tolerance. The method further includes, based on a selected action of the given actions, generating a first motion plan for the selected action.
Type: Grant
Filed: September 13, 2019
Date of Patent: February 7, 2023
Assignee: The Charles Stark Draper Laboratory, Inc.
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines, Mitchell Hebert
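The simulate-candidates-then-select-within-tolerance loop can be sketched as below. The linear "physics" stand-in and the depth/mass numbers are invented for the example; a physics-based model of granular material would replace `simulate`.

```python
def choose_action(candidates, simulate, goal, tolerance):
    """Return the first candidate whose simulated outcome lands within tolerance of the goal."""
    for action in candidates:
        if abs(simulate(action) - goal) <= tolerance:
            return action
    return None  # no candidate reaches the goal state closely enough

# Toy stand-in model: scooping to depth d (cm) yields roughly 20*d grams.
simulate = lambda depth: 20.0 * depth
best = choose_action([1.0, 2.0, 2.4, 3.0], simulate, goal=50.0, tolerance=5.0)
```

Here a 2.4 cm scoop is predicted to yield 48 g, which is within 5 g of the 50 g goal, so it is selected and would be handed to motion planning.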
-
Publication number: 20200086437
Abstract: Current technologies allow a robot to acquire a tool only if the tool is in a known, fixed location, such as in a rack. In an embodiment, a method and corresponding system can determine the previously unknown pose of a tool freely placed in an environment. The method can then calculate a trajectory that allows a robot to move from its current position to the tool and attach to the tool. In this way, tools can be located and used by a robot when placed at any location in an environment.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Justin Rooney, Steven Lines, Anthony Tayoun
-
Publication number: 20200087069
Abstract: In an embodiment, a method for handling an order includes determining a plurality of ingredients based on an order, received from a user over a network, for a location having a plurality of robots. The method further includes planning at least one trajectory for at least one robot based on the plurality of ingredients and utensils available at the location, and the proximity of each ingredient and utensil to the at least one robot. Each trajectory can be configured to move one of the plurality of ingredients into a container associated with the order. In an embodiment, the method includes executing the at least one trajectory by the at least one robot to fulfill the order. In an embodiment, the method includes moving the container to a pickup area.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Steven Lines, Cody Chu, Anthony Tayoun
-
Publication number: 20200086487
Abstract: Embodiments provide functionality to prevent collisions between robots and objects. An example embodiment detects a type and a location of an object based on a camera image of the object, where the image has a reference frame. Motion of the object is then predicted based on at least one of: the detected type of the object, the detected location of the object, and a model of object motion. To continue, a motion plan for the robot is generated that avoids having the robot collide with the object, based on the predicted motion of the object and a transformation between the reference frame of the image and a reference frame of the robot. The robot can be controlled to move in accordance with the motion plan, or a signal can be generated that controls the robot to operate in accordance with the motion plan.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M. S. Johnson, Syler Wagner, Steven Lines
-
Publication number: 20200086482
Abstract: In an embodiment, a method includes identifying a force and torque for a robot to accomplish a task and identifying the context of a portion of a movement plan indicating motion of the robot to perform the task. Based on the identified force, torque, and context, a context-specific torque is determined for at least one aspect of the robot while the robot executes the portion of the movement plan. In turn, a control signal is generated for the at least one aspect of the robot to operate in accordance with the determined context-specific torque.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Steven Lines
-
Publication number: 20200086485
Abstract: In an embodiment, a method and system use various sensors to determine the shape of a collection of materials (e.g., foodstuffs). A controller can determine a trajectory that achieves the desired end state, possibly chosen from a set of feasible, collision-free trajectories, and a robot executes that trajectory. The robot, executing that trajectory, scoops, grabs, or otherwise acquires the desired amount of material from the collection of materials at a desired location. The robot then deposits the collected material in the desired receptacle at a specific location and orientation.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Steven Lines
-
Publication number: 20200086498
Abstract: In an embodiment, a method during execution of a motion plan by a robotic arm includes determining a voice command from speech spoken by a user during the execution of the motion plan, determining a modification of the motion plan based on the voice command, and executing the modification of the motion plan by the robotic arm.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Anthony Tayoun, Steven Lines
-
Publication number: 20200090099
Abstract: In an embodiment, a method includes determining a given material to manipulate to achieve a goal state. The goal state can be one or more deformable or granular materials in a particular arrangement. The method further includes, for the given material, determining a respective outcome for each of a plurality of candidate actions to manipulate the given material. The determining can be performed with a physics-based model, in one embodiment. The method can further include determining a given action of the candidate actions, where the outcome of the given action reaching the goal state is within at least one tolerance. The method further includes, based on a selected action of the given actions, generating a first motion plan for the selected action.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Steven Lines
-
Publication number: 20200086502
Abstract: Robots, including robot arms, can interface with other modules to affect the world surrounding the robot. However, designing modules from scratch when human analogues exist is not efficient. In an embodiment, a mechanical tool, converted from human use, to be used by robots includes a monolithic adaptor having two interface components: a first interface component capable of mating with an actuated mechanism on the robot side, and a second interface component capable of clamping to an existing utensil. In this way, utensils intended for humans can be adapted for robots and robotic arms.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Justin Rooney, Syler Wagner, Steven Lines, Cody Chu, Anthony Tayoun
-
Publication number: 20200086497
Abstract: Embodiments provide methods and systems to modify motion of a robot based on sound and context. An embodiment detects a sound in an environment and processes the sound. The processing includes comparing the detected sound to a library of sound characteristics associated with sound cues and/or extracting features or characteristics from the detected sound using a model. Motion of a robot is modified based on a context of the robot and at least one of: (i) the comparison, (ii) the features extracted from the detected sound, and (iii) the characteristics extracted from the detected sound.
Type: Application
Filed: September 13, 2019
Publication date: March 19, 2020
Inventors: David M.S. Johnson, Syler Wagner, Anthony Tayoun, Steven Lines
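The compare-to-library step followed by context-dependent motion modification can be sketched as below. The feature vectors, distance threshold, and slowdown policy are all invented for illustration; a real system would extract features from audio with a learned model.

```python
import math

SOUND_CUES = {                 # hypothetical library: cue name -> feature vector
    "alarm": (0.9, 0.8),
    "speech": (0.3, 0.5),
}

def match_cue(features, threshold=0.3):
    """Return the nearest library cue, or None if no cue is close enough."""
    name, dist = min(((n, math.dist(features, f)) for n, f in SOUND_CUES.items()),
                     key=lambda pair: pair[1])
    return name if dist <= threshold else None

def modified_speed(speed, cue, context):
    """Toy policy: slow the robot sharply when an alarm sounds near people."""
    if cue == "alarm" and context == "near_human":
        return speed * 0.25
    return speed

cue = match_cue((0.85, 0.75))  # features close to the 'alarm' library entry
speed = modified_speed(1.0, cue, "near_human")
```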