Patents by Inventor Samuel Seifert

Samuel Seifert has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Short, hedged code sketches illustrating the techniques described in the abstracts follow the listing.

  • Publication number: 20230418302
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
    Type: Application
    Filed: September 13, 2023
    Publication date: December 28, 2023
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Leland Hepler
  • Patent number: 11797016
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
    Type: Grant
    Filed: May 27, 2020
    Date of Patent: October 24, 2023
    Assignee: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Leland Hepler
  • Publication number: 20220260998
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
    Type: Application
    Filed: May 2, 2022
    Publication date: August 18, 2022
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel
  • Publication number: 20220244741
    Abstract: A method includes receiving, while a robot traverses a building environment, sensor data captured by one or more sensors of the robot. The method includes receiving a building information model (BIM) for the environment that includes semantic information identifying one or more permanent objects within the environment. The method includes generating a plurality of localization candidates for a localization map of the environment. Each localization candidate corresponds to a feature of the environment identified by the sensor data and represents a potential localization reference point. The method includes determining that a respective localization candidate corresponds to a respective permanent object of the one or more permanent objects identified by the BIM and generating the localization map of the environment using the respective localization candidate. The localization map is configured to localize the robot within the environment when the robot moves throughout the environment.
    Type: Application
    Filed: January 26, 2022
    Publication date: August 4, 2022
    Applicant: Boston Dynamics, Inc.
    Inventors: Marco da Silva, Dom Jonak, Matthew Klingensmith, Samuel Seifert
  • Patent number: 11340620
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
    Type: Grant
    Filed: October 23, 2019
    Date of Patent: May 24, 2022
    Assignee: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel
  • Publication number: 20210318687
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
    Type: Application
    Filed: May 27, 2020
    Publication date: October 14, 2021
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Leland Hepler
  • Publication number: 20210041878
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
    Type: Application
    Filed: October 23, 2019
    Publication date: February 11, 2021
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel
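
Illustrative code sketches

The following is a minimal sketch of the mission-authoring technique described in the abstracts of publication 20230418302, patent 11797016, and publication 20210318687: while the robot builds its environmental map, (target location, action) pairs are recorded, and a behavior tree is then generated that navigates to each target and performs the recorded action on a future mission. This is one reading of the abstract, not Boston Dynamics code; all class and function names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

Location = Tuple[float, float]  # (x, y) in the environmental map frame


@dataclass
class NavigateTo:
    """Leaf node: drive the robot to a target location in the map."""
    target: Location

    def tick(self, robot) -> bool:
        return robot.go_to(self.target)  # True once the target is reached


@dataclass
class PerformAction:
    """Leaf node: run the action that was recorded at a target location."""
    action: Callable[[object], bool]

    def tick(self, robot) -> bool:
        return self.action(robot)


@dataclass
class Sequence:
    """Composite node: tick children in order, stopping at the first failure."""
    children: List[object] = field(default_factory=list)

    def tick(self, robot) -> bool:
        return all(child.tick(robot) for child in self.children)


def build_mission_tree(
    recorded: List[Tuple[Location, Callable[[object], bool]]]
) -> Sequence:
    """Generate a behavior tree from the (location, action) pairs recorded
    while the environmental map was being built."""
    tree = Sequence()
    for target, action in recorded:
        tree.children.append(NavigateTo(target))
        tree.children.append(PerformAction(action))
    return tree
```

On a future mission the tree is ticked with a robot handle that localizes itself in the same map; the Sequence ordering is what guarantees each recorded action runs only after its target location is reached.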
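Next, a sketch of the tap-to-go technique in publication 20220260998, patent 11340620, and publication 20210041878: the selected pixel is unprojected into a pointing vector, and the waypoint is the intersection of that ray with the terrain estimate. The pinhole camera model and the planar terrain are simplifying assumptions; the abstracts specify neither.

```python
import numpy as np


def pointing_vector(u: float, v: float,
                    fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Unproject pixel (u, v) into a unit direction in the camera frame,
    assuming pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)


def waypoint_from_pixel(cam_origin: np.ndarray, ray_world: np.ndarray,
                        plane_point: np.ndarray, plane_normal: np.ndarray):
    """Return the target location where the world-frame pointing ray meets a
    planar terrain estimate, or None if the ray is parallel to the plane or
    points away from it."""
    denom = float(np.dot(plane_normal, ray_world))
    if abs(denom) < 1e-9:
        return None
    t = float(np.dot(plane_normal, plane_point - cam_origin)) / denom
    return cam_origin + t * ray_world if t > 0 else None


# Example: camera 0.8 m above flat ground (z = 0), ray angled downward.
if __name__ == "__main__":
    ray = np.array([0.0, 0.6, -0.8])  # already rotated into the world frame
    ray /= np.linalg.norm(ray)
    target = waypoint_from_pixel(np.array([0.0, 0.0, 0.8]), ray,
                                 np.array([0.0, 0.0, 0.0]),
                                 np.array([0.0, 0.0, 1.0]))
    print(target)  # -> [0.  0.6 0. ]
```

Rotating the camera-frame pointing vector into the world frame (using the robot's pose) is glossed over here; in practice that transform sits between the two functions above.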
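Finally, a sketch of the BIM-filtering idea in publication 20220244741: features extracted from sensor data become localization reference points only if they correspond to a permanent object in the building information model, so the localization map is anchored to structure that will still be present on later missions. The correspondence test below (nearest permanent object within a 2D radius) is an assumed stand-in; the abstract does not state how candidates are matched to BIM objects.

```python
from dataclasses import dataclass
from typing import List
import math


@dataclass
class Candidate:
    """A feature detected in sensor data; a potential localization reference point."""
    x: float
    y: float


@dataclass
class BimObject:
    """An object from the building information model with a semantic permanence flag."""
    x: float
    y: float
    permanent: bool  # e.g. walls and columns, as opposed to furniture


def select_reference_points(candidates: List[Candidate],
                            bim: List[BimObject],
                            radius: float = 0.5) -> List[Candidate]:
    """Keep only candidates lying within `radius` meters of a permanent BIM
    object; these become the reference points of the localization map."""
    permanent = [o for o in bim if o.permanent]
    return [c for c in candidates
            if any(math.hypot(c.x - o.x, c.y - o.y) <= radius
                   for o in permanent)]
```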