Patents by Inventor Leland Hepler

Leland Hepler has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240192695
    Abstract: Systems and methods are described for the display of a transformed virtual representation of sensor data overlaid on a site model. A system can obtain a site model identifying a site. For example, the site model can include a map, a blueprint, or a graph. The system can obtain sensor data from a sensor of a robot. The sensor data can include route data identifying route waypoints and/or route edges associated with the robot. The system can receive input identifying an association between a virtual representation of the sensor data and the site model. Based on the association, the system can transform the virtual representation of the sensor data and instruct display of the transformed data overlaid on the site model. (See the alignment sketch after this entry.)
    Type: Application
    Filed: December 6, 2023
    Publication date: June 13, 2024
    Inventors: Matthew Jacob Klingensmith, Dom Jonak, Leland Hepler, Christopher Basmajian, Brian Ringley
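    A note on the alignment step above: registering a virtual representation of sensor data onto a site model amounts to estimating a transform from user-identified point correspondences. The sketch below fits a 2D similarity transform (scale, rotation, translation) with the Umeyama method; the function names and the choice of a similarity transform are illustrative assumptions, not details drawn from the application.

    import numpy as np

    def fit_similarity_transform(src, dst):
        """Estimate scale s, rotation R, and translation t mapping src onto
        dst (2D Umeyama alignment). src and dst are (N, 2) arrays of
        corresponding points: waypoints picked in the sensor-data view and
        their matching locations picked on the site model."""
        mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
        src_c, dst_c = src - mu_src, dst - mu_dst
        # Cross-covariance of the centered point sets.
        H = src_c.T @ dst_c / len(src)
        U, S, Vt = np.linalg.svd(H)
        # Guard against reflections so R stays a proper rotation.
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        D = np.diag([1.0, d])
        R = Vt.T @ D @ U.T
        s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
        t = mu_dst - s * R @ mu_src
        return s, R, t

    def transform_waypoints(waypoints, s, R, t):
        """Apply the fitted transform so waypoints and route edges can be
        redrawn overlaid on the site model."""
        return (s * (R @ waypoints.T)).T + t

    A user interface would collect src by letting the operator pick waypoints in the sensor view, collect dst by picking the matching spots on the map or blueprint, and then redraw every waypoint and route edge through transform_waypoints.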
  • Publication number: 20230418302
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location. (See the behavior-tree sketch after this entry.)
    Type: Application
    Filed: September 13, 2023
    Publication date: December 28, 2023
    Inventors: Samuel Seifert, Leland Hepler
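    The behavior tree described above can be pictured as a root sequence whose children alternate between "navigate to target" and "perform recorded action" leaves. The following is a minimal sketch of that structure; the tick protocol, node classes, and navigate_to stub are assumptions made for illustration, not the patented implementation.

    from dataclasses import dataclass, field
    from typing import Callable, List

    def navigate_to(location) -> bool:
        """Stub: a real system would command the robot's navigation stack
        and block until the waypoint is reached or the attempt fails."""
        print(f"navigating to {location}")
        return True

    @dataclass
    class Leaf:
        """Leaf node: runs a callable and reports success or failure."""
        name: str
        action: Callable[[], bool]

        def tick(self) -> bool:
            return self.action()

    @dataclass
    class Sequence:
        """Composite node: ticks children in order and fails fast."""
        name: str
        children: List[Leaf] = field(default_factory=list)

        def tick(self) -> bool:
            return all(child.tick() for child in self.children)

    def build_mission_tree(targets):
        """targets: (location, action_fn) pairs recorded while the map was
        being built. Emits one navigate/act pair of leaves per target."""
        root = Sequence("mission")
        for i, (location, action_fn) in enumerate(targets):
            root.children.append(
                Leaf(f"navigate_{i}", lambda loc=location: navigate_to(loc)))
            root.children.append(Leaf(f"act_{i}", action_fn))
        return root

    Replaying the mission is then a single call to root.tick(): each action leaf fires only after its navigate leaf succeeds, mirroring the abstract's "perform the respective action ... when the current position of the robot ... reaches the corresponding target location."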
  • Patent number: 11797016
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
    Type: Grant
    Filed: May 27, 2020
    Date of Patent: October 24, 2023
    Assignee: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Leland Hepler
  • Publication number: 20220260998
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot. (See the ray-casting sketch after this entry.)
    Type: Application
    Filed: May 2, 2022
    Publication date: August 18, 2022
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel
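    The geometry in the abstract above is essentially a ray cast: back-project the selected pixel into a pointing vector, then intersect that ray with the terrain estimate to get the target location. Below is a minimal sketch assuming a pinhole camera and a planar terrain estimate (a real estimate could instead be a height map); all names are illustrative.

    import numpy as np

    def pixel_to_ray(pixel, K):
        """Back-project a clicked pixel (u, v) into a unit pointing vector
        in the camera frame, assuming a pinhole model with camera matrix K
        (real lenses would also need undistortion)."""
        u, v = pixel
        d = np.linalg.inv(K) @ np.array([u, v, 1.0])
        return d / np.linalg.norm(d)

    def intersect_terrain(origin, direction, plane_point, plane_normal):
        """Intersect the pointing ray with a planar terrain estimate.
        Returns the 3D target location, or None when the ray is parallel
        to the plane or points away from it."""
        denom = plane_normal @ direction
        if abs(denom) < 1e-9:
            return None
        t = plane_normal @ (plane_point - origin) / denom
        return origin + t * direction if t > 0 else None

    # Example: camera at the origin looking along +z with y pointing down;
    # the terrain estimate is a flat floor 1 m below the camera.
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    ray = pixel_to_ray((320, 300), K)                 # a click below image center
    goal = intersect_terrain(np.zeros(3), ray,
                             np.array([0.0, 1.0, 0.0]),   # point on the floor
                             np.array([0.0, 1.0, 0.0]))   # floor normal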
  • Publication number: 20220241980
    Abstract: A method includes receiving sensor data for an environment about the robot. The sensor data is captured by one or more sensors of the robot. The method includes detecting one or more objects in the environment using the received sensor data. For each detected object, the method includes authoring an interaction behavior indicating a behavior that the robot is capable of performing with respect to the corresponding detected object. The method also includes augmenting a localization map of the environment to reflect the respective interaction behavior of each detected object. (See the map-annotation sketch after this entry.)
    Type: Application
    Filed: January 25, 2022
    Publication date: August 4, 2022
    Applicant: Boston Dynamics, Inc.
    Inventors: Mario Bollini, Leland Hepler
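    One way to picture the map augmentation above is as an annotation layer keyed by detected objects: each detection is paired with a behavior the robot can perform on that object class. The sketch below shows a hypothetical data structure for this; the class names and capability table are assumptions, not the actual representation.

    from dataclasses import dataclass, field

    @dataclass
    class DetectedObject:
        label: str       # e.g. "door" or "valve", from the robot's detector
        pose: tuple      # object pose expressed in the map frame

    @dataclass
    class InteractionBehavior:
        name: str        # e.g. "open_door"
        parameters: dict = field(default_factory=dict)

    @dataclass
    class AnnotatedMap:
        """A localization map augmented with per-object behaviors, so a
        later mission can look up what the robot can do at each object."""
        waypoints: list = field(default_factory=list)
        annotations: list = field(default_factory=list)

        def add_interaction(self, obj, behavior):
            self.annotations.append((obj, behavior))

    # Hypothetical table of behaviors the robot is capable of performing
    # on each object class it can detect.
    CAPABILITIES = {"door": "open_door", "valve": "turn_valve"}

    def author_behaviors(site_map, detections):
        """Pair each detection with an interaction behavior and record it
        on the map, as in the augmentation step of the abstract."""
        for obj in detections:
            if obj.label in CAPABILITIES:
                site_map.add_interaction(
                    obj, InteractionBehavior(CAPABILITIES[obj.label]))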
  • Patent number: 11340620
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
    Type: Grant
    Filed: October 23, 2019
    Date of Patent: May 24, 2022
    Assignee: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel
  • Publication number: 20210318687
    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
    Type: Application
    Filed: May 27, 2020
    Publication date: October 14, 2021
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Leland Hepler
  • Publication number: 20210041878
    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command when received by the robot causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
    Type: Application
    Filed: October 23, 2019
    Publication date: February 11, 2021
    Applicant: Boston Dynamics, Inc.
    Inventors: Samuel Seifert, Marco da Silva, Alexander Rice, Leland Hepler, Mario Bollini, Christopher Bentzel