Patents by Inventor Ethan Eade

Ethan Eade has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11360731
    Abstract: A computing device and method are provided for transmitting a relevant subset of map data around a target virtual location, called a neighborhood, to enable mutual spatial understanding by multiple display devices so that they can display a shared hologram at the same location in the physical environment at the same moment in time. The computing device may comprise a processor, a memory operatively coupled to the processor, and an anchor transfer program stored in the memory and executed by the processor.
    Type: Grant
    Filed: October 11, 2019
    Date of Patent: June 14, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ethan Eade, Jeroen Vanturennout, Jonathan Lyons, David Fields, Gavin Dean Lazarow, Tushar Cyril Bhatnagar
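To make the idea in the entry above concrete, the sketch below shows one way a "neighborhood" of map anchors around a target virtual location might be selected for transmission to other display devices. The Anchor fields, the fixed radius, and the function names are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch of selecting a "neighborhood" of map anchors around a
# target virtual location, as the abstract above describes; names and the
# fixed radius are assumptions, not the patented implementation.
from dataclasses import dataclass
from math import dist

@dataclass
class Anchor:
    anchor_id: str
    position: tuple[float, float, float]  # position in the shared map frame
    keyframe_data: bytes                  # serialized local map data

def neighborhood(anchors: list[Anchor],
                 target: tuple[float, float, float],
                 radius_m: float = 5.0) -> list[Anchor]:
    """Return the subset of anchors near the target virtual location.

    Only this subset would be transmitted to the other display devices so
    they can localize the shared hologram consistently.
    """
    return [a for a in anchors if dist(a.position, target) <= radius_m]
```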
  • Patent number: 11358601
    Abstract: Various implementations described herein generate training instances that each include training instance input based on sensor data of a corresponding autonomous vehicle and training instance output based on sensor data of a corresponding additional vehicle, where the additional vehicle is captured at least in part by the sensor data of the autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can be used to process sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by that sensor data.
    Type: Grant
    Filed: May 7, 2020
    Date of Patent: June 14, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
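The pairing described in the entry above can be illustrated with a small sketch: sensor data from the autonomous vehicle supplies the training-instance input, and time-aligned data reported by the observed additional vehicle supplies the output. All field names and the time tolerance are assumptions made for illustration only.

```python
# A minimal, hypothetical sketch of the training-instance pairing: the input
# comes from the autonomous vehicle's sensors and the label comes from sensors
# on the additional vehicle it observed. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class TrainingInstance:
    av_sensor_data: list[float]     # e.g. features from the AV's LIDAR/camera
    additional_vehicle_label: dict  # e.g. {"yaw_rate": ..., "velocity": ...}

def build_instances(av_frames, additional_vehicle_states, max_dt=0.05):
    """Pair each AV sensor frame with the time-aligned state reported by the
    additional (observed) vehicle, producing supervised training instances."""
    instances = []
    for t, features in av_frames:            # av_frames: [(timestamp, features)]
        # pick the additional-vehicle state closest in time to this frame
        state_t, state = min(additional_vehicle_states,
                             key=lambda s: abs(s[0] - t))
        if abs(state_t - t) <= max_dt:
            instances.append(TrainingInstance(features, state))
    return instances
```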
  • Patent number: 11353867
    Abstract: An autonomous vehicle uses a secondary vehicle control system to supplement a primary vehicle control system to perform a controlled stop if an adverse event is detected in the primary vehicle control system. The secondary vehicle control system may use a redundant lateral velocity determined by a different sensor from that used by the primary vehicle control system to determine lateral velocity for use in controlling the autonomous vehicle to perform the controlled stop.
    Type: Grant
    Filed: June 10, 2020
    Date of Patent: June 7, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Nathaniel Gist, IV, Thomas Pilarski
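A minimal sketch of the fallback logic in the entry above follows, assuming a simple interface: when the primary control system reports an adverse event, the secondary system switches to a lateral velocity derived from a different sensor and commands a controlled stop. The names and the deceleration value are illustrative, not the actual system.

```python
# Hedged sketch of the fallback behavior: if the primary control system
# reports an adverse event, the secondary system uses a lateral velocity from
# a different (redundant) sensor to steer the vehicle through a controlled
# stop. All names and thresholds are illustrative assumptions.
def controlled_stop_command(primary_ok: bool,
                            primary_lateral_vel: float,
                            redundant_lateral_vel: float,
                            decel_mps2: float = 2.0):
    """Return (lateral_velocity_used, deceleration) for the maneuver."""
    if primary_ok:
        return primary_lateral_vel, 0.0  # normal operation, no stop
    # adverse event: trust only the independently derived lateral velocity
    return redundant_lateral_vel, decel_mps2
```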
  • Publication number: 20220171797
    Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
    Type: Application
    Filed: February 17, 2022
    Publication date: June 2, 2022
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
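The relative atlas graph described in this entry and in several entries below is essentially a graph data structure. The sketch below shows one plausible shape for it, with element nodes, edges carrying relative poses, and relations grouping multiple elements; the field names are assumptions, not the actual schema.

```python
# A minimal, assumption-laden sketch of a relative atlas graph: element nodes,
# pairwise edges carrying relative poses, and relations that group multiple
# elements. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class ElementNode:
    element_id: str
    element_type: str          # e.g. "lane_boundary", "stop_sign"

@dataclass
class Edge:
    source_id: str
    target_id: str
    relative_pose: tuple       # (dx, dy, dtheta) of target in the source frame

@dataclass
class Relation:
    relation_type: str         # e.g. "lane_contains"
    element_ids: list[str]

@dataclass
class RelativeAtlasGraph:
    nodes: dict[str, ElementNode] = field(default_factory=dict)
    edges: list[Edge] = field(default_factory=list)
    relations: list[Relation] = field(default_factory=list)

    def add_edge(self, source_id, target_id, relative_pose):
        # Only relative poses are stored; no global coordinates are kept.
        self.edges.append(Edge(source_id, target_id, relative_pose))
```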
  • Publication number: 20220172371
    Abstract: A relative atlas graph is generated to store mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes.
    Type: Application
    Filed: February 17, 2022
    Publication date: June 2, 2022
    Inventors: Ethan Eade, Michael Bode
  • Publication number: 20220107641
    Abstract: Sensor data collected from an autonomous vehicle can be labeled using sensor data collected from an additional vehicle. The additional vehicle can include a non-autonomous vehicle mounted with a removable hardware pod. In many implementations, removable hardware pods can be vehicle agnostic. In many implementations, generated labels can be utilized to train a machine learning model which can generate one or more control signals for the autonomous vehicle.
    Type: Application
    Filed: December 16, 2021
    Publication date: April 7, 2022
    Inventors: Jean-Sebastien Valois, Ethan Eade
  • Patent number: 11257218
    Abstract: A relative atlas graph is generated to store mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes.
    Type: Grant
    Filed: December 6, 2019
    Date of Patent: February 22, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode
  • Patent number: 11256729
    Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: February 22, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
  • Patent number: 11256730
    Abstract: A relative atlas may be used to lay out elements in a digital map used in the control of an autonomous vehicle. A vehicle pose for the autonomous vehicle within a geographical area may be determined, and the relative atlas may be accessed to identify elements in the geographical area and to determine relative poses between those elements. The elements may then be laid out within the digital map using the determined relative poses, e.g., for use in planning vehicle trajectories, for estimating the states of traffic controls, or for tracking and/or identifying dynamic objects, among other purposes.
    Type: Grant
    Filed: December 6, 2019
    Date of Patent: February 22, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
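Laying out elements as the entry above describes amounts to composing relative poses outward from the vehicle pose. The sketch below does this with a breadth-first traversal over assumed graph structures, using SE(2) poses (x, y, theta); it is illustrative only, not the claimed method.

```python
# Hedged sketch, under assumed names, of laying elements out in a digital map:
# starting from the vehicle pose, relative poses along graph edges are
# composed to give each element a pose in the map frame.
from collections import deque
from math import cos, sin

def compose(a, b):
    """Pose b (expressed in frame a) re-expressed in a's parent frame."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * cos(at) - by * sin(at),
            ay + bx * sin(at) + by * cos(at),
            at + bt)

def lay_out(vehicle_pose, root_element, relative_pose_to_root, edges):
    """edges: {element_id: [(neighbor_id, pose_of_neighbor_in_element), ...]}"""
    laid_out = {root_element: compose(vehicle_pose, relative_pose_to_root)}
    queue = deque([root_element])
    while queue:
        current = queue.popleft()
        for neighbor, rel in edges.get(current, []):
            if neighbor not in laid_out:
                laid_out[neighbor] = compose(laid_out[current], rel)
                queue.append(neighbor)
    return laid_out
```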
  • Patent number: 11209821
    Abstract: Sensor data collected from an autonomous vehicle can be labeled using sensor data collected from an additional vehicle. The additional vehicle can include a non-autonomous vehicle mounted with a removable hardware pod. In many implementations, removable hardware pods can be vehicle agnostic. In many implementations, generated labels can be utilized to train a machine learning model which can generate one or more control signals for the autonomous vehicle.
    Type: Grant
    Filed: February 8, 2019
    Date of Patent: December 28, 2021
    Assignee: Aurora Operations, Inc.
    Inventors: Jean-Sebastien Valois, Ethan Eade
  • Publication number: 20210221003
    Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes a body and an actuator system to move the body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
    Type: Application
    Filed: April 6, 2021
    Publication date: July 22, 2021
    Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
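One way to picture the estimator in the entry above: wheel odometry (the actuation characteristic) predicts a heading change, a gyroscope (the motion characteristic) measures the actual change, and the persistent discrepancy per unit distance is attributed to carpet drift. The filter below is a deliberately simple stand-in with assumed names and gain, not the patented estimator.

```python
# Rough, hypothetical sketch of carpet-drift estimation: the heading residual
# between odometry and gyro, per unit distance travelled, is blended into a
# running drift estimate with a simple exponential filter.
def update_carpet_drift(drift_estimate,
                        odom_heading_change,
                        gyro_heading_change,
                        distance_travelled,
                        gain=0.1):
    """Return the updated drift estimate (radians of heading per metre)."""
    if distance_travelled <= 0.0:
        return drift_estimate
    residual = (gyro_heading_change - odom_heading_change) / distance_travelled
    return (1.0 - gain) * drift_estimate + gain * residual
```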
  • Publication number: 20210146932
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: December 28, 2020
    Publication date: May 20, 2021
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
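Because phase-coherent (e.g. FMCW) LIDAR returns carry per-point radial velocity, the yaw rate of an observed rigid vehicle can in principle be recovered by fitting a rigid-body velocity model to those returns. The least-squares sketch below illustrates that idea under assumed inputs; it is not the patented method.

```python
# Hedged sketch: fit a 2-D rigid-body model v_i = v + omega x r_i to per-point
# radial velocities by least squares to recover (vx, vy, yaw_rate).
import numpy as np

def fit_velocity_and_yaw_rate(points, radial_velocities, center):
    """points: (N, 2) LIDAR hits on the other vehicle (sensor frame);
    radial_velocities: (N,) measured range rates; center: (2,) estimated
    vehicle center. Returns (vx, vy, yaw_rate)."""
    points = np.asarray(points, dtype=float)
    center = np.asarray(center, dtype=float)
    v_r = np.asarray(radial_velocities, dtype=float)
    units = points / np.linalg.norm(points, axis=1, keepdims=True)  # lines of sight
    r = points - center                                             # lever arms
    # radial velocity is linear in (vx, vy, omega) for a rigid body
    A = np.column_stack([units[:, 0],
                         units[:, 1],
                         units[:, 1] * r[:, 0] - units[:, 0] * r[:, 1]])
    (vx, vy, omega), *_ = np.linalg.lstsq(A, v_r, rcond=None)
    return vx, vy, omega
```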
  • Patent number: 10974391
    Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes a body and an actuator system to move the body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
    Type: Grant
    Filed: April 10, 2018
    Date of Patent: April 13, 2021
    Assignee: iRobot Corporation
    Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
  • Patent number: 10976410
    Abstract: A method includes obtaining a first track associated with a first time. First predicted state data associated with a second time that is later than the first time are generated based on the first track. Radar measurement data associated with the second time are obtained from one or more radar sensors. Track data are generated by a machine learning model based on the first predicted state data and the radar measurement data. Second predicted state data associated with the second time are generated based on the first track. A second track associated with the second time is generated based on the track data and the second predicted state data. The second track associated with the second time is provided to an autonomous vehicle control system for autonomous control of a vehicle.
    Type: Grant
    Filed: June 26, 2020
    Date of Patent: April 13, 2021
    Assignee: Aurora Innovation, Inc.
    Inventors: Shaogang Wang, Ethan Eade, Warren Smith
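The tracking loop in the entry above can be summarized as predict, apply the learned model, and combine. The sketch below shows that flow with a constant-velocity prediction and the machine learning model passed in as a callable; the track representation and the combination rule are simplifying assumptions, not the actual system.

```python
# Simplified, assumption-heavy sketch of the tracking loop: propagate the
# prior track to the radar measurement time, let a learned model turn the
# prediction plus raw radar returns into track data, then form the new track
# that is handed to the vehicle control system.
def advance_track(track, radar_measurements, dt, model):
    """track: dict with 'position' (x, y) and 'velocity' (vx, vy)."""
    # first predicted state at the measurement time (constant-velocity model)
    predicted = {
        "position": (track["position"][0] + track["velocity"][0] * dt,
                     track["position"][1] + track["velocity"][1] * dt),
        "velocity": track["velocity"],
    }
    # the learned model produces track data from the prediction and raw radar
    track_data = model(predicted, radar_measurements)
    # combine the model output with the predicted state to form the new track
    return {
        "position": track_data.get("position", predicted["position"]),
        "velocity": track_data.get("velocity", predicted["velocity"]),
    }
```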
  • Patent number: 10962376
    Abstract: A system and method for mapping parameter data acquired by a robot mapping system are disclosed. Parameter data characterizing the environment is collected while the robot localizes itself within the environment using landmarks. Parameter data is recorded in a plurality of local grids, i.e., sub-maps associated with the robot position and orientation when the data was collected. The robot is configured to generate new grids or reuse existing grids depending on the robot's current pose, the pose associated with other grids, and the uncertainty of these relative pose estimates. The pose estimates associated with the grids are updated over time as the robot refines its estimates of the locations of landmarks from which it determines its pose in the environment. Occupancy maps or other global parameter maps may be generated by rendering local grids into a comprehensive map indicating the parameter data in a global reference frame spanning the dimensions of the environment.
    Type: Grant
    Filed: March 14, 2018
    Date of Patent: March 30, 2021
    Assignee: iRobot Corporation
    Inventors: Philip Fong, Ethan Eade, Mario E. Munich
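Rendering pose-tagged local grids into a global parameter map, as the entry above describes, can be sketched as transforming each local cell into the global frame and writing it into a shared map. The cell representation and the overwrite rule below are simplifying assumptions standing in for a real fusion rule.

```python
# Hedged sketch, with assumed names, of rendering pose-tagged local grids
# (sub-maps) into one global occupancy map.
from math import cos, sin

def render_global_map(local_grids, cell_size=0.05):
    """local_grids: [(pose, cells)] where pose = (x, y, theta) of the grid
    origin in the global frame and cells = {(i, j): occupancy_value}.

    Returns {(gi, gj): occupancy_value} keyed by global cell indices; later
    grids overwrite earlier ones where they overlap.
    """
    global_map = {}
    for (x, y, theta), cells in local_grids:
        c, s = cos(theta), sin(theta)
        for (i, j), value in cells.items():
            # transform the local cell centre into the global frame
            lx, ly = i * cell_size, j * cell_size
            gx = x + lx * c - ly * s
            gy = y + lx * s + ly * c
            global_map[(round(gx / cell_size), round(gy / cell_size))] = value
    return global_map
```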
  • Patent number: 10906536
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: February 2, 2021
    Assignee: Aurora Innovation, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20200391736
    Abstract: Various implementations described herein generate training instances that each include training instance input based on sensor data of a corresponding autonomous vehicle and training instance output based on sensor data of a corresponding additional vehicle, where the additional vehicle is captured at least in part by the sensor data of the autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can be used to process sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by that sensor data.
    Type: Application
    Filed: May 7, 2020
    Publication date: December 17, 2020
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 10775804
    Abstract: A downwardly-directed optical array sensor may be used in an autonomous vehicle to enable a velocity (e.g., an overall velocity having a direction and magnitude, or a velocity in a particular direction, e.g., along a longitudinal or lateral axis of a vehicle) to be determined based upon images of a ground or driving surface captured from multiple downwardly-directed optical sensors having different respective fields of view.
    Type: Grant
    Filed: April 10, 2018
    Date of Patent: September 15, 2020
    Assignee: Aurora Innovation, Inc.
    Inventors: Ethan Eade, Nathaniel Gist, IV, Thomas Pilarski
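The velocity estimate in the entry above follows from a simple pinhole-camera relation: a pixel displacement of the ground between frames, scaled by the sensor's height over its focal length and divided by the frame interval, gives a velocity component along that sensor's axis. The sketch below assumes that model and uses illustrative names only.

```python
# Hypothetical sketch: each downward-facing sensor sees the ground move by
# some pixel displacement between frames, which maps to a metric displacement
# via the sensor's height and focal length; dividing by the frame interval
# gives a velocity component along that sensor's axis.
def ground_velocity(pixel_shift, focal_length_px, height_m, frame_dt_s):
    """Velocity (m/s) along one sensor's imaging axis from a 1-D pixel shift."""
    metres_per_pixel = height_m / focal_length_px   # simple pinhole model
    return pixel_shift * metres_per_pixel / frame_dt_s

# e.g. one sensor aligned with the longitudinal axis and one with the lateral
# axis give the two velocity components the abstract mentions:
# v_long = ground_velocity(shift_long_px, f_px, h, dt)
# v_lat  = ground_velocity(shift_lat_px,  f_px, h, dt)
```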
  • Patent number: 10747223
    Abstract: An autonomous vehicle uses a secondary vehicle control system to supplement a primary vehicle control system to perform a controlled stop if an adverse event is detected in the primary vehicle control system. The secondary vehicle control system may use a redundant lateral velocity determined by a different sensor from that used by the primary vehicle control system to determine lateral velocity for use in controlling the autonomous vehicle to perform the controlled stop.
    Type: Grant
    Filed: April 10, 2018
    Date of Patent: August 18, 2020
    Assignee: Aurora Innovation, Inc.
    Inventors: Ethan Eade, Nathaniel Gist, IV, Thomas Pilarski
  • Patent number: 10732261
    Abstract: A method includes obtaining a first track associated with a first time. First predicted state data associated with a second time that is later than the first time are generated based on the first track. Radar measurement data associated with the second time are obtained from one or more radar sensors. Track data are generated by a machine learning model based on the first predicted state data and the radar measurement data. Second predicted state data associated with the second time are generated based on the first track. A second track associated with the second time is generated based on the track data and the second predicted state data. The second track associated with the second time is provided to an autonomous vehicle control system for autonomous control of a vehicle.
    Type: Grant
    Filed: December 31, 2019
    Date of Patent: August 4, 2020
    Assignee: Aurora Innovation, Inc.
    Inventors: Shaogang Wang, Ethan Eade, Warren Smith