Patents by Inventor James Andrew Bagnell

James Andrew Bagnell has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11964663
    Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: April 11, 2023
    Date of Patent: April 23, 2024
    Assignee: AURORA OPERATIONS, INC.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
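The abstracts in this family hinge on the fact that a phase coherent (e.g., FMCW) LIDAR reports a Doppler, i.e. radial, velocity with each return. As a rough illustration of why that is enough to recover a yaw rate for another vehicle, the following sketch (not from the patent; the function and its inputs are illustrative) solves for planar rigid-body motion, translation plus yaw rate, from a few radial-velocity returns by linear least squares:

```python
import numpy as np

def estimate_planar_motion(offsets, directions, radial_velocities):
    """Least-squares estimate of (vx, vy, yaw_rate) for a rigid body.

    offsets: (N, 2) return positions relative to the body's rotation center.
    directions: (N, 2) unit line-of-sight vectors for each return.
    radial_velocities: (N,) measured Doppler velocities along each direction.
    """
    offsets = np.asarray(offsets, float)
    directions = np.asarray(directions, float)
    # Velocity of a point at offset (rx, ry): (vx - w*ry, vy + w*rx).
    # Projecting onto the line of sight gives one linear equation per return.
    A = np.column_stack([
        directions[:, 0],
        directions[:, 1],
        -offsets[:, 1] * directions[:, 0] + offsets[:, 0] * directions[:, 1],
    ])
    sol, *_ = np.linalg.lstsq(A, np.asarray(radial_velocities, float), rcond=None)
    return sol  # (vx, vy, yaw_rate)
```

Three or more returns with distinct geometry make the system solvable, which is one way a single LIDAR sweep can yield an instantaneous yaw rate.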
  • Patent number: 11952015
    Abstract: Implementations process, using machine learning (ML) layer(s) of ML model(s), actor(s) from a past episode of locomotion of a vehicle and stream(s) in an environment of the vehicle during the past episode to forecast associated trajectories, for the vehicle and for each of the actor(s), with respect to a respective associated stream of the stream(s). Further, implementations process, using a stream connection function, the associated trajectories to forecast a plurality of associated trajectories, for the vehicle and each of the actor(s), with respect to each of the stream(s). Moreover, implementations iterate between using the ML layer(s) and the stream connection function to update the associated trajectories for the vehicle and each of the actor(s). Implementations subsequently use the ML layer(s) in controlling an AV.
    Type: Grant
    Filed: November 9, 2021
    Date of Patent: April 9, 2024
    Assignee: AURORA OPERATIONS, INC.
    Inventors: James Andrew Bagnell, Sanjiban Choudhury, Venkatraman Narayanan, Arun Venkatraman
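The abstract describes alternating between ML layer(s) and a stream connection function to refine trajectories. A minimal sketch of that control flow (the step functions here are hypothetical stand-ins, not the patented models) looks like:

```python
def iterative_forecast(actors, streams, ml_step, connect_step, iterations=3):
    """Alternate a learned per-stream forecaster with a stream connection
    function that propagates each trajectory across streams.

    ml_step(actors, streams, prior) -> dict of actor -> trajectory
    connect_step(trajectories, streams) -> dict of actor -> trajectory
    """
    # Initial forecast: one trajectory per actor on its associated stream.
    trajectories = ml_step(actors, streams, prior=None)
    for _ in range(iterations):
        # Spread each forecast across every stream, then refine with the model.
        trajectories = connect_step(trajectories, streams)
        trajectories = ml_step(actors, streams, prior=trajectories)
    return trajectories
```

The point of the sketch is only the iteration structure: each pass lets the learned component condition on trajectories that the connection function has already made consistent across streams.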
  • Patent number: 11933902
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: December 30, 2022
    Date of Patent: March 19, 2024
    Assignee: AURORA OPERATIONS, INC.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11919529
    Abstract: Techniques are disclosed for evaluating an autonomous vehicle (“AV”) control system by determining deviations between data generated using the AV control system and manual driving data. In many implementations, manual driving data captures action(s) of a vehicle controlled by a manual driver. Additionally or alternatively, multiple AV control systems can be evaluated by comparing deviations for each AV control system, where the deviations are determined using the same set of manual driving data.
    Type: Grant
    Filed: December 29, 2020
    Date of Patent: March 5, 2024
    Assignee: AURORA OPERATIONS, INC.
    Inventors: Arun Venkatraman, James Andrew Bagnell, Haoyang Fan
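The evaluation idea in this abstract, scoring one or more AV control systems by their deviation from the same set of manual driving data, can be illustrated with a simple position-deviation metric (a sketch with illustrative names; the patent's actual deviation measures are not specified here):

```python
import numpy as np

def trajectory_deviation(av_states, manual_states):
    """Mean Euclidean deviation between matched AV and manual-drive states.

    Both arguments are (T, 2) sequences of planar positions sampled at the
    same timestamps.
    """
    av = np.asarray(av_states, float)
    manual = np.asarray(manual_states, float)
    return float(np.linalg.norm(av - manual, axis=1).mean())

def rank_control_systems(rollouts, manual_states):
    """Order candidate control systems by deviation from one shared set of
    manual driving data, smallest deviation first."""
    scores = {name: trajectory_deviation(states, manual_states)
              for name, states in rollouts.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])
```

Because every candidate is scored against the same manual data, the ranking isolates differences between the control systems rather than differences between test episodes.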
  • Publication number: 20240043037
    Abstract: Systems and methods related to controlling an autonomous vehicle (“AV”) are described herein. Implementations can obtain a plurality of instances that each include input and output. The input can include actor(s) from a given time instance of a past episode of locomotion of a vehicle, and stream(s) in an environment of the vehicle during the past episode. The actor(s) may be associated with an object in the environment of the vehicle at the given time instance, and the stream(s) may each represent candidate navigation paths in the environment of the vehicle. The output may include ground truth label(s) (or reference label(s)). Implementations can train a machine learning (“ML”) model based on the plurality of instances, and subsequently use the ML model in controlling the AV. In training the ML model, the actor(s) and stream(s) can be processed in parallel.
    Type: Application
    Filed: December 17, 2021
    Publication date: February 8, 2024
    Inventors: James Andrew Bagnell, Arun Venkatraman, Sanjiban Choudhury, Venkatraman Narayanan
  • Patent number: 11859994
    Abstract: Systems and methods for landmark-based localization of an autonomous vehicle (“AV”) are described herein. Implementations can generate a first predicted location of a landmark based on a pose instance of a pose of the AV and a stored location of the landmark, generate a second predicted location of the landmark relative to the AV based on an instance of LIDAR data, compare the two predicted locations, generate a correction instance based on the comparison, and use the correction instance in generating additional pose instance(s). Systems and methods for validating localization of a vehicle are also described herein. Implementations can obtain driving data from a past episode of locomotion of the vehicle, generate a pose-based predicted location of a landmark in an environment of the vehicle, and compare the pose-based predicted location to a stored location of the landmark in the environment of the vehicle to validate a pose instance of a pose of the vehicle.
    Type: Grant
    Filed: February 18, 2021
    Date of Patent: January 2, 2024
    Assignee: AURORA INNOVATION, INC.
    Inventors: Yekeun Jeong, Ethan Eade, Adam Richard Williams, Abhay Vardhan, Nicholas George Dilip Roy, James Andrew Bagnell
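The core comparison in this abstract, a pose-predicted landmark location versus a LIDAR-observed one, reduces to a frame transform and a residual. A minimal planar sketch (illustrative function and frames; the patent does not prescribe this form):

```python
import numpy as np

def landmark_correction(pose_xy, pose_yaw, landmark_map_xy, landmark_obs_xy):
    """Residual between where the pose estimate says a landmark should
    appear in the vehicle frame and where LIDAR actually observed it.

    pose_xy, pose_yaw: current pose estimate in the map frame.
    landmark_map_xy: stored map-frame landmark position.
    landmark_obs_xy: LIDAR-derived landmark position in the vehicle frame.
    """
    c, s = np.cos(pose_yaw), np.sin(pose_yaw)
    R = np.array([[c, -s], [s, c]])
    # First prediction: transform the stored landmark into the vehicle frame.
    predicted = R.T @ (np.asarray(landmark_map_xy, float)
                       - np.asarray(pose_xy, float))
    # The residual is the "correction instance" ingredient: zero when the
    # pose estimate agrees with the LIDAR observation.
    return np.asarray(landmark_obs_xy, float) - predicted
```

A nonzero residual indicates pose drift; in practice such residuals would feed a filter or optimizer that produces corrected pose instances.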
  • Patent number: 11787439
    Abstract: Example methods for multistage autonomous vehicle motion planning include obtaining sensor data descriptive of an environment of the autonomous vehicle; identifying one or more objects in the environment based on the sensor data; generating a plurality of candidate strategies, wherein each candidate strategy of the plurality of candidate strategies comprises a set of discrete decisions respecting the one or more objects, wherein generating the plurality of candidate strategies includes: determining that at least two strategies satisfy an equivalence criterion, such that the plurality of candidate strategies include at least one candidate strategy corresponding to an equivalence class representative of a plurality of different strategies that are based on different discrete decisions; determining candidate trajectories respectively for the plurality of candidate strategies; and initiating control of the autonomous vehicle based on a selected candidate trajectory.
    Type: Grant
    Filed: November 18, 2022
    Date of Patent: October 17, 2023
    Assignee: AURORA OPERATIONS, INC.
    Inventors: James Andrew Bagnell, Shervin Javdani, Venkatraman Narayanan
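The equivalence-class idea in this abstract, collapsing strategies whose discrete decisions cannot lead to different trajectories, can be shown with a toy enumeration (decision names and the relevance test are illustrative assumptions, not the patent's criterion):

```python
from itertools import product

def candidate_strategies(objects, decisions=("yield", "pass")):
    """Enumerate one discrete decision per object."""
    return [dict(zip(objects, combo))
            for combo in product(decisions, repeat=len(objects))]

def collapse_equivalent(strategies, relevant_objects):
    """Group strategies that agree on every decision that can affect the
    trajectory; keep one representative per equivalence class."""
    classes = {}
    for s in strategies:
        key = tuple(sorted((o, d) for o, d in s.items()
                           if o in relevant_objects))
        classes.setdefault(key, s)  # first member represents the class
    return list(classes.values())
```

Because the number of raw strategies is exponential in the number of objects, collapsing decisions that cannot matter (here, decisions about an irrelevant object) directly reduces how many candidate trajectories must be computed.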
  • Patent number: 11782451
    Abstract: Techniques are disclosed for training one or more cost functions of an autonomous vehicle (“AV”) control system based on differences between data generated using the AV control system and manual driving data. In many implementations, manual driving data captures action(s) of a vehicle controlled by a manual driver. Additionally or alternatively, multiple AV control systems can be evaluated by comparing deviations for each AV control system, where the deviations are determined using the same set of manual driving data.
    Type: Grant
    Filed: December 29, 2020
    Date of Patent: October 10, 2023
    Assignee: AURORA OPERATIONS, INC.
    Inventors: Arun Venkatraman, James Andrew Bagnell
  • Publication number: 20230271615
    Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: April 11, 2023
    Publication date: August 31, 2023
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11654917
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: December 28, 2020
    Date of Patent: May 23, 2023
    Assignee: AURORA OPERATIONS, INC.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20230145236
    Abstract: Implementations process, using machine learning (ML) layer(s) of ML model(s), actor(s) from a past episode of locomotion of a vehicle and stream(s) in an environment of the vehicle during the past episode to forecast associated trajectories, for the vehicle and for each of the actor(s), with respect to a respective associated stream of the stream(s). Further, implementations process, using a stream connection function, the associated trajectories to forecast a plurality of associated trajectories, for the vehicle and each of the actor(s), with respect to each of the stream(s). Moreover, implementations iterate between using the ML layer(s) and the stream connection function to update the associated trajectories for the vehicle and each of the actor(s). Implementations subsequently use the ML layer(s) in controlling an AV.
    Type: Application
    Filed: November 9, 2021
    Publication date: May 11, 2023
    Inventors: James Andrew Bagnell, Sanjiban Choudhury, Venkatraman Narayanan, Arun Venkatraman
  • Publication number: 20230133611
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: December 30, 2022
    Publication date: May 4, 2023
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11550061
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: January 10, 2023
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11526538
    Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
    Type: Grant
    Filed: February 17, 2022
    Date of Patent: December 13, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
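The relative atlas described in this family (and used for layout in patent 11256730 below) stores only relative poses on edges between element nodes, with no global coordinates. A compact sketch of that structure, with a breadth-first layout that composes relative poses outward from a chosen root (class and method names are illustrative):

```python
import math

class RelativeAtlas:
    """Element nodes joined by edges that store only *relative* poses
    (dx, dy, dtheta); no element has an intrinsic global coordinate."""

    def __init__(self):
        self.edges = {}      # node -> list of (neighbor, (dx, dy, dtheta))
        self.relations = []  # logical groupings of element nodes

    def add_edge(self, a, b, rel):
        dx, dy, dth = rel
        self.edges.setdefault(a, []).append((b, rel))
        # Store the inverse transform so the edge can be walked either way.
        c, s = math.cos(dth), math.sin(dth)
        inv = (-(c * dx + s * dy), -(-s * dx + c * dy), -dth)
        self.edges.setdefault(b, []).append((a, inv))

    def layout(self, root):
        """Assign map-frame poses by walking edges outward from `root`,
        composing relative poses breadth-first."""
        poses = {root: (0.0, 0.0, 0.0)}
        queue = [root]
        while queue:
            node = queue.pop(0)
            x, y, th = poses[node]
            for nbr, (dx, dy, dth) in self.edges.get(node, []):
                if nbr in poses:
                    continue
                c, s = math.cos(th), math.sin(th)
                poses[nbr] = (x + c * dx - s * dy,
                              y + s * dx + c * dy,
                              th + dth)
                queue.append(nbr)
        return poses
```

Because only relative poses are stored, local map updates do not force a global re-registration: the layout can be re-rooted at any element near the vehicle.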
  • Patent number: 11358601
    Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
    Type: Grant
    Filed: May 7, 2020
    Date of Patent: June 14, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
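The training-instance construction described above pairs an AV's sensor data (input) with properties of an additional vehicle captured in that data (output). A minimal sketch of that pairing step (the episode keys are hypothetical; the patent does not specify this schema):

```python
def build_training_instances(episodes):
    """Turn logged episodes into supervised (input, output) examples.

    `episodes` is an iterable of dicts with hypothetical keys:
      'av_sensor': sensor data captured by the autonomous vehicle,
      'other_vehicle': ground-truth properties (e.g., pose, velocity) of an
                       additional vehicle captured in that sensor data,
                       or None if no such vehicle was captured.
    """
    instances = []
    for ep in episodes:
        if ep.get("other_vehicle") is None:
            continue  # no additional vehicle captured; nothing to supervise
        instances.append({"input": ep["av_sensor"],
                          "output": ep["other_vehicle"]})
    return instances
```

A model trained on such instances can then map fresh sensor data from a given autonomous vehicle to predicted properties of vehicles it observes.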
  • Publication number: 20220171797
    Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
    Type: Application
    Filed: February 17, 2022
    Publication date: June 2, 2022
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
  • Patent number: 11256730
    Abstract: A relative atlas may be used to lay out elements in a digital map used in the control of an autonomous vehicle. A vehicle pose for the autonomous vehicle within a geographical area may be determined, and the relative atlas may be accessed to identify elements in the geographical area and to determine relative poses between those elements. The elements may then be laid out within the digital map using the determined relative poses, e.g., for use in planning vehicle trajectories, for estimating the states of traffic controls, or for tracking and/or identifying dynamic objects, among other purposes.
    Type: Grant
    Filed: December 6, 2019
    Date of Patent: February 22, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
  • Patent number: 11256729
    Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: February 22, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
  • Patent number: 11134617
    Abstract: A self-guided blossom picker uses a vision system to identify and locate blossoms or inflorescences growing on a plant. The device can be towed by a tractor or it can be self-propelled. Image data captured by the vision system is sent to a machine vision module, which interprets the data and identifies the location of a blossom. A controller uses the location data to command the picker to the proper location. A cutter on the picker is actuated to remove the blossom.
    Type: Grant
    Filed: September 19, 2016
    Date of Patent: October 5, 2021
    Assignee: Carnegie Mellon University
    Inventors: Herman Herman, Christopher Chandler Fromme, Elliot Allen Cuzzillo, Jaime W. Bourne, Richard D. Pantaleo, Neil Frederick Stegall, James Andrew Bagnell, Jeffrey David McMahill, Joan Campoy
  • Publication number: 20210146932
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: December 28, 2020
    Publication date: May 20, 2021
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson