Patents by Inventor James Andrew Bagnell
James Andrew Bagnell has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11964663
Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: April 11, 2023
Date of Patent: April 23, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11952015
Abstract: Implementations process, using machine learning (ML) layer(s) of ML model(s), actor(s) from a past episode of locomotion of a vehicle and stream(s) in an environment of the vehicle during the past episode to forecast associated trajectories, for the vehicle and for each of the actor(s), with respect to a respective associated stream of the stream(s). Further, implementations process, using a stream connection function, the associated trajectories to forecast a plurality of associated trajectories, for the vehicle and each of the actor(s), with respect to each of the stream(s). Moreover, implementations iterate between using the ML layer(s) and the stream connection function to update the associated trajectories for the vehicle and each of the actor(s). Implementations subsequently use the ML layer(s) in controlling an AV.
Type: Grant
Filed: November 9, 2021
Date of Patent: April 9, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: James Andrew Bagnell, Sanjiban Choudhury, Venkatraman Narayanan, Arun Venkatraman
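The iterate-between-steps structure in the abstract above can be pictured in a few lines. Everything here is a hypothetical stand-in: `ml_step` for the ML layers, `connect_step` for the stream connection function, and scalar progress-along-stream values in place of real trajectories.

```python
goals = {"vehicle": 10.0, "actor_1": 4.0}   # hypothetical per-agent goals on their streams

def ml_step(forecasts):
    # Stand-in for the ML layers: refine each forecast halfway toward its goal.
    return {k: v + 0.5 * (goals[k] - v) for k, v in forecasts.items()}

def connect_step(forecasts):
    # Stand-in for the stream connection function: reconcile forecasts across
    # streams by clamping them at a shared merge point.
    merge_point = 8.0
    return {k: min(v, merge_point) for k, v in forecasts.items()}

forecasts = {"vehicle": 0.0, "actor_1": 0.0}
for _ in range(20):                          # iterate between the two steps
    forecasts = connect_step(ml_step(forecasts))
# forecasts settle at the clamp (vehicle) and the unconstrained goal (actor_1)
```

The fixed-point behavior is the point of the sketch: alternating the two updates lets per-stream forecasts and cross-stream constraints settle jointly.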
-
Patent number: 11933902
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: December 30, 2022
Date of Patent: March 19, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11919529
Abstract: Techniques are disclosed for evaluating an autonomous vehicle ("AV") control system by determining deviations between data generated using the AV control system and manual driving data. In many implementations, manual driving data captures action(s) of a vehicle controlled by a manual driver. Additionally or alternatively, multiple AV control systems can be evaluated by comparing deviations for each AV control system, where the deviations are determined using the same set of manual driving data.
Type: Grant
Filed: December 29, 2020
Date of Patent: March 5, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: Arun Venkatraman, James Andrew Bagnell, Haoyang Fan
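The evaluation idea above — score a control system by how far its output strays from logged manual driving — reduces to a distance between trajectories sampled at the same timesteps. A minimal sketch; the trajectories and the mean-Euclidean metric are illustrative assumptions, not the patented method:

```python
import math

def trajectory_deviation(planned, manual):
    """Mean Euclidean deviation between a planned trajectory and a
    logged manual-driving trajectory sampled at the same timesteps."""
    assert len(planned) == len(manual)
    return sum(math.dist(p, m) for p, m in zip(planned, manual)) / len(planned)

# Hypothetical example: two control systems scored against the same manual log.
manual_log = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)]
system_a   = [(0.0, 0.0), (1.0, 0.3), (2.0, 0.1)]
system_b   = [(0.1, 0.0), (1.2, 0.6), (2.0, 0.9)]

scores = {name: trajectory_deviation(traj, manual_log)
          for name, traj in [("A", system_a), ("B", system_b)]}
best = min(scores, key=scores.get)   # lower deviation = closer to manual driving
```

Using one shared manual log for every system under test is what makes the comparison between systems meaningful.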
-
Publication number: 20240043037
Abstract: Systems and methods related to controlling an autonomous vehicle ("AV") are described herein. Implementations can obtain a plurality of instances that each include input and output. The input can include actor(s) from a given time instance of a past episode of locomotion of a vehicle, and stream(s) in an environment of the vehicle during the past episode. The actor(s) may be associated with an object in the environment of the vehicle at the given time instance, and the stream(s) may each represent candidate navigation paths in the environment of the vehicle. The output may include ground truth label(s) (or reference label(s)). Implementations can train a machine learning ("ML") model based on the plurality of instances, and subsequently use the ML model in controlling the AV. In training the ML model, the actor(s) and stream(s) can be processed in parallel.
Type: Application
Filed: December 17, 2021
Publication date: February 8, 2024
Inventors: James Andrew Bagnell, Arun Venkatraman, Sanjiban Choudhury, Venkatraman Narayanan
-
Patent number: 11859994
Abstract: Systems and methods for landmark-based localization of an autonomous vehicle ("AV") are described herein. Implementations can generate a first predicted location of a landmark based on a pose instance of a pose of the AV and a stored location of the landmark, generate a second predicted location of the landmark relative to the AV based on an instance of LIDAR data, compare the two predicted locations, generate a correction instance based on the comparison, and use the correction instance in generating additional pose instance(s). Systems and methods for validating localization of a vehicle are also described herein. Implementations can obtain driving data from a past episode of locomotion of the vehicle, generate a pose-based predicted location of a landmark in an environment of the vehicle, and compare the pose-based predicted location to a stored location of the landmark in the environment of the vehicle to validate a pose instance of a pose of the vehicle.
Type: Grant
Filed: February 18, 2021
Date of Patent: January 2, 2024
Assignee: AURORA INNOVATION, INC.
Inventors: Yekeun Jeong, Ethan Eade, Adam Richard Williams, Abhay Vardhan, Nicholas George Dilip Roy, James Andrew Bagnell
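The predict-compare-correct loop above lends itself to a compact sketch. Assume 2D poses `(x, y, theta)` and point landmarks; the frame conventions and the residual-as-correction are illustrative assumptions, not the patented method:

```python
import numpy as np

def predict_landmark_in_vehicle_frame(pose, landmark_map_xy):
    """First prediction: the stored map location of the landmark,
    transformed into the vehicle frame using the current pose instance."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rot_world_to_vehicle = np.array([[c, s], [-s, c]])
    return rot_world_to_vehicle @ (np.asarray(landmark_map_xy) - np.array([x, y]))

def correction_instance(pose, landmark_map_xy, lidar_obs_xy):
    """Compare the pose-based prediction with the LIDAR-based observation;
    the residual is the correction applied when generating later pose instances."""
    predicted = predict_landmark_in_vehicle_frame(pose, landmark_map_xy)
    return np.asarray(lidar_obs_xy) - predicted
```

A near-zero residual also serves the validation use case: if pose-based predictions keep matching stored landmark locations, the pose instances are consistent with the map.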
-
Patent number: 11787439
Abstract: Example methods for multistage autonomous vehicle motion planning include obtaining sensor data descriptive of an environment of the autonomous vehicle; identifying one or more objects in the environment based on the sensor data; generating a plurality of candidate strategies, wherein each candidate strategy of the plurality of candidate strategies comprises a set of discrete decisions respecting the one or more objects, wherein generating the plurality of candidate strategies includes: determining that at least two strategies satisfy an equivalence criterion, such that the plurality of candidate strategies include at least one candidate strategy corresponding to an equivalence class representative of a plurality of different strategies that are based on different discrete decisions; determining candidate trajectories respectively for the plurality of candidate strategies; and initiating control of the autonomous vehicle based on a selected candidate trajectory.
Type: Grant
Filed: November 18, 2022
Date of Patent: October 17, 2023
Assignee: AURORA OPERATIONS, INC.
Inventors: James Andrew Bagnell, Shervin Javdani, Venkatraman Narayanan
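One way to picture the equivalence-class step: enumerate every combination of discrete decisions, then collapse strategies whose differences cannot change the resulting trajectory, keeping one representative per class. The objects, decisions, and "only some objects are trajectory-relevant" criterion below are illustrative assumptions, not the patented criterion:

```python
from itertools import product

objects = ["cyclist", "parked_car"]              # hypothetical nearby objects
decisions = ["pass_left", "pass_right", "yield"]

# Every candidate strategy is one discrete decision per object.
strategies = list(product(decisions, repeat=len(objects)))

# Hypothetical equivalence criterion: only some objects actually constrain
# the drivable corridor, so strategies that agree on those are equivalent.
relevant = {"cyclist"}

def equivalence_key(strategy):
    return tuple(d for obj, d in zip(objects, strategy) if obj in relevant)

classes = {}
for s in strategies:
    classes.setdefault(equivalence_key(s), s)    # keep first seen as representative

representatives = list(classes.values())
# trajectory optimization then runs once per class, not once per raw strategy
```

The payoff is combinatorial: here 9 raw strategies collapse to 3 classes, and the expensive trajectory stage runs only on the representatives.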
-
Patent number: 11782451
Abstract: Techniques are disclosed for training one or more cost functions of an autonomous vehicle ("AV") control system based on differences between data generated using the AV control system and manual driving data. In many implementations, manual driving data captures action(s) of a vehicle controlled by a manual driver. Additionally or alternatively, multiple AV control systems can be evaluated by comparing deviations for each AV control system, where the deviations are determined using the same set of manual driving data.
Type: Grant
Filed: December 29, 2020
Date of Patent: October 10, 2023
Assignee: AURORA OPERATIONS, INC.
Inventors: Arun Venkatraman, James Andrew Bagnell
-
Publication number: 20230271615
Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: April 11, 2023
Publication date: August 31, 2023
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11654917
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: December 28, 2020
Date of Patent: May 23, 2023
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20230145236
Abstract: Implementations process, using machine learning (ML) layer(s) of ML model(s), actor(s) from a past episode of locomotion of a vehicle and stream(s) in an environment of the vehicle during the past episode to forecast associated trajectories, for the vehicle and for each of the actor(s), with respect to a respective associated stream of the stream(s). Further, implementations process, using a stream connection function, the associated trajectories to forecast a plurality of associated trajectories, for the vehicle and each of the actor(s), with respect to each of the stream(s). Moreover, implementations iterate between using the ML layer(s) and the stream connection function to update the associated trajectories for the vehicle and each of the actor(s). Implementations subsequently use the ML layer(s) in controlling an AV.
Type: Application
Filed: November 9, 2021
Publication date: May 11, 2023
Inventors: James Andrew Bagnell, Sanjiban Choudhury, Venkatraman Narayanan, Arun Venkatraman
-
Publication number: 20230133611
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: December 30, 2022
Publication date: May 4, 2023
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11550061
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: October 29, 2018
Date of Patent: January 10, 2023
Assignee: Aurora Operations, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11526538
Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
Type: Grant
Filed: February 17, 2022
Date of Patent: December 13, 2022
Assignee: Aurora Operations, Inc.
Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
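The relative atlas idea — store only relative poses on edges and compose them on demand, never committing to a single global frame — can be sketched with SE(2) poses. The node ids, `(dx, dy, dtheta)` parameterization, and path-based layout are illustrative assumptions, not the patented data structure:

```python
import numpy as np

class RelativeAtlas:
    """Minimal sketch: element nodes connected by edges that each store a
    relative pose (dx, dy, dtheta); coordinates are composed on demand."""

    def __init__(self):
        self.edges = {}  # (a, b) -> relative pose of b expressed in a's frame

    def add_edge(self, a, b, rel):
        rel = np.asarray(rel, dtype=float)
        self.edges[(a, b)] = rel
        self.edges[(b, a)] = self._invert(rel)  # store the reverse direction too

    @staticmethod
    def _compose(p, q):
        x, y, t = p
        c, s = np.cos(t), np.sin(t)
        return np.array([x + c * q[0] - s * q[1], y + s * q[0] + c * q[1], t + q[2]])

    @staticmethod
    def _invert(p):
        x, y, t = p
        c, s = np.cos(t), np.sin(t)
        return np.array([-(c * x + s * y), -(-s * x + c * y), -t])

    def layout(self, root, path):
        """Lay out elements along a path of node ids; `root`'s frame serves
        as a temporary origin, so no global coordinates are ever stored."""
        pose = np.zeros(3)
        poses = {root: pose.copy()}
        prev = root
        for node in path:
            pose = self._compose(pose, self.edges[(prev, node)])
            poses[node] = pose.copy()
            prev = node
        return poses
```

Because only relative poses are stored, local map corrections stay local: updating one edge never forces re-anchoring every element in a global frame.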
-
Patent number: 11358601
Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
Type: Grant
Filed: May 7, 2020
Date of Patent: June 14, 2022
Assignee: Aurora Operations, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20220171797
Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
Type: Application
Filed: February 17, 2022
Publication date: June 2, 2022
Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
-
Patent number: 11256730
Abstract: A relative atlas may be used to lay out elements in a digital map used in the control of an autonomous vehicle. A vehicle pose for the autonomous vehicle within a geographical area may be determined, and the relative atlas may be accessed to identify elements in the geographical area and to determine relative poses between those elements. The elements may then be laid out within the digital map using the determined relative poses, e.g., for use in planning vehicle trajectories, for estimating the states of traffic controls, or for tracking and/or identifying dynamic objects, among other purposes.
Type: Grant
Filed: December 6, 2019
Date of Patent: February 22, 2022
Assignee: Aurora Operations, Inc.
Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
-
Patent number: 11256729
Abstract: A relative atlas graph maintains mapping data used by an autonomous vehicle. The relative atlas graph may be generated for a geographical area based on observations collected from the geographical area, and may include element nodes corresponding to elements detected from the observations along with edges that connect pairs of element nodes and define relative poses between the elements for connected pairs of element nodes, as well as relations that connect multiple element nodes to define logical relationships therebetween.
Type: Grant
Filed: September 27, 2019
Date of Patent: February 22, 2022
Assignee: Aurora Operations, Inc.
Inventors: Ethan Eade, Michael Bode, James Andrew Bagnell
-
Patent number: 11134617
Abstract: A self-guided blossom picker uses a vision system to identify and locate blossoms or inflorescence growing on a plant. The device can be towed by a tractor or it can be self-propelled. Image data captured by the vision system is sent to a machine vision module, which interprets the data and identifies the location of a blossom. A controller uses the location data to command a picker to the proper location. A cutter on the picker is actuated to remove the blossom.
Type: Grant
Filed: September 19, 2016
Date of Patent: October 5, 2021
Assignee: Carnegie Mellon University
Inventors: Herman Herman, Christopher Chandler Fromme, Elliot Allen Cuzzillo, Jaime W. Bourne, Richard D. Pantaleo, Neil Frederick Stegall, James Andrew Bagnell, Jeffrey David McMahill, Joan Campoy
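The abstract above describes a straightforward sense-plan-act loop: detect blossoms, command the picker, actuate the cutter. A toy sketch with hypothetical stand-ins for the vision module, image frames, and hardware callbacks (none of this is the university's implementation):

```python
def detect_blossoms(frame):
    """Stand-in for the machine vision module: return (x, y) locations of
    blossoms found in an image frame, here a list of labeled points."""
    return [(px, py) for px, py, label in frame if label == "blossom"]

def pick_cycle(frame, move_picker, actuate_cutter):
    """One sense-plan-act cycle: for each detected blossom, the controller
    commands the picker to its location, then the cutter removes it."""
    picked = []
    for loc in detect_blossoms(frame):
        move_picker(loc)      # controller drives the picker to the location
        actuate_cutter()      # cutter removes the blossom
        picked.append(loc)
    return picked
```

Keeping the vision, motion, and cutting stages behind separate interfaces is what lets the same loop serve both the towed and self-propelled configurations.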
-
Publication number: 20210146932
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: December 28, 2020
Publication date: May 20, 2021
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson