Patents by Inventor Bartholomeus C. Nabbe
Bartholomeus C. Nabbe has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11964663
Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: April 11, 2023
Date of Patent: April 23, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11933902
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: December 30, 2022
Date of Patent: March 19, 2024
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
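Because each phase coherent LIDAR return carries its own velocity, classification can exploit motion cues from a single sweep. The toy classifier below makes that concrete for one clustered group of returns; the thresholds, feature choices, and class names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def classify_cluster(points, radial_velocities,
                     moving_speed_thresh=1.0, vehicle_extent_thresh=2.0):
    """Toy classifier for one cluster of FMCW LIDAR returns.

    Each return carries an instantaneous radial velocity, so a single sweep
    can distinguish moving from static objects without frame differencing.
    Thresholds (m/s, m) are illustrative only.
    """
    pts = np.asarray(points, float)
    extent = np.ptp(pts, axis=0).max()        # largest bounding-box side length
    speed = np.abs(radial_velocities).mean()  # mean absolute radial speed
    if speed < moving_speed_thresh:
        return "static"
    return "vehicle" if extent > vehicle_extent_thresh else "pedestrian"
```

A production system would use learned models over much richer features; the point here is only that per-return velocity turns a geometric clustering problem into a motion-aware one.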
-
Patent number: 11858459
Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system in response to manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
Type: Grant
Filed: May 17, 2019
Date of Patent: January 2, 2024
Assignee: Apple Inc.
Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
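The engagement step described above (processing vehicle sensor data to detect a health emergency and then handing control to a remote operator) can be sketched as a simple decision function. The field names and thresholds below are hypothetical stand-ins for whatever occupant-monitoring signals a real system would process; this is not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class OccupantVitals:
    """Hypothetical occupant-monitoring signals derived from cabin sensors."""
    heart_rate: float   # beats per minute
    responsive: bool    # e.g., from cabin-camera attention/response checks

def should_engage_remote_control(vitals: OccupantVitals,
                                 hr_low: float = 40.0,
                                 hr_high: float = 160.0) -> bool:
    """Engage remote driving control when processed sensor data suggests
    a health emergency. Thresholds are illustrative assumptions."""
    return (not vitals.responsive
            or vitals.heart_rate < hr_low
            or vitals.heart_rate > hr_high)
```

When this returns True, the system described in the abstract would route control to the remote operator and activate external indicators that the vehicle is under remote driving control.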
-
Publication number: 20230356692
Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system in response to manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
Type: Application
Filed: July 10, 2023
Publication date: November 9, 2023
Applicant: Apple Inc.
Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
-
Publication number: 20230271615
Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: April 11, 2023
Publication date: August 31, 2023
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11654917
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: December 28, 2020
Date of Patent: May 23, 2023
Assignee: AURORA OPERATIONS, INC.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20230133611
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: December 30, 2022
Publication date: May 4, 2023
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11550061
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: October 29, 2018
Date of Patent: January 10, 2023
Assignee: Aurora Operations, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 11358601
Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
Type: Grant
Filed: May 7, 2020
Date of Patent: June 14, 2022
Assignee: Aurora Operations, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
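The training-instance construction this abstract describes — one vehicle's sensor data as the input, another observed vehicle's own sensor data as the supervision target — can be sketched as a timestamp-association pass over two logs. Everything below (the `LogFrame` fields, the 50 ms window, nearest-in-time matching) is an illustrative assumption; a real pipeline would also verify that the additional vehicle actually appears in the autonomous vehicle's sensor data.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LogFrame:
    """One timestamped frame from a vehicle's log (hypothetical schema)."""
    timestamp: float
    lidar_points: list   # sensor data recorded by this vehicle
    pose: tuple          # (x, y, heading) reported by the vehicle itself

def build_training_instances(av_log: List[LogFrame],
                             other_log: List[LogFrame],
                             max_dt: float = 0.05) -> List[Tuple[list, tuple]]:
    """Pair each autonomous-vehicle frame with the nearest-in-time frame from
    the other vehicle's own log: the AV's sensor data becomes the training
    input, the other vehicle's self-reported pose becomes the target output."""
    instances = []
    for frame in av_log:
        nearest = min(other_log, key=lambda f: abs(f.timestamp - frame.timestamp))
        if abs(nearest.timestamp - frame.timestamp) <= max_dt:
            instances.append((frame.lidar_points, nearest.pose))
    return instances
```

The appeal of this setup is that the labels come from the observed vehicle's own instrumentation rather than from human annotation.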
-
Publication number: 20210146932
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: December 28, 2020
Publication date: May 20, 2021
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 10906536
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Grant
Filed: October 29, 2018
Date of Patent: February 2, 2021
Assignee: Aurora Innovation, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20200391736
Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
Type: Application
Filed: May 7, 2020
Publication date: December 17, 2020
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Patent number: 10739441
Abstract: In some examples, a system comprises a laser light source and a rotatable mirror assembly comprising a plurality of mirror segments, the rotatable mirror assembly aligned to reflect light transmitted by the laser light source, wherein the plurality of mirror segments comprise a first mirror segment that reflects a first light beam from the laser light source in a first direction, and a second mirror segment that reflects the first light beam from the laser light source in a second direction, different from the first direction. In some examples, the system comprises a light sensor positioned to receive light reflected from the rotatable mirror assembly. In some examples, the system comprises a motor for rotating the mirror assembly about a rotation axis. In some examples, the system comprises a controller for controlling a sampling phase of sampling the light sensor.
Type: Grant
Filed: September 29, 2017
Date of Patent: August 11, 2020
Assignee: FARADAY & FUTURE INC.
Inventor: Bartholomeus C. Nabbe
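The geometric idea behind differently oriented mirror segments is standard vector reflection: a fixed laser beam striking segments with different tilts leaves in different directions, so one rotating assembly scans multiple elevation angles. The sketch below illustrates that with the usual reflection formula r = d - 2(d·n)n; the specific tilt angles are illustrative, not from the patent.

```python
import numpy as np

def reflect(direction, normal):
    """Reflect a unit direction vector off a mirror with unit normal n:
    r = d - 2 (d . n) n."""
    d = np.asarray(direction, float)
    n = np.asarray(normal, float)
    return d - 2.0 * np.dot(d, n) * n

# A fixed beam traveling along +x hits successive mirror segments as the
# assembly rotates; segments tilted by different angles about the y axis
# redirect the same beam into different output directions.
beam = np.array([1.0, 0.0, 0.0])
for tilt_deg in (40.0, 45.0, 50.0):   # per-segment tilt angles (illustrative)
    t = np.radians(tilt_deg)
    normal = np.array([-np.sin(t), 0.0, np.cos(t)])
    out = reflect(beam, normal)       # output angle from +x is 2 * tilt
```

For a tilt of exactly 45 degrees the beam is turned a full 90 degrees (here, straight up along +z), the familiar folding-mirror case; each additional degree of segment tilt steers the output by two degrees.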
-
Patent number: 10676085
Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
Type: Grant
Filed: October 29, 2018
Date of Patent: June 9, 2020
Assignee: Aurora Innovation, Inc.
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20190317219
Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s), for the object(s), determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: October 29, 2018
Publication date: October 17, 2019
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20190318206
Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
Type: Application
Filed: October 29, 2018
Publication date: October 17, 2019
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20190315351
Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
Type: Application
Filed: October 29, 2018
Publication date: October 17, 2019
Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
-
Publication number: 20190195992
Abstract: In some examples, a system comprises a laser light source and a rotatable mirror assembly comprising a plurality of mirror segments, the rotatable mirror assembly aligned to reflect light transmitted by the laser light source, wherein the plurality of mirror segments comprise a first mirror segment that reflects a first light beam from the laser light source in a first direction, and a second mirror segment that reflects the first light beam from the laser light source in a second direction, different from the first direction. In some examples, the system comprises a light sensor positioned to receive light reflected from the rotatable mirror assembly. In some examples, the system comprises a motor for rotating the mirror assembly about a rotation axis. In some examples, the system comprises a controller for controlling a sampling phase of sampling the light sensor.
Type: Application
Filed: September 29, 2017
Publication date: June 27, 2019
Inventor: Bartholomeus C. Nabbe
-
Patent number: 10328897
Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system in response to manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
Type: Grant
Filed: September 23, 2016
Date of Patent: June 25, 2019
Assignee: Apple Inc.
Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
-
Patent number: 10053001
Abstract: Aspects of the present disclosure involve systems, methods, computer program products, and the like, for displaying an operational status of a system. In one particular implementation, the operational status of an autonomous vehicle is displayed. The operational status of the vehicle may indicate an operational state of the vehicle, such as whether the vehicle is in manual control mode or autonomous control mode. In addition to displaying the operational state, the vehicle may also display an intended or future maneuver by the vehicle. For example, the vehicle may determine a route for the autonomous vehicle that includes various operations or steps and may display one or more of the operations or steps of the route. This information may be displayed such that an observer external to the vehicle may determine the near-future operation the autonomous vehicle is about to perform. Other information concerning the operation of the vehicle may also be displayed.
Type: Grant
Filed: September 15, 2016
Date of Patent: August 21, 2018
Assignee: Apple Inc.
Inventors: Bartholomeus C. Nabbe, Byron B. Han