Patents by Inventor Bartholomeus C. Nabbe

Bartholomeus C. Nabbe has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11964663
    Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: April 11, 2023
    Date of Patent: April 23, 2024
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11933902
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s) for the object(s) determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: December 30, 2022
    Date of Patent: March 19, 2024
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
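The classification abstract above describes adapting steering, acceleration, and deceleration based on object classes. As a minimal sketch only (not the patented method; the class labels, safety margin, and braking law here are all hypothetical), one way classification can feed control is to translate each classified object's range and closing speed into a braking demand, weighting vulnerable classes more conservatively:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str            # e.g. "pedestrian", "vehicle", "static"
    distance_m: float     # range from the ego vehicle
    closing_speed: float  # m/s, positive when the gap is shrinking

def plan_deceleration(objects, safety_margin_m=5.0, max_brake=8.0):
    """Return the strongest braking demand (m/s^2) implied by any
    classified object; 0.0 means no braking is needed."""
    demand = 0.0
    for obj in objects:
        gap = obj.distance_m - safety_margin_m
        if gap <= 0:
            return max_brake                    # emergency braking
        if obj.closing_speed > 0:
            # constant deceleration to stop within the gap: a = v^2 / (2d)
            needed = obj.closing_speed ** 2 / (2 * gap)
            if obj.label == "pedestrian":       # extra caution for VRUs
                needed *= 1.5
            demand = max(demand, needed)
    return min(demand, max_brake)
```

For example, a vehicle 25 m ahead closing at 10 m/s yields a 2.5 m/s² demand with the default 5 m margin.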
  • Patent number: 11858459
    Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system based on manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
    Type: Grant
    Filed: May 17, 2019
    Date of Patent: January 2, 2024
    Assignee: Apple Inc.
    Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
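The remote-navigation abstract above describes engaging remote driving control when vehicle sensor data indicates an occupant health emergency, and activating an external indication. As a toy sketch only (the trigger signals, thresholds, and class names are hypothetical, not from the patent), the engagement logic can be viewed as a small state machine:

```python
from enum import Enum, auto

class DriveMode(Enum):
    OCCUPANT = auto()   # occupant is driving
    REMOTE = auto()     # remote operator is driving

class RemoteAssistController:
    """Toy state machine: engage remote driving when occupant-monitoring
    signals suggest a health emergency, and turn on an external indicator
    so other road users know the vehicle is under remote control."""

    def __init__(self, heart_rate_limits=(40, 150)):
        self.mode = DriveMode.OCCUPANT
        self.external_indicator_on = False
        self.lo, self.hi = heart_rate_limits

    def update(self, heart_rate_bpm, driver_responsive):
        emergency = (not driver_responsive
                     or not (self.lo <= heart_rate_bpm <= self.hi))
        if emergency and self.mode is DriveMode.OCCUPANT:
            self.mode = DriveMode.REMOTE
            self.external_indicator_on = True
        return self.mode
```

A production system would fuse many sensor channels and require operator confirmation before handover; this only illustrates the mode transition described in the abstract.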
  • Publication number: 20230356692
    Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system based on manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
    Type: Application
    Filed: July 10, 2023
    Publication date: November 9, 2023
    Applicant: Apple Inc.
    Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
  • Publication number: 20230271615
    Abstract: Determining an instantaneous vehicle characteristic (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined instantaneous vehicle characteristic of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined instantaneous vehicle characteristic of the additional vehicle. In many implementations, the instantaneous vehicle characteristics of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: April 11, 2023
    Publication date: August 31, 2023
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11654917
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: December 28, 2020
    Date of Patent: May 23, 2023
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20230133611
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s) for the object(s) determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: December 30, 2022
    Publication date: May 4, 2023
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11550061
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s) for the object(s) determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: January 10, 2023
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 11358601
    Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
    Type: Grant
    Filed: May 7, 2020
    Date of Patent: June 14, 2022
    Assignee: Aurora Operations, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
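The training-instance abstract above describes pairing an autonomous vehicle's sensor data (the training input) with sensor data from an additional, observed vehicle (the training output). As an illustrative sketch only (the log format, time-matching tolerance, and function name are assumptions, not the patent's procedure), the core idea is a time-aligned join between the ego vehicle's frames and the other vehicle's self-reported state:

```python
def build_training_instances(ego_logs, other_logs, max_dt=0.05):
    """Pair each ego-vehicle sensor frame with the state another
    (sensor-equipped) vehicle reported about itself at nearly the same
    time. The ego frame becomes the training input; the other vehicle's
    self-reported state (e.g. its own velocity) becomes the target label.

    ego_logs:   list of (timestamp, sensor_frame)
    other_logs: list of (timestamp, self_reported_state)
    max_dt:     maximum timestamp mismatch, in seconds
    """
    instances = []
    for t_ego, frame in ego_logs:
        # nearest-in-time self-report from the other vehicle
        t_other, state = min(other_logs, key=lambda rec: abs(rec[0] - t_ego))
        if abs(t_other - t_ego) <= max_dt:
            instances.append((frame, state))
    return instances
```

A model trained on such pairs can then predict properties of vehicles that are not instrumented, using only the ego vehicle's sensors.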
  • Publication number: 20210146932
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: December 28, 2020
    Publication date: May 20, 2021
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 10906536
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: February 2, 2021
    Assignee: Aurora Innovation, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20200391736
    Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
    Type: Application
    Filed: May 7, 2020
    Publication date: December 17, 2020
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Patent number: 10739441
    Abstract: In some examples, a system comprises a laser light source and a rotatable mirror assembly comprising a plurality of mirror segments, the rotatable mirror assembly aligned to reflect light transmitted by the laser light source, wherein the plurality of mirror segments comprise a first segment that reflects a first light beam from the laser light source in a first direction, and a second mirror segment that reflects the first light beam from the laser light source in a second direction, different from the first direction. In some examples, the system comprises a light sensor positioned to receive light reflected from the rotatable mirror assembly. In some examples, the system comprises a motor for rotating the mirror assembly about a rotation axis. In some examples, the system comprises a controller for controlling a sampling phase of sampling the light sensor.
    Type: Grant
    Filed: September 29, 2017
    Date of Patent: August 11, 2020
    Assignee: Faraday & Future Inc.
    Inventor: Bartholomeus C. Nabbe
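The mirror-assembly abstract above describes facets that reflect the same laser beam in different directions as the assembly rotates. As a geometry sketch only (facet counts, tilt values, and the function name are illustrative, not from the patent), a polygonal mirror whose facets carry different tilts sweeps the beam in azimuth within each facet while stepping it in elevation between facets, so one laser covers multiple scan lines per revolution:

```python
def facet_and_elevation(rotation_deg, facet_tilts_deg):
    """For a rotating polygonal mirror whose facets have different tilt
    angles, return (facet_index, beam_elevation_deg) at a given rotation.

    Each facet spans an equal slice of the full rotation. Tilting a
    mirror facet by t deviates the reflected beam by 2*t, so per-facet
    tilts yield distinct elevation scan lines from a single laser.
    (The abstract's sampling-phase controller would synchronize light
    sensor sampling with this rotation; it is not modeled here.)
    """
    n = len(facet_tilts_deg)
    span = 360.0 / n
    facet = int((rotation_deg % 360.0) // span)
    return facet, 2.0 * facet_tilts_deg[facet]
```

For example, three facets tilted at -1°, 0°, and +1° produce elevation lines at -2°, 0°, and +2° as the mirror turns through its 120° facet sectors.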
  • Patent number: 10676085
    Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
    Type: Grant
    Filed: October 29, 2018
    Date of Patent: June 9, 2020
    Assignee: Aurora Innovation, Inc.
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20190317219
    Abstract: Determining classification(s) for object(s) in an environment of an autonomous vehicle, and controlling the vehicle based on the determined classification(s). For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be controlled based on determined pose(s) and/or classification(s) for objects in the environment. The control can be based on the pose(s) and/or classification(s) directly, and/or based on movement parameter(s) for the object(s) determined based on the pose(s) and/or classification(s). In many implementations, pose(s) and/or classification(s) of environmental object(s) are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: October 29, 2018
    Publication date: October 17, 2019
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20190318206
    Abstract: Various implementations described herein generate training instances that each include corresponding training instance input that is based on corresponding sensor data of a corresponding autonomous vehicle, and that include corresponding training instance output that is based on corresponding sensor data of a corresponding additional vehicle, where the corresponding additional vehicle is captured at least in part by the corresponding sensor data of the corresponding autonomous vehicle. Various implementations train a machine learning model based on such training instances. Once trained, the machine learning model can enable processing, using the machine learning model, of sensor data from a given autonomous vehicle to predict one or more properties of a given additional vehicle that is captured at least in part by the sensor data.
    Type: Application
    Filed: October 29, 2018
    Publication date: October 17, 2019
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20190315351
    Abstract: Determining yaw parameter(s) (e.g., at least one yaw rate) of an additional vehicle that is in addition to a vehicle being autonomously controlled, and adapting autonomous control of the vehicle based on the determined yaw parameter(s) of the additional vehicle. For example, autonomous steering, acceleration, and/or deceleration of the vehicle can be adapted based on a determined yaw rate of the additional vehicle. In many implementations, the yaw parameter(s) of the additional vehicle are determined based on data from a phase coherent Light Detection and Ranging (LIDAR) component of the vehicle, such as a phase coherent LIDAR monopulse component and/or a frequency-modulated continuous wave (FMCW) LIDAR component.
    Type: Application
    Filed: October 29, 2018
    Publication date: October 17, 2019
    Inventors: Warren Smith, Ethan Eade, Sterling J. Anderson, James Andrew Bagnell, Bartholomeus C. Nabbe, Christopher Paul Urmson
  • Publication number: 20190195992
    Abstract: In some examples, a system comprises a laser light source and a rotatable mirror assembly comprising a plurality of mirror segments, the rotatable mirror assembly aligned to reflect light transmitted by the laser light source, wherein the plurality of mirror segments comprise a first segment that reflects a first light beam from the laser light source in a first direction, and a second mirror segment that reflects the first light beam from the laser light source in a second direction, different from the first direction. In some examples, the system comprises a light sensor positioned to receive light reflected from the rotatable mirror assembly. In some examples, the system comprises a motor for rotating the mirror assembly about a rotation axis. In some examples, the system comprises a controller for controlling a sampling phase of sampling the light sensor.
    Type: Application
    Filed: September 29, 2017
    Publication date: June 27, 2019
    Inventor: Bartholomeus C. Nabbe
  • Patent number: 10328897
    Abstract: Some embodiments provide a vehicle navigation system which can navigate a vehicle through an environment based on driving commands received from a remote control system based on manual operator interaction with an interface of the remote control system. Remote driving control can be engaged based on determination, via processing vehicle sensor data, of a health emergency associated with one or more occupants of the vehicle, and the remote control system can generate remote driving commands which cause the vehicle to be navigated to a particular location without requiring the occupant associated with the health emergency to manually navigate the vehicle. The remote control system can monitor the occupant via communicated vehicle sensor data and can control remote control devices included in the vehicle to provide external indication that the vehicle is being navigated according to remote driving control.
    Type: Grant
    Filed: September 23, 2016
    Date of Patent: June 25, 2019
    Assignee: Apple Inc.
    Inventors: Bartholomeus C. Nabbe, Tie-Qi Chen, Benjamin B. Lyon
  • Patent number: 10053001
    Abstract: Aspects of the present disclosure involve systems, methods, computer program products, and the like, for displaying an operational status of a system. In one particular implementation, the operational status of an autonomous vehicle is displayed. The operational status of the vehicle may indicate an operational state of the vehicle, such as whether the vehicle is in manual control mode or autonomous control mode. In addition to displaying the operational state, the vehicle may also display an intended or future maneuver by the vehicle. For example, the vehicle may determine a route for the autonomous vehicle that includes various operations or steps and may display one or more of the operations or steps of the route. This information may be displayed such that an external observer to the vehicle may determine the near-future operation autonomous vehicle is about to perform. Other information concerning the operation of the vehicle may also be displayed.
    Type: Grant
    Filed: September 15, 2016
    Date of Patent: August 21, 2018
    Assignee: Apple Inc.
    Inventors: Bartholomeus C. Nabbe, Byron B. Han
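The status-display abstract above describes showing both the vehicle's control mode and its intended near-future maneuver to external observers. As a minimal sketch only (the mode strings and display format are hypothetical, not from the patent), the displayed message can be composed from the current operational state plus the next planned route step:

```python
def status_display(mode, next_maneuver=None):
    """Compose the externally visible status string: the current control
    mode, and, when driving autonomously, the vehicle's intended
    near-future maneuver so an outside observer can anticipate it."""
    text = "AUTONOMOUS" if mode == "auto" else "MANUAL"
    if mode == "auto" and next_maneuver:
        text += f" | NEXT: {next_maneuver.upper()}"
    return text
```

In practice the maneuver would come from the planner's route steps, and the string would drive an external display panel rather than be returned to a caller.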