Patents by Inventor Eric C. Danziger

Eric C. Danziger has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11435479
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method includes obtaining, by one or more processors, a point cloud frame based on the sensor data and representative of the environment and identifying, by the one or more processors, a point cloud object within the point cloud frame. The method further includes determining, by the one or more processors, that the point cloud object is skewed relative to an expected configuration of the point cloud object, and determining, by the one or more processors, a relative velocity of the point cloud object by analyzing the skew of the object.
    Type: Grant
    Filed: November 20, 2018
    Date of Patent: September 6, 2022
    Assignee: Luminar, LLC
    Inventors: Eric C. Danziger, Austin K. Russell, Benjamin Englard
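The skew-to-velocity idea above can be illustrated with a minimal sketch (function and variable names are illustrative, not from the patent): because a scanning sensor captures each scan line at a slightly different time, a laterally moving object appears sheared, and the slope of position versus capture time approximates its relative velocity.

```python
import numpy as np

def velocity_from_skew(line_x, line_times):
    """Estimate an object's relative lateral velocity from point-cloud skew.

    Each scan line crossing the object is captured at a different time, so a
    moving object's lines are progressively offset; fitting x = v*t + x0 and
    taking the slope v recovers the skew-implied velocity in m/s."""
    v, _x0 = np.polyfit(line_times, line_x, 1)
    return float(v)
```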
  • Patent number: 11415676
    Abstract: In one embodiment, a lidar system includes a light source configured to emit pulses of light and a scanner configured to scan at least a portion of the emitted pulses of light along an interlaced scan pattern. The scanner includes a first scanning mirror configured to scan the portion of the emitted pulses of light substantially parallel to a first scan axis to produce multiple scan lines of the interlaced scan pattern, where each scan line is oriented substantially parallel to the first scan axis. The scanner also includes a second scanning mirror configured to distribute the scan lines along a second scan axis that is substantially orthogonal to the first scan axis, where the scan lines are distributed in an interlaced manner.
    Type: Grant
    Filed: October 9, 2018
    Date of Patent: August 16, 2022
    Assignee: Luminar, LLC
    Inventor: Eric C. Danziger
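The interlacing can be sketched as an ordering of scan-line indices, assuming a simple two-field pattern (this toy function is not from the patent): the slow-axis mirror traces even-indexed lines on one pass and odd-indexed lines on the next, so the second field fills the gaps left by the first.

```python
def interlaced_scan_order(num_lines):
    """Return the order in which scan lines are traced for a two-field
    interlaced pattern: even-indexed lines first, then odd-indexed lines."""
    return list(range(0, num_lines, 2)) + list(range(1, num_lines, 2))
```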
  • Patent number: 10984257
    Abstract: A method for controlling a vehicle based on sensor data having variable sensor parameter settings includes receiving sensor data generated by a vehicle sensor while the sensor is configured with a first sensor parameter setting. The method also includes receiving an indicator specifying the first sensor parameter setting, and selecting, based on the received indicator, one of a plurality of neural networks of a perception component, each neural network having been trained using training data corresponding to a different sensor parameter setting. The method also includes generating signals descriptive of a current state of the environment using the selected neural network and based on the received sensor data. The method further includes generating driving decisions based on the signals descriptive of the current state of the environment, and causing one or more operational subsystems of the vehicle to maneuver the vehicle in accordance with the generated driving decisions.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: April 20, 2021
    Assignee: Luminar Holdco, LLC
    Inventors: Benjamin Englard, Eric C. Danziger
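The selection step can be sketched as a dispatch on the setting indicator, with the per-setting networks reduced to plain callables (class and method names are assumptions for illustration):

```python
class PerceptionComponent:
    """Hold one model per sensor parameter setting and dispatch incoming
    sensor data to the model matching the reported setting indicator.
    Models are arbitrary callables here; in practice each would be a
    network trained on data for that setting."""

    def __init__(self, models_by_setting):
        self.models_by_setting = models_by_setting

    def perceive(self, sensor_data, setting_indicator):
        # Select the network trained for this setting, then run it.
        model = self.models_by_setting[setting_indicator]
        return model(sensor_data)
```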
  • Patent number: 10768304
    Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud frame generated by a sensor configured to sense an environment through which a vehicle is moving. The point cloud frame includes scan lines arranged according to a particular spatial distribution. The method also includes either generating an enhanced point cloud frame with a larger number of points than the received point cloud frame, or constructing, by one or more processors and based on points of the received point cloud frame, a three-dimensional mesh. The method also includes generating, by performing an interpolation function on the enhanced point cloud frame or a virtual surface provided by the three-dimensional mesh, a normalized point cloud frame, and generating, using the normalized point cloud frame, signals descriptive of a current state of the environment through which the vehicle is moving.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: September 8, 2020
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
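A crude sketch of the normalization step, assuming each scan line is an (N, 3) array with the same N (the names and the simple line-to-line interpolation stand in for sampling the virtual surface of the three-dimensional mesh described above):

```python
import numpy as np

def normalize_scan_lines(lines, target_count):
    """Resample a frame's scan lines to a fixed vertical density by linearly
    interpolating between adjacent input lines, producing a normalized
    frame regardless of the input's spatial distribution."""
    lines = [np.asarray(l, dtype=float) for l in lines]
    out = []
    for t in np.linspace(0.0, len(lines) - 1, target_count):
        i = int(t)
        f = t - i
        if i + 1 < len(lines):
            out.append((1.0 - f) * lines[i] + f * lines[i + 1])
        else:
            out.append(lines[i].copy())  # last line: nothing to blend with
    return out
```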
  • Patent number: 10754037
    Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud portion corresponding to an object in a vehicle environment, the point cloud portion including scan lines arranged according to a particular spatial distribution. The method also includes constructing a voxel grid corresponding to the received point cloud portion. The voxel grid includes a plurality of volumes in a stacked, three-dimensional arrangement, and constructing the voxel grid includes (i) determining an initial classification of the object, (ii) setting one or more parameters of the voxel grid based on the initial classification, and (iii) associating each volume of the plurality of volumes with an attribute specifying how many points, from the point cloud portion, fall within that volume. The method also includes generating, using the constructed voxel grid, signals descriptive of a current state of the environment through which the vehicle is moving.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: August 25, 2020
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger
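The voxel-grid construction can be sketched as follows; the single `voxel_size` parameter stands in for the grid parameters that would be set from the object's initial classification (e.g. finer cells for a pedestrian than for a truck), and all names are illustrative:

```python
import numpy as np

def build_voxel_grid(points, voxel_size, grid_shape):
    """Count how many points fall in each cell of a stacked 3-D voxel grid;
    each cell's count is the attribute described in the abstract."""
    grid = np.zeros(grid_shape, dtype=int)
    indices = np.floor(np.asarray(points, dtype=float) / voxel_size).astype(int)
    for idx in indices:
        if np.all(idx >= 0) and np.all(idx < np.array(grid_shape)):
            grid[tuple(idx)] += 1  # attribute: point count for this volume
    return grid
```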
  • Patent number: 10677900
    Abstract: A computer-implemented method of detecting object distortion. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern. The method also includes obtaining, based on the sensor data, a point cloud frame representative of the environment and identifying a point cloud object within the point cloud frame. Additionally, the method includes analyzing the point cloud object to identify a feature of the point cloud object that has an expected shape and comparing the feature of the point cloud object to the expected shape. The method also includes identifying that the point cloud object is distorted based on the feature of the point cloud object not matching the expected shape.
    Type: Grant
    Filed: November 20, 2018
    Date of Patent: June 9, 2020
    Assignee: Luminar Technologies, Inc.
    Inventors: Austin K. Russell, Eric C. Danziger
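One way to picture the comparison against an expected shape, assuming the feature is an edge expected to be straight (say, the rear edge of a vehicle); the line fit, threshold, and names are illustrative choices, not from the patent:

```python
import numpy as np

def feature_is_distorted(feature_xy, max_residual_m=0.05):
    """Fit a line to a feature that is expected to be straight and flag the
    object as distorted if any point deviates from the fitted line by more
    than `max_residual_m` metres."""
    pts = np.asarray(feature_xy, dtype=float)
    slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], 1)
    residuals = np.abs(pts[:, 1] - (slope * pts[:, 0] + intercept))
    return bool(residuals.max() > max_residual_m)
```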
  • Patent number: 10627521
    Abstract: A method for controlling at least a first vehicle sensor includes receiving sensor data generated by one or more vehicle sensors that are configured to sense an environment through which the vehicle is moving, and identifying, based on the received sensor data, one or more current and/or predicted positions of one or more dynamic objects that are currently moving, or are capable of movement, within the environment. The method also includes causing, based on the current and/or predicted positions of the dynamic objects, an area of focus of the first sensor to be adjusted, at least by causing (i) a field of regard of the first sensor, and/or (ii) a spatial distribution of scan lines produced by the first sensor, to be adjusted.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: April 21, 2020
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
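The field-of-regard adjustment can be sketched for the azimuth case (a simplification of the abstract; positions, margin, and names are assumptions): compute the bearing of each dynamic object and choose a window that covers them all plus a safety margin.

```python
import math

def field_of_regard_for_objects(object_positions, margin_deg=2.0):
    """Choose an azimuth window (degrees) covering the current/predicted
    (x, y) positions of the dynamic objects, padded by a safety margin."""
    azimuths = [math.degrees(math.atan2(y, x)) for x, y in object_positions]
    return (min(azimuths) - margin_deg, max(azimuths) + margin_deg)
```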
  • Publication number: 20200041647
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
    Type: Application
    Filed: November 20, 2018
    Publication date: February 6, 2020
    Inventors: Eric C. Danziger, Austin K. Russell
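The track-then-correct approach differs from the skew-analysis variant above: velocity comes from centroid motion across frames, and the correction de-skews each point by its intra-frame capture delay. A minimal sketch (names and the centroid-based tracker are illustrative assumptions):

```python
import numpy as np

def track_and_deskew(centroid_prev, centroid_curr, dt, points, point_delays):
    """Estimate relative velocity from a tracked object's centroid motion
    between two frames, then correct the current frame by shifting each
    point back along the velocity vector by its capture delay (seconds
    since the frame started)."""
    velocity = (np.asarray(centroid_curr, float) - np.asarray(centroid_prev, float)) / dt
    corrected = np.asarray(points, float) - np.outer(point_delays, velocity)
    return velocity, corrected
```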
  • Publication number: 20200043146
    Abstract: A computer-implemented method of detecting object distortion. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern. The method also includes obtaining, based on the sensor data, a point cloud frame representative of the environment and identifying a point cloud object within the point cloud frame. Additionally, the method includes analyzing the point cloud object to identify a feature of the point cloud object that has an expected shape and comparing the feature of the point cloud object to the expected shape. The method also includes identifying that the point cloud object is distorted based on the feature of the point cloud object not matching the expected shape.
    Type: Application
    Filed: November 20, 2018
    Publication date: February 6, 2020
    Inventors: Austin K. Russell, Eric C. Danziger
  • Publication number: 20200041648
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method includes obtaining, by one or more processors, a point cloud frame based on the sensor data and representative of the environment and identifying, by the one or more processors, a point cloud object within the point cloud frame. The method further includes determining, by the one or more processors, that the point cloud object is skewed relative to an expected configuration of the point cloud object, and determining, by the one or more processors, a relative velocity of the point cloud object by analyzing the skew of the object.
    Type: Application
    Filed: November 20, 2018
    Publication date: February 6, 2020
    Inventors: Eric C. Danziger, Austin K. Russell, Benjamin Englard
  • Patent number: 10539665
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
    Type: Grant
    Filed: November 20, 2018
    Date of Patent: January 21, 2020
    Assignee: Luminar Technologies, Inc.
    Inventors: Eric C. Danziger, Austin K. Russell
  • Patent number: 10514462
    Abstract: A method for configuring a perception component of a vehicle having one or more sensors includes generating a first set of training data that includes first sensor data corresponding to a first setting of one or more sensor parameters, and an indicator of the first setting. The method also includes generating a second set of training data that includes second sensor data corresponding to a second setting of the sensor parameter(s), and an indicator of the second setting. The method further includes training the perception component, at least by training a machine learning based model using the first and second training data sets. The trained perception component is configured to generate signals descriptive of a current state of the vehicle environment by processing sensor data generated by the sensor(s), and one or more indicators indicating which setting of the sensor parameter(s) corresponds to which portions of the generated sensor data.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: December 24, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger
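The training-data side can be sketched by merging samples from different settings and appending the setting indicator as an input feature, so a single model can condition on it (one plausible reading of the abstract; the tuple layout and names are assumptions):

```python
def build_training_set(samples):
    """Merge sensor data captured under different parameter settings into
    one training set. `samples` is an iterable of
    (sensor_features, setting_id, label); the setting indicator is appended
    to each feature vector so the model learns setting-dependent behaviour."""
    inputs, labels = [], []
    for sensor_features, setting_id, label in samples:
        inputs.append(list(sensor_features) + [setting_id])
        labels.append(label)
    return inputs, labels
```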
  • Patent number: 10509127
    Abstract: A method for controlling a first sensor configured to sense an environment through which a vehicle is moving includes receiving sensor data generated by one or more sensors of the vehicle as the vehicle moves through the environment, identifying, by one or more processors and based on at least a portion of the received sensor data, one or more road portions along which the vehicle is expected to travel, and determining, by one or more processors, a configuration of the identified road portions, at least in part by determining a slope of at least one of the identified road portions. The method also includes determining, by one or more processors analyzing at least the determined configuration, an elevation of a field of regard of the first sensor that satisfies one or more visibility criteria, and causing the first sensor to be adjusted in accordance with the determined elevation.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: December 17, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger, Joseph Augenbraun, Austin K. Russell
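The geometry behind the elevation adjustment can be sketched under simple assumptions (a straight slope starting at the vehicle, a single lookahead distance; all names and parameters are illustrative): aim the centre of the field of regard at the road surface a fixed distance ahead, so an uphill portion tilts the window up and a downhill portion tilts it down.

```python
import math

def field_of_regard_elevation(road_slope_deg, lookahead_m, sensor_height_m):
    """Pick the elevation (tilt, degrees) of the sensor's field of regard so
    its centre line meets the road surface `lookahead_m` ahead, given the
    slope of the upcoming road portion."""
    # Height of the road surface at the lookahead point, relative to the sensor.
    rise = lookahead_m * math.tan(math.radians(road_slope_deg)) - sensor_height_m
    return math.degrees(math.atan2(rise, lookahead_m))
```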
  • Patent number: 10495739
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
    Type: Grant
    Filed: November 20, 2018
    Date of Patent: December 3, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Eric C. Danziger, Austin K. Russell
  • Patent number: 10473788
    Abstract: A method for controlling at least a first sensor of a vehicle, which senses an environment through which the vehicle is moving by producing a plurality of scan lines arranged according to a spatial distribution, includes receiving sensor data generated by one or more sensors. The one or more sensors are configured to sense the environment through which the vehicle is moving. The method also includes identifying, by one or more processors and based on the received sensor data, one or more areas of interest in the environment, and causing, by one or more processors and based on the areas of interest, the spatial distribution of the plurality of scan lines produced by the first sensor to be adjusted.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: November 12, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
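The scan-line redistribution can be sketched as follows, assuming elevation angles and a single area of interest (the 50/50 split and all names are illustrative choices): a fraction of the lines is packed into the area of interest, and the rest are spread uniformly over the full field of view.

```python
import numpy as np

def scan_line_elevations(num_lines, fov_deg, interest_deg, dense_fraction=0.5):
    """Distribute scan-line elevation angles over the field of view
    `fov_deg = (lo, hi)`, concentrating `dense_fraction` of the lines inside
    the area of interest `interest_deg = (lo, hi)`."""
    n_dense = int(num_lines * dense_fraction)
    dense = np.linspace(interest_deg[0], interest_deg[1], n_dense)
    sparse = np.linspace(fov_deg[0], fov_deg[1], num_lines - n_dense)
    return np.sort(np.concatenate([dense, sparse]))
```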
  • Patent number: 10451739
    Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
    Type: Grant
    Filed: November 20, 2018
    Date of Patent: October 22, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Eric C. Danziger, Austin K. Russell
  • Patent number: 10338223
    Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud frame generated by a sensor configured to sense a vehicle environment. Each of the points in the frame has associated two-dimensional coordinates and an associated parameter value. The method also includes generating a normalized point cloud frame by adding interpolated points not present in the received frame, at least by, for each interpolated point, identifying one or more neighboring points having associated two-dimensional coordinates that are within a threshold distance of two-dimensional coordinates for the interpolated point, and calculating an estimated parameter value of the interpolated point using, for each of the identified neighboring points, a distance between the two-dimensional coordinates and the parameter value associated with the identified neighboring point.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: July 2, 2019
    Assignee: Luminar Technologies, Inc.
    Inventors: Benjamin Englard, Eric C. Danziger
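The interpolation step described above maps naturally onto inverse-distance weighting, where closer neighbours contribute more to the estimated parameter value (the weighting scheme and names are illustrative; the patent only specifies that distances and neighbour values are used):

```python
import math

def interpolate_value(target_xy, neighbors, threshold):
    """Estimate the parameter value (e.g. depth or intensity) of an
    interpolated point by inverse-distance weighting over neighbouring
    points whose 2-D coordinates lie within `threshold` of the target.
    `neighbors` is an iterable of ((x, y), value) pairs."""
    num = den = 0.0
    for (x, y), value in neighbors:
        d = math.hypot(x - target_xy[0], y - target_xy[1])
        if d == 0.0:
            return value          # coincident point: take its value directly
        if d <= threshold:
            num += value / d      # closer neighbours get larger weights
            den += 1.0 / d
    return num / den if den else None
```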
  • Publication number: 20190176841
    Abstract: A method for controlling a vehicle based on sensor data having variable sensor parameter settings includes receiving sensor data generated by a vehicle sensor while the sensor is configured with a first sensor parameter setting. The method also includes receiving an indicator specifying the first sensor parameter setting, and selecting, based on the received indicator, one of a plurality of neural networks of a perception component, each neural network having been trained using training data corresponding to a different sensor parameter setting. The method also includes generating signals descriptive of a current state of the environment using the selected neural network and based on the received sensor data. The method further includes generating driving decisions based on the signals descriptive of the current state of the environment, and causing one or more operational subsystems of the vehicle to maneuver the vehicle in accordance with the generated driving decisions.
    Type: Application
    Filed: October 31, 2018
    Publication date: June 13, 2019
    Inventors: Benjamin Englard, Eric C. Danziger
  • Publication number: 20190179024
    Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud portion corresponding to an object in a vehicle environment, the point cloud portion including scan lines arranged according to a particular spatial distribution. The method also includes constructing a voxel grid corresponding to the received point cloud portion. The voxel grid includes a plurality of volumes in a stacked, three-dimensional arrangement, and constructing the voxel grid includes (i) determining an initial classification of the object, (ii) setting one or more parameters of the voxel grid based on the initial classification, and (iii) associating each volume of the plurality of volumes with an attribute specifying how many points, from the point cloud portion, fall within that volume. The method also includes generating, using the constructed voxel grid, signals descriptive of a current state of the environment through which the vehicle is moving.
    Type: Application
    Filed: October 31, 2018
    Publication date: June 13, 2019
    Inventors: Benjamin Englard, Eric C. Danziger
  • Publication number: 20190179026
    Abstract: A method for controlling at least a first sensor of a vehicle, which senses an environment through which the vehicle is moving by producing a plurality of scan lines arranged according to a spatial distribution, includes receiving sensor data generated by one or more sensors. The one or more sensors are configured to sense the environment through which the vehicle is moving. The method also includes identifying, by one or more processors and based on the received sensor data, one or more areas of interest in the environment, and causing, by one or more processors and based on the areas of interest, the spatial distribution of the plurality of scan lines produced by the first sensor to be adjusted.
    Type: Application
    Filed: October 31, 2018
    Publication date: June 13, 2019
    Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell