Patents by Inventor Eric C. Danziger
Eric C. Danziger has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220390573
Abstract: In one embodiment, a lidar system includes a light source configured to emit pulses of light. The lidar system also includes a scanner configured to scan at least a portion of the emitted pulses of light along an interlaced scan pattern, including: (i) scanning the portion of the emitted pulses of light substantially parallel to a first scan axis to produce multiple scan lines of the interlaced scan pattern; and (ii) distributing the scan lines along a second scan axis in an interlaced manner, where the interlaced scan pattern is an n-fold interlaced scan pattern that includes n sub-scans, where: n is an integer greater than or equal to 2, each sub-scan includes two or more of the scan lines of the interlaced scan pattern, and the n sub-scans are scanned sequentially where a first sub-scan of the n sub-scans is scanned prior to a second sub-scan.
Type: Application
Filed: August 12, 2022
Publication date: December 8, 2022
Inventor: Eric C. Danziger
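The n-fold interlace described in this abstract can be sketched as a line-ordering function: sub-scan k visits every n-th scan line starting at line k, and the sub-scans run one after another. This is a minimal illustration of that ordering, not code from the patent; the function name and signature are assumptions.

```python
def interlaced_scan_order(num_lines, n=2):
    """Order in which scan lines along the second scan axis are visited
    for an n-fold interlaced pattern: sub-scan k covers lines
    k, k+n, k+2n, ..., and the n sub-scans are scanned sequentially."""
    if n < 2:
        raise ValueError("an interlaced pattern needs n >= 2 sub-scans")
    order = []
    for k in range(n):  # sub-scans run sequentially
        order.extend(range(k, num_lines, n))  # every n-th line, offset k
    return order

# 2-fold interlace over 8 scan lines: even-indexed lines, then odd-indexed
print(interlaced_scan_order(8, n=2))  # [0, 2, 4, 6, 1, 3, 5, 7]
```

Because consecutive sub-scans cover spatially interleaved lines, each sub-scan gives a coarse full-field image, and the union over all n sub-scans recovers the full vertical resolution.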
-
Patent number: 11435479
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method includes obtaining, by one or more processors, a point cloud frame based on the sensor data and representative of the environment and identifying, by the one or more processors, a point cloud object within the point cloud frame. The method further includes determining, by the one or more processors, that the point cloud object is skewed relative to an expected configuration of the point cloud object, and determining, by the one or more processors, a relative velocity of the point cloud object by analyzing the skew of the object.
Type: Grant
Filed: November 20, 2018
Date of Patent: September 6, 2022
Assignee: Luminar, LLC
Inventors: Eric C. Danziger, Austin K. Russell, Benjamin Englard
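The intuition behind skew-based velocity: points in one frame are captured at different times as the scan sweeps, so an object moving relative to the sensor appears sheared. One simple way to turn that shear into a velocity estimate is a least-squares fit of lateral position against capture time along an edge that should be vertical. This is an illustrative sketch of that idea only, not the patented method's actual computation.

```python
def relative_velocity_from_skew(xs, ts):
    """Least-squares slope dx/dt along an object edge expected to be
    vertical. Each point (x) was captured at a different time (t) during
    the scan, so a nonzero slope indicates relative lateral motion."""
    n = len(xs)
    mx, mt = sum(xs) / n, sum(ts) / n
    num = sum((t - mt) * (x - mx) for x, t in zip(xs, ts))
    den = sum((t - mt) ** 2 for t in ts)
    return num / den  # meters per second if x is meters and t seconds

# An edge sheared by 2 m/s of relative motion over a 0.1 s sweep:
ts = [0.0, 0.025, 0.05, 0.075, 0.1]
xs = [5.0 + 2.0 * t for t in ts]
print(relative_velocity_from_skew(xs, ts))
```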
-
Patent number: 11415676
Abstract: In one embodiment, a lidar system includes a light source configured to emit pulses of light and a scanner configured to scan at least a portion of the emitted pulses of light along an interlaced scan pattern. The scanner includes a first scanning mirror configured to scan the portion of the emitted pulses of light substantially parallel to a first scan axis to produce multiple scan lines of the interlaced scan pattern, where each scan line is oriented substantially parallel to the first scan axis. The scanner also includes a second scanning mirror configured to distribute the scan lines along a second scan axis that is substantially orthogonal to the first scan axis, where the scan lines are distributed in an interlaced manner.
Type: Grant
Filed: October 9, 2018
Date of Patent: August 16, 2022
Assignee: Luminar, LLC
Inventor: Eric C. Danziger
-
Patent number: 10984257
Abstract: A method for controlling a vehicle based on sensor data having variable sensor parameter settings includes receiving sensor data generated by a vehicle sensor while the sensor is configured with a first sensor parameter setting. The method also includes receiving an indicator specifying the first sensor parameter setting, and selecting, based on the received indicator, one of a plurality of neural networks of a perception component, each neural network having been trained using training data corresponding to a different sensor parameter setting. The method also includes generating signals descriptive of a current state of the environment using the selected neural network and based on the received sensor data. The method further includes generating driving decisions based on the signals descriptive of the current state of the environment, and causing one or more operational subsystems of the vehicle to maneuver the vehicle in accordance with the generated driving decisions.
Type: Grant
Filed: October 31, 2018
Date of Patent: April 20, 2021
Assignee: Luminar Holdco, LLC
Inventors: Benjamin Englard, Eric C. Danziger
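The selection step described here amounts to dispatching sensor data to whichever network was trained for the active sensor parameter setting. A minimal sketch of that dispatch, with hypothetical stand-in networks and setting names (none of these identifiers come from the patent):

```python
# Hypothetical stand-ins for perception networks, each trained on data
# captured under a different sensor parameter setting.
def _net_uniform(points):
    return {"mode": "uniform", "n": len(points)}

def _net_focused(points):
    return {"mode": "focused", "n": len(points)}

NETS_BY_SETTING = {"uniform": _net_uniform, "focused": _net_focused}

def perceive(sensor_data, setting_indicator):
    """Select the network trained for the indicated sensor parameter
    setting, then run it on the received sensor data."""
    try:
        net = NETS_BY_SETTING[setting_indicator]
    except KeyError:
        raise ValueError(f"no network trained for setting {setting_indicator!r}")
    return net(sensor_data)
```

The design point is that a single network trained across all settings would have to learn setting-invariance implicitly; routing on the indicator lets each network specialize.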
-
Patent number: 10768304
Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud frame generated by a sensor configured to sense an environment through which a vehicle is moving. The point cloud frame includes scan lines arranged according to a particular spatial distribution. The method also includes either generating an enhanced point cloud frame with a larger number of points than the received point cloud frame, or constructing, by one or more processors and based on points of the received point cloud frame, a three-dimensional mesh. The method also includes generating, by performing an interpolation function on the enhanced point cloud frame or a virtual surface provided by the three-dimensional mesh, a normalized point cloud frame, and generating, using the normalized point cloud frame, signals descriptive of a current state of the environment through which the vehicle is moving.
Type: Grant
Filed: October 31, 2018
Date of Patent: September 8, 2020
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
-
Processing point clouds of vehicle sensors having variable scan line distributions using voxel grids
Patent number: 10754037
Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud portion corresponding to an object in a vehicle environment, the point cloud portion including scan lines arranged according to a particular spatial distribution. The method also includes constructing a voxel grid corresponding to the received point cloud portion. The voxel grid includes a plurality of volumes in a stacked, three-dimensional arrangement, and constructing the voxel grid includes (i) determining an initial classification of the object, (ii) setting one or more parameters of the voxel grid based on the initial classification, and (iii) associating each volume of the plurality of volumes with an attribute specifying how many points, from the point cloud portion, fall within that volume. The method also includes generating, using the constructed voxel grid, signals descriptive of a current state of the environment through which the vehicle is moving.
Type: Grant
Filed: October 31, 2018
Date of Patent: August 25, 2020
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger
-
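The voxel-grid construction in this abstract can be illustrated in a few lines: bin each point into a cubic volume and count points per volume, with the voxel size chosen from an initial object classification. The class names and voxel sizes below are made-up placeholders, not values from the patent.

```python
from collections import Counter

# Illustrative voxel sizes keyed by an initial object classification;
# these class names and dimensions are assumptions for the sketch.
VOXEL_SIZE_BY_CLASS = {"car": 0.25, "pedestrian": 0.10, "unknown": 0.50}

def build_voxel_grid(points, initial_class):
    """Map each (x, y, z) point to a voxel index and count how many
    points fall within each voxel. The voxel size (a grid parameter)
    is set from the object's initial classification."""
    size = VOXEL_SIZE_BY_CLASS.get(initial_class, VOXEL_SIZE_BY_CLASS["unknown"])
    counts = Counter(
        (int(x // size), int(y // size), int(z // size)) for x, y, z in points
    )
    return size, counts
```

Classification-dependent sizing matters because a fixed grid resolution that suits a car would smear away the structure of a pedestrian, and one that suits a pedestrian would waste memory on a truck.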
Patent number: 10677900
Abstract: A computer-implemented method of detecting object distortion. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern. The method also includes obtaining, based on the sensor data, a point cloud frame representative of the environment and identifying a point cloud object within the point cloud frame. Additionally, the method includes analyzing the point cloud object to identify a feature of the point cloud object that has an expected shape and comparing the feature of the point cloud object to the expected shape. The method also includes identifying that the point cloud object is distorted based on the feature of the point cloud object not matching the expected shape.
Type: Grant
Filed: November 20, 2018
Date of Patent: June 9, 2020
Assignee: Luminar Technologies, Inc.
Inventors: Austin K. Russell, Eric C. Danziger
-
Patent number: 10627521
Abstract: A method for controlling at least a first vehicle sensor includes receiving sensor data generated by one or more vehicle sensors that are configured to sense an environment through which the vehicle is moving, and identifying, based on the received sensor data, one or more current and/or predicted positions of one or more dynamic objects that are currently moving, or are capable of movement, within the environment. The method also includes causing, based on the current and/or predicted positions of the dynamic objects, an area of focus of the first sensor to be adjusted, at least by causing (i) a field of regard of the first sensor, and/or (ii) a spatial distribution of scan lines produced by the first sensor, to be adjusted.
Type: Grant
Filed: October 31, 2018
Date of Patent: April 21, 2020
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
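One simple way to realize "adjust the field of regard toward dynamic objects" is to recenter an azimuthal window on the mean bearing of the objects' predicted positions. This sketch assumes a flat-ground 2-D geometry and a fixed angular width; the function and parameter names are hypothetical, not from the patent.

```python
import math

def focus_field_of_regard(predicted_positions, width_deg=30.0):
    """Center a hypothetical azimuthal field of regard on the mean
    bearing (degrees) of the dynamic objects' predicted (x, y)
    positions, x forward and y left of the sensor."""
    bearings = [math.degrees(math.atan2(y, x)) for x, y in predicted_positions]
    center = sum(bearings) / len(bearings)
    return (center - width_deg / 2.0, center + width_deg / 2.0)

# A single pedestrian predicted dead ahead at 10 m:
print(focus_field_of_regard([(10.0, 0.0)]))  # (-15.0, 15.0)
```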
-
Publication number: 20200043146
Abstract: A computer-implemented method of detecting object distortion. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern. The method also includes obtaining, based on the sensor data, a point cloud frame representative of the environment and identifying a point cloud object within the point cloud frame. Additionally, the method includes analyzing the point cloud object to identify a feature of the point cloud object that has an expected shape and comparing the feature of the point cloud object to the expected shape. The method also includes identifying that the point cloud object is distorted based on the feature of the point cloud object not matching the expected shape.
Type: Application
Filed: November 20, 2018
Publication date: February 6, 2020
Inventors: Austin K. Russell, Eric C. Danziger
-
Publication number: 20200041648
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method includes obtaining, by one or more processors, a point cloud frame based on the sensor data and representative of the environment and identifying, by the one or more processors, a point cloud object within the point cloud frame. The method further includes determining, by the one or more processors, that the point cloud object is skewed relative to an expected configuration of the point cloud object, and determining, by the one or more processors, a relative velocity of the point cloud object by analyzing the skew of the object.
Type: Application
Filed: November 20, 2018
Publication date: February 6, 2020
Inventors: Eric C. Danziger, Austin K. Russell, Benjamin Englard
-
Publication number: 20200041647
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
Type: Application
Filed: November 20, 2018
Publication date: February 6, 2020
Inventors: Eric C. Danziger, Austin K. Russell
-
Patent number: 10539665
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
Type: Grant
Filed: November 20, 2018
Date of Patent: January 21, 2020
Assignee: Luminar Technologies, Inc.
Inventors: Eric C. Danziger, Austin K. Russell
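The track-then-correct flow in this family of filings can be sketched as: estimate velocity from the centroid displacement of a tracked object between two frames, then shift each point back by the motion accumulated since a reference time, undoing the intra-frame skew. A minimal 2-D illustration under those assumptions (not the patented implementation):

```python
def centroid(points):
    """Mean position of a list of equal-dimension point tuples."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def velocity_and_correction(frame_a, frame_b, dt, point_times, t_ref=0.0):
    """Relative velocity from centroid displacement between two frames
    (dt seconds apart), then a per-point de-skew: each point in frame_b
    is shifted back by v * (capture_time - t_ref)."""
    ca, cb = centroid(frame_a), centroid(frame_b)
    v = tuple((b - a) / dt for a, b in zip(ca, cb))
    corrected = [
        tuple(p_i - v_i * (t - t_ref) for p_i, v_i in zip(p, v))
        for p, t in zip(frame_b, point_times)
    ]
    return v, corrected
```

Correcting the object geometry before downstream classification matters because a skewed point cloud can otherwise be mistaken for a differently shaped object.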
-
Patent number: 10514462
Abstract: A method for configuring a perception component of a vehicle having one or more sensors includes generating a first set of training data that includes first sensor data corresponding to a first setting of one or more sensor parameters, and an indicator of the first setting. The method also includes generating a second set of training data that includes second sensor data corresponding to a second setting of the sensor parameter(s), and an indicator of the second setting. The method further includes training the perception component, at least by training a machine learning based model using the first and second training data sets. The trained perception component is configured to generate signals descriptive of a current state of the vehicle environment by processing sensor data generated by the sensor(s), and one or more indicators indicating which setting of the sensor parameter(s) corresponds to which portions of the generated sensor data.
Type: Grant
Filed: October 31, 2018
Date of Patent: December 24, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger
-
Patent number: 10509127
Abstract: A method for controlling a first sensor configured to sense an environment through which a vehicle is moving includes receiving sensor data generated by one or more sensors of the vehicle as the vehicle moves through the environment, identifying, by one or more processors and based on at least a portion of the received sensor data, one or more road portions along which the vehicle is expected to travel, and determining, by one or more processors, a configuration of the identified road portions, at least in part by determining a slope of at least one of the identified road portions. The method also includes determining, by one or more processors analyzing at least the determined configuration, an elevation of a field of regard of the first sensor that satisfies one or more visibility criteria, and causing the first sensor to be adjusted in accordance with the determined elevation.
Type: Grant
Filed: October 31, 2018
Date of Patent: December 17, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger, Joseph Augenbraun, Austin K. Russell
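A concrete way to picture "elevation from road slope": pitch the sensor so the center of its field of regard intersects the road surface at a chosen look-ahead distance. The small-scene geometry below (flat sensor mount, constant upcoming slope) and every parameter name and default are illustrative assumptions, not the patent's visibility criteria.

```python
import math

def sensor_elevation_deg(road_slope_percent, lookahead_m=60.0,
                         sensor_height_m=1.8):
    """Pitch angle (degrees, positive = up) that aims the center of the
    field of regard at the road surface `lookahead_m` ahead, given the
    upcoming road portion's slope in percent grade."""
    rise_m = road_slope_percent / 100.0 * lookahead_m  # road height at look-ahead
    return math.degrees(math.atan2(rise_m - sensor_height_m, lookahead_m))

# On flat road the sensor pitches slightly down to hit the surface;
# on a 10% upgrade it must pitch up instead.
print(sensor_elevation_deg(0.0), sensor_elevation_deg(10.0))
```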
-
Patent number: 10495739
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
Type: Grant
Filed: November 20, 2018
Date of Patent: December 3, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Eric C. Danziger, Austin K. Russell
-
Patent number: 10473788
Abstract: A method for controlling at least a first sensor of a vehicle, which senses an environment through which the vehicle is moving by producing a plurality of scan lines arranged according to a spatial distribution, includes receiving sensor data generated by one or more sensors. The one or more sensors are configured to sense the environment through which the vehicle is moving. The method also includes identifying, by one or more processors and based on the received sensor data, one or more areas of interest in the environment, and causing, by one or more processors and based on the areas of interest, the spatial distribution of the plurality of scan lines produced by the first sensor to be adjusted.
Type: Grant
Filed: October 31, 2018
Date of Patent: November 12, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
-
Patent number: 10451739
Abstract: A computer-implemented method of determining relative velocity between a vehicle and an object. The method includes receiving sensor data generated by one or more sensors of the vehicle. The one or more sensors are configured to sense an environment through which the vehicle is moving by following a scan pattern comprising component scan lines. The method also includes obtaining, based on the sensor data and by one or more processors, two or more point cloud frames representative of the environment and tracking, by the one or more processors, a point cloud object across the two or more point cloud frames. Additionally, the method includes determining, based on the tracking and by the one or more processors, a relative velocity of the point cloud object and correcting, by the one or more processors, the point cloud object based on the relative velocity of the point cloud object.
Type: Grant
Filed: November 20, 2018
Date of Patent: October 22, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Eric C. Danziger, Austin K. Russell
-
Patent number: 10338223
Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud frame generated by a sensor configured to sense a vehicle environment. Each of the points in the frame has associated two-dimensional coordinates and an associated parameter value. The method also includes generating a normalized point cloud frame by adding interpolated points not present in the received frame, at least by, for each interpolated point, identifying one or more neighboring points having associated two-dimensional coordinates that are within a threshold distance of two-dimensional coordinates for the interpolated point, and calculating an estimated parameter value of the interpolated point using, for each of the identified neighboring points, a distance between the two-dimensional coordinates and the parameter value associated with the identified neighboring point.
Type: Grant
Filed: October 31, 2018
Date of Patent: July 2, 2019
Assignee: Luminar Technologies, Inc.
Inventors: Benjamin Englard, Eric C. Danziger
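The interpolation step this abstract describes (estimate a parameter value from neighbors within a threshold distance, weighted by distance) is essentially inverse-distance weighting. A minimal sketch under that reading; the abstract does not specify the exact weighting, so the 1/d scheme here is an assumption.

```python
def interpolate_point(target_xy, points, threshold):
    """Estimate a parameter value (e.g. depth or intensity) at target_xy
    from neighbors within `threshold`, weighting each neighbor's value
    by inverse 2-D distance. `points` is a list of ((x, y), value)."""
    tx, ty = target_xy
    weight_sum, weighted_total = 0.0, 0.0
    for (x, y), value in points:
        d = ((x - tx) ** 2 + (y - ty) ** 2) ** 0.5
        if d < threshold:
            w = 1.0 / max(d, 1e-9)  # inverse-distance weight; guard d == 0
            weight_sum += w
            weighted_total += w * value
    if weight_sum == 0.0:
        raise ValueError("no neighbors within the threshold distance")
    return weighted_total / weight_sum
```

Normalizing the frame this way lets downstream models trained on one scan-line distribution consume frames captured with another, since the interpolated grid is distribution-independent.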
-
Publication number: 20190180502
Abstract: A method for processing point clouds having variable spatial distributions of scan lines includes receiving a point cloud frame generated by a sensor configured to sense an environment through which a vehicle is moving. The point cloud frame includes scan lines arranged according to a particular spatial distribution. The method also includes either generating an enhanced point cloud frame with a larger number of points than the received point cloud frame, or constructing, by one or more processors and based on points of the received point cloud frame, a three-dimensional mesh. The method also includes generating, by performing an interpolation function on the enhanced point cloud frame or a virtual surface provided by the three-dimensional mesh, a normalized point cloud frame, and generating, using the normalized point cloud frame, signals descriptive of a current state of the environment through which the vehicle is moving.
Type: Application
Filed: October 31, 2018
Publication date: June 13, 2019
Inventors: Benjamin Englard, Eric C. Danziger, Austin K. Russell
-
Publication number: 20190176841
Abstract: A method for controlling a vehicle based on sensor data having variable sensor parameter settings includes receiving sensor data generated by a vehicle sensor while the sensor is configured with a first sensor parameter setting. The method also includes receiving an indicator specifying the first sensor parameter setting, and selecting, based on the received indicator, one of a plurality of neural networks of a perception component, each neural network having been trained using training data corresponding to a different sensor parameter setting. The method also includes generating signals descriptive of a current state of the environment using the selected neural network and based on the received sensor data. The method further includes generating driving decisions based on the signals descriptive of the current state of the environment, and causing one or more operational subsystems of the vehicle to maneuver the vehicle in accordance with the generated driving decisions.
Type: Application
Filed: October 31, 2018
Publication date: June 13, 2019
Inventors: Benjamin Englard, Eric C. Danziger