Patents by Inventor Amir Akbarzadeh

Amir Akbarzadeh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230357076
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: May 2, 2023
    Publication date: November 9, 2023
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
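    A minimal sketch of the per-modality fusion idea in the abstract above: each sensor modality yields its own localization result against map data of the same modality, and the individual results are combined into one final estimate. The inverse-variance weighting, data layout, and names here are illustrative assumptions, not the method claimed in the publication.

      # Hypothetical example: fuse independent per-modality pose estimates.
      from dataclasses import dataclass

      @dataclass
      class LocalizationResult:
          modality: str    # e.g. "camera", "lidar", "radar"
          x: float         # estimated position in the map frame (metres)
          y: float
          variance: float  # uncertainty of this modality's estimate

      def fuse_localizations(results):
          """Inverse-variance weighted fusion of per-modality pose estimates."""
          weights = [1.0 / r.variance for r in results]
          total = sum(weights)
          x = sum(w * r.x for w, r in zip(weights, results)) / total
          y = sum(w * r.y for w, r in zip(weights, results)) / total
          return x, y

      per_modality = [
          LocalizationResult("camera", 10.2, 5.1, 0.5),
          LocalizationResult("lidar", 10.0, 5.0, 0.1),
          LocalizationResult("radar", 10.4, 4.8, 0.8),
      ]
      print(fuse_localizations(per_modality))  # dominated by the low-variance lidar result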
  • Publication number: 20230341234
    Abstract: In various examples, a lane planner for generating lane planner output data based on a state and probabilistic action space is provided. A driving system—that operates based on a hierarchical drive planning framework—includes the lane planner and other planning and control components. The lane planner processes lane planner input data (e.g., large lane graph, source node, target node) to generate lane planner output data (e.g., expected time rewards). The driving system can also include a route planner (e.g., a first planning layer) that operates to provide the lane planner input data to the lane planner. The lane planner operates as a second planning layer that processes the lane planner input data based at least in part on a state and probabilistic action space of the large lane graph and calculates a time cost associated with navigating from a source node to a target node in the large lane graph.
    Type: Application
    Filed: April 20, 2022
    Publication date: October 26, 2023
    Inventors: David Nister, Hon Leung Lee, Yizhou Wang, Rotem Aviv, Birgit Henke, Julia Ng, Amir Akbarzadeh
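    A hedged sketch of computing expected time costs on a small lane graph by value iteration, in the spirit of the state and probabilistic action space mentioned in the abstract above; the graph, transition probabilities, retry model, and function names are invented for illustration only.

      def expected_time_costs(edges, target, iterations=100):
          """edges: {node: [(successor, success_probability, travel_time), ...]}.
          Returns the expected time to reach `target` from each node, assuming a
          failed transition keeps the vehicle in the same node and it retries."""
          nodes = set(edges) | {target}
          cost = {n: 0.0 if n == target else float("inf") for n in nodes}
          for _ in range(iterations):
              for node, actions in edges.items():
                  if node == target:
                      continue
                  for succ, p, t in actions:
                      if p > 0 and cost[succ] != float("inf"):
                          # Expected attempts are 1/p, so the expected edge cost is t/p.
                          cost[node] = min(cost[node], t / p + cost[succ])
          return cost

      lane_graph = {
          "A": [("B", 0.9, 5.0), ("C", 1.0, 8.0)],
          "B": [("D", 1.0, 4.0)],
          "C": [("D", 1.0, 3.0)],
      }
      print(expected_time_costs(lane_graph, target="D"))  # A is roughly 9.56 via B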
  • Publication number: 20230324194
    Abstract: Embodiments of the present disclosure relate to a method of translating routes between maps. The method may include obtaining a graph based on data of an area. The graph may include one or more nodes representing different locations along one or more navigable paths as defined by the map. The method may also include obtaining one or more waypoints that define a route to traverse in the area and selecting, from the nodes, one or more path nodes based on locations of the path nodes corresponding to locations of the waypoints. The selected path nodes may define a path in the data that corresponds to the route.
    Type: Application
    Filed: April 12, 2022
    Publication date: October 12, 2023
    Inventors: Amir AKBARZADEH, Raul Correal TEZANOS, Hon Leung LEE
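    An illustrative sketch of the waypoint-to-node matching step described in the abstract above: each route waypoint is snapped to the nearest graph node, and the matched nodes define the translated route. The data structures and nearest-node criterion are assumptions made for the example.

      import math

      def nearest_node(nodes, point):
          """nodes: {node_id: (x, y)}; returns the id of the node closest to point."""
          return min(nodes, key=lambda n: math.dist(nodes[n], point))

      def translate_route(nodes, waypoints):
          """Map each (x, y) waypoint onto a path node, dropping consecutive duplicates."""
          path = []
          for wp in waypoints:
              node = nearest_node(nodes, wp)
              if not path or path[-1] != node:
                  path.append(node)
          return path

      graph_nodes = {"n1": (0.0, 0.0), "n2": (10.0, 0.0), "n3": (20.0, 5.0)}
      route_waypoints = [(0.5, 0.2), (9.0, 1.0), (19.5, 4.0)]
      print(translate_route(graph_nodes, route_waypoints))  # ['n1', 'n2', 'n3']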
  • Publication number: 20230294726
    Abstract: One or more embodiments of the present disclosure relate to aligning sensor data. In some embodiments, the aligning may be used for performing localization. In these or other embodiments, the aligning may be used for map creation.
    Type: Application
    Filed: March 21, 2022
    Publication date: September 21, 2023
    Inventors: Amir AKBARZADEH, Andrew CARLEY, Birgit HENKE, Si LU, Ivana STOJANOVIC, Jugnu AGRAWAL, Michael KROEPFL, Yu SHENG
  • Publication number: 20230296758
    Abstract: Embodiments of the present disclosure relate to generating RADAR (RAdio Detection And Ranging) point clouds based on RADAR data obtained from one or more RADAR sensors disposed on one or more ego-machines. In these or other embodiments, the RADAR point clouds may be used to generate map data. Additionally or alternatively, the RADAR point clouds may be used for performing localization.
    Type: Application
    Filed: March 21, 2022
    Publication date: September 21, 2023
    Inventors: Amir AKBARZADEH, Andrew CARLEY, Birgit HENKE, Si LU, Ivana STOJANOVIC, Jugnu AGRAWAL, Michael KROEPFL, Yu SHENG, David NISTER, Enliang ZHENG
  • Publication number: 20230296756
    Abstract: One or more embodiments of the present disclosure relate to generating RADAR (RAdio Detection And Ranging) point clouds based on RADAR data obtained from one or more RADAR sensors disposed on one or more ego-machines. In these or other embodiments, the RADAR point clouds may be communicated to a distributed map system that is configured to generate map data based on the RADAR point clouds. In some embodiments of the present disclosure, certain compression operations may be performed on the RADAR point clouds to reduce the amount of data that is communicated from the ego-machines to the map system.
    Type: Application
    Filed: March 21, 2022
    Publication date: September 21, 2023
    Inventors: Amir AKBARZADEH, Andrew CARLEY, Birgit HENKE, Si LU, Ivana STOJANOVIC, Jugnu AGRAWAL, Michael KROEPFL, Yu SHENG, David NISTER, Enliang ZHENG, Niharika ARORA
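    The abstract above mentions compressing RADAR point clouds before they are sent to the map system, without specifying the operations used; the sketch below shows one generic possibility, quantizing coordinates to a fixed grid, delta-encoding them, and deflating the byte stream. Everything here is an assumption for illustration.

      import struct, zlib

      def compress_points(points, resolution=0.05):
          """points: iterable of (x, y, z) in metres; lossy to `resolution`."""
          quantized = [tuple(round(c / resolution) for c in p) for p in points]
          deltas, prev = [], (0, 0, 0)
          for q in quantized:
              deltas.append(tuple(a - b for a, b in zip(q, prev)))
              prev = q
          raw = b"".join(struct.pack("<3i", *d) for d in deltas)
          return zlib.compress(raw)

      def decompress_points(blob, resolution=0.05):
          raw = zlib.decompress(blob)
          points, acc = [], (0, 0, 0)
          for offset in range(0, len(raw), 12):
              acc = tuple(a + b for a, b in zip(acc, struct.unpack_from("<3i", raw, offset)))
              points.append(tuple(c * resolution for c in acc))
          return points

      cloud = [(12.34, -5.67, 0.12), (12.39, -5.61, 0.15), (12.45, -5.55, 0.18)]
      blob = compress_points(cloud)
      print(len(blob), decompress_points(blob))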
  • Publication number: 20230296748
    Abstract: One or more embodiments of the present disclosure relate to generation of map data. In these or other embodiments, the generation of the map data may include determining whether objects indicated by the sensor data are static objects or dynamic objects. Additionally or alternatively, sensor data may be removed or included in the map data based on determinations as to whether it corresponds to static objects or dynamic objects.
    Type: Application
    Filed: March 21, 2022
    Publication date: September 21, 2023
    Inventors: Amir AKBARZADEH, Andrew CARLEY, Birgit HENKE, Si LU, Ivana STOJANOVIC, Jugnu AGRAWAL, Michael KROEPFL, Yu SHENG, David NISTER, Enliang ZHENG
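    A hypothetical sketch of the static/dynamic split described in the abstract above: observations whose estimated speed exceeds a threshold are treated as dynamic objects and excluded from the map data. The per-point speed attribute and the threshold are assumptions for this example.

      def filter_static_points(points, speed_threshold=0.5):
          """points: list of dicts like {"xyz": (x, y, z), "speed": metres_per_second}.
          Returns only the points deemed static for inclusion in the map."""
          return [p for p in points if abs(p["speed"]) < speed_threshold]

      observations = [
          {"xyz": (3.0, 1.0, 0.0), "speed": 0.02},  # likely a pole or wall
          {"xyz": (8.0, -2.0, 0.0), "speed": 7.8},  # likely a moving vehicle
      ]
      print(filter_static_points(observations))  # keeps only the static observation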
  • Patent number: 11713978
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: August 1, 2023
    Assignee: NVIDIA Corporation
    Inventors: Amir Akbarzadeh, David Nister, Ruchi Bhargava, Birgit Henke, Ivana Stojanovic, Yu Sheng
  • Patent number: 11698272
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: July 11, 2023
    Assignee: NVIDIA Corporation
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
  • Publication number: 20230204383
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams – or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data – corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data – and ultimately a fused high definition (HD) map – that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: February 28, 2023
    Publication date: June 29, 2023
    Inventors: Amir Akbarzadeh, David Nister, Ruchi Bhargava, Birgit Henke, Ivana Stojanovic, Yu Sheng
  • Publication number: 20220341750
    Abstract: In various examples, health of a high definition (HD) map may be monitored to determine whether inaccuracies exist in one or more layers of the HD map. For example, as one or more vehicles rely on the HD map to traverse portions of an environment, disagreements between perception of the one or more vehicles, map layers of the HD map, and/or other disagreement types may be identified and aggregated. Where errors are identified that indicate a drop in health of the HD map, updated data may be crowdsourced from one or more vehicles corresponding to a location of disagreement within the HD map, and the updated data may be used to update, verify, and validate the HD map.
    Type: Application
    Filed: April 21, 2022
    Publication date: October 27, 2022
    Inventors: Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral
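    An illustrative sketch of aggregating perception-versus-map disagreements and flagging map regions for a crowdsourced update, along the lines of the abstract above; the tile keys, report format, and threshold are invented for the example.

      from collections import Counter

      def flag_unhealthy_tiles(disagreement_reports, min_reports=5):
          """disagreement_reports: iterable of (tile_id, map_layer) pairs from vehicles.
          Returns the (tile_id, map_layer) pairs that need updated data."""
          counts = Counter(disagreement_reports)
          return {key for key, n in counts.items() if n >= min_reports}

      reports = [("tile_42", "lane_lines")] * 6 + [("tile_17", "signs")] * 2
      print(flag_unhealthy_tiles(reports))  # {('tile_42', 'lane_lines')}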
  • Publication number: 20220333950
    Abstract: Systems and methods for vehicle-based determination of HD map update information. Sensor-equipped vehicles may determine locations of various detected objects relative to the vehicles. Vehicles may also determine the location of reference objects relative to the vehicles, where the location of the reference objects in an absolute coordinate system is also known. The absolute coordinates of various detected objects may then be determined from the absolute position of the reference objects and the locations of other objects relative to the reference objects. Newly-determined absolute locations of detected objects may then be transmitted to HD map services for updating.
    Type: Application
    Filed: April 19, 2021
    Publication date: October 20, 2022
    Inventors: Amir Akbarzadeh, Ruchi Bhargava, Bhaven Dedhia, Rambod Jacoby, Jeffrey Liu, Vaibhav Thukral
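    A small worked sketch of the coordinate transfer described in the abstract above: the vehicle knows a reference object's absolute position and measures both the reference object and a newly detected object relative to itself, so the detected object's absolute position follows by vector arithmetic. A 2D case with axes aligned to the absolute frame is assumed purely for clarity.

      def absolute_position(ref_abs, ref_rel, detected_rel):
          """All arguments are (x, y) tuples; relative positions are in the vehicle frame."""
          vehicle_abs = (ref_abs[0] - ref_rel[0], ref_abs[1] - ref_rel[1])
          return (vehicle_abs[0] + detected_rel[0], vehicle_abs[1] + detected_rel[1])

      sign_abs = (1000.0, 250.0)     # known absolute coordinates of a reference sign
      sign_rel = (12.0, 3.0)         # sign as measured from the vehicle
      new_object_rel = (20.0, -4.0)  # newly detected object, vehicle frame
      print(absolute_position(sign_abs, sign_rel, new_object_rel))  # (1008.0, 243.0)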
  • Patent number: 11381758
    Abstract: A preferred method of acquiring virtual or augmented reality (VAR) scenes can include at a plurality of locations of interest, providing one or more users with a predetermined pattern for image acquisition with an image capture device and for each of the one or more users, in response to a user input, acquiring at least one image at the location of interest. The method of the preferred embodiment can also include for each of the one or more users, in response to the acquisition of at least one image, providing the user with feedback to ensure a complete acquisition of the virtual or augmented reality scene; and receiving at a remote database, from each of the one or more users, one or more VAR scenes. One variation of the method of the preferred embodiment can include providing game mechanics to promote proper image acquisition and promote competition between users.
    Type: Grant
    Filed: December 30, 2020
    Date of Patent: July 5, 2022
    Assignee: Dropbox, Inc.
    Inventors: Benjamin Zeis Newhouse, Terrence Edward McArdle, Amir Akbarzadeh
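    A loose sketch of the coverage-feedback idea in the abstract above: captured view directions are compared against a predetermined acquisition pattern and the user is told which directions are still missing. The pattern, angular tolerance, and return format are assumptions for illustration.

      def missing_directions(pattern_deg, captured_deg, tolerance=15.0):
          """pattern_deg: target yaw angles (degrees) the pattern asks the user to capture.
          captured_deg: yaw angles of images taken so far. Returns angles still needed."""
          def covered(target):
              return any(abs((c - target + 180) % 360 - 180) <= tolerance for c in captured_deg)
          return [t for t in pattern_deg if not covered(t)]

      pattern = [0, 60, 120, 180, 240, 300]         # e.g. a six-shot panorama pattern
      captured = [2, 58, 118, 245]
      print(missing_directions(pattern, captured))  # [180, 300] remain to be captured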
  • Publication number: 20220144304
    Abstract: An architecture can generate lane graphs or path determinations, for devices such as robots or autonomous vehicles, using multiple sources of data while satisfying applicable requirements and regulations for operation. A system can fuse together data from multiple sources useful to determine localization. To ensure safety compliance, this fused data is compared against data from systems where safety is trusted and, as long as at least two comparators agree with the fused localization data, the fused localization data can be used and verified to be safety regulation compliant. This system can also fuse together available information useful for lane perception. This fused data is compared against data from systems where the safety is trusted, and as long as at least two comparators for these safety-compliant systems agree with the fused lane graph data, then the fused lane graph data can be provided for navigation and verified to be regulation compliant.
    Type: Application
    Filed: September 23, 2021
    Publication date: May 12, 2022
    Inventors: Aidin Ehsanibenafati, Jonas Nilsson, Amir Akbarzadeh, Hae Jong Seo
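    A hedged sketch of the agreement check described in the abstract above: the fused localization result is only used when at least two independent, safety-trusted comparators agree with it within some tolerance. The tolerance, required count, and data layout are assumptions; the publication does not specify them here.

      import math

      def fused_result_is_usable(fused_xy, trusted_estimates, tolerance_m=0.5, required=2):
          """trusted_estimates: list of (x, y) positions from safety-trusted systems."""
          agreeing = sum(1 for est in trusted_estimates if math.dist(fused_xy, est) <= tolerance_m)
          return agreeing >= required

      fused = (100.2, 50.1)
      trusted = [(100.0, 50.0), (100.3, 50.2), (103.0, 51.0)]
      print(fused_result_is_usable(fused, trusted))  # True: two comparators agree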
  • Publication number: 20210120191
    Abstract: A preferred method of acquiring virtual or augmented reality (VAR) scenes can include at a plurality of locations of interest, providing one or more users with a predetermined pattern for image acquisition with an image capture device and for each of the one or more users, in response to a user input, acquiring at least one image at the location of interest. The method of the preferred embodiment can also include for each of the one or more users, in response to the acquisition of at least one image, providing the user with feedback to ensure a complete acquisition of the virtual or augmented reality scene; and receiving at a remote database, from each of the one or more users, one or more VAR scenes. One variation of the method of the preferred embodiment can include providing game mechanics to promote proper image acquisition and promote competition between users.
    Type: Application
    Filed: December 30, 2020
    Publication date: April 22, 2021
    Inventors: Benjamin Zeis Newhouse, Terrence Edward McArdle, Amir Akbarzadeh
  • Publication number: 20210063199
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: August 31, 2020
    Publication date: March 4, 2021
    Inventors: Amir Akbarzadeh, David Nister, Ruchi Bhargava, Birgit Henke, Ivana Stojanovic, Yu Sheng
  • Publication number: 20210063200
    Abstract: An end-to-end system for data generation, map creation using the generated data, and localization to the created map is disclosed. Mapstreams—or streams of sensor data, perception outputs from deep neural networks (DNNs), and/or relative trajectory data—corresponding to any number of drives by any number of vehicles may be generated and uploaded to the cloud. The mapstreams may be used to generate map data—and ultimately a fused high definition (HD) map—that represents data generated over a plurality of drives. When localizing to the fused HD map, individual localization results may be generated based on comparisons of real-time data from a sensor modality to map data corresponding to the same sensor modality. This process may be repeated for any number of sensor modalities and the results may be fused together to determine a final fused localization result.
    Type: Application
    Filed: August 31, 2020
    Publication date: March 4, 2021
    Inventors: Michael Kroepfl, Amir Akbarzadeh, Ruchi Bhargava, Vaibhav Thukral, Neda Cvijetic, Vadim Cugunovs, David Nister, Birgit Henke, Ibrahim Eden, Youding Zhu, Michael Grabner, Ivana Stojanovic, Yu Sheng, Jeffrey Liu, Enliang Zheng, Jordan Marr, Andrew Carley
  • Patent number: 10893219
    Abstract: A preferred method of acquiring virtual or augmented reality (VAR) scenes can include at a plurality of locations of interest, providing one or more users with a predetermined pattern for image acquisition with an image capture device and for each of the one or more users, in response to a user input, acquiring at least one image at the location of interest. The method of the preferred embodiment can also include for each of the one or more users, in response to the acquisition of at least one image, providing the user with feedback to ensure a complete acquisition of the virtual or augmented reality scene; and receiving at a remote database, from each of the one or more users, one or more VAR scenes. One variation of the method of the preferred embodiment can include providing game mechanics to promote proper image acquisition and promote competition between users.
    Type: Grant
    Filed: September 27, 2019
    Date of Patent: January 12, 2021
    Assignee: DROPBOX, INC.
    Inventors: Benjamin Zeis Newhouse, Terrence Edward McArdle, Amir Akbarzadeh
  • Publication number: 20200324795
    Abstract: In various examples, training sensor data generated by one or more sensors of autonomous machines may be localized to high definition (HD) map data to augment and/or generate ground truth data—e.g., automatically, in embodiments. The ground truth data may be associated with the training sensor data for training one or more deep neural networks (DNNs) to compute outputs corresponding to autonomous machine operations—such as object or feature detection, road feature detection and classification, wait condition identification and classification, etc. As a result, the HD map data may be leveraged during training such that the DNNs—in deployment—may aid autonomous machines in navigating environments safely without relying on HD map data to do so.
    Type: Application
    Filed: April 3, 2020
    Publication date: October 15, 2020
    Inventors: Mariusz Bojarski, Urs Muller, Bernhard Firner, Amir Akbarzadeh
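    A simplified sketch of the ground-truth generation idea in the abstract above: once training sensor data is localized to the HD map, map features near the localized pose are transformed into the sensor frame and emitted as labels for DNN training. The 2D rigid transform, feature format, and range cutoff are assumptions for the example.

      import math

      def map_features_to_labels(pose, map_features, max_range=50.0):
          """pose: (x, y, yaw_radians) in the map frame. map_features: list of dicts
          like {"xy": (x, y), "cls": "lane_line"}. Returns sensor-frame labels."""
          px, py, yaw = pose
          cos_y, sin_y = math.cos(-yaw), math.sin(-yaw)
          labels = []
          for f in map_features:
              dx, dy = f["xy"][0] - px, f["xy"][1] - py
              local = (dx * cos_y - dy * sin_y, dx * sin_y + dy * cos_y)
              if math.hypot(*local) <= max_range:
                  labels.append({"xy": local, "cls": f["cls"]})
          return labels

      features = [{"xy": (105.0, 52.0), "cls": "lane_line"},
                  {"xy": (400.0, 300.0), "cls": "sign"}]
      print(map_features_to_labels((100.0, 50.0, 0.0), features))  # only the nearby lane line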
  • Publication number: 20200045241
    Abstract: A preferred method of acquiring virtual or augmented reality (VAR) scenes can include at a plurality of locations of interest, providing one or more users with a predetermined pattern for image acquisition with an image capture device and for each of the one or more users, in response to a user input, acquiring at least one image at the location of interest. The method of the preferred embodiment can also include for each of the one or more users, in response to the acquisition of at least one image, providing the user with feedback to ensure a complete acquisition of the virtual or augmented reality scene; and receiving at a remote database, from each of the one or more users, one or more VAR scenes. One variation of the method of the preferred embodiment can include providing game mechanics to promote proper image acquisition and promote competition between users.
    Type: Application
    Filed: September 27, 2019
    Publication date: February 6, 2020
    Inventors: Benjamin Zeis Newhouse, Terrence Edward McArdle, Amir Akbarzadeh