Patents by Inventor PETER ONDRUSKA

PETER ONDRUSKA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972576
    Abstract: Examples disclosed herein may involve a computing system that is operable to (i) receive one or more images related to a global map having a plurality of overlapping map segments, wherein each of the plurality of overlapping map segments overlaps with one or more neighboring map segments, (ii) based on a preliminary location determination for the one or more images, identify at least a first overlapping map segment of the plurality of overlapping map segments that corresponds to the one or more images, (iii) generate a reconstruction of the first identified overlapping map segment based on the one or more images, and (iv) fuse the generated reconstruction of the first identified overlapping map segment together with the first identified overlapping map segment's one or more neighboring map segments based on overlapping map portions between the generated reconstruction and the first identified overlapping map segment's one or more neighboring map segments.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: April 30, 2024
    Assignee: Lyft, Inc.
    Inventors: Peter Ondruska, Luca Del Pero, Ivan Katanic
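The entry above describes selecting the overlapping map segment that matches a preliminary location, reconstructing it from images, and fusing the reconstruction with neighbouring segments via their overlapping portions. Below is a minimal Python sketch of that flow; the MapSegment structure, nearest-centre segment selection, and mean-offset "fusion" over shared landmarks are illustrative assumptions, not the patented method.

```python
# Illustrative only: hypothetical names and a toy translation-only "fusion".
from dataclasses import dataclass, field

import numpy as np


@dataclass
class MapSegment:
    segment_id: str
    center: np.ndarray                             # rough 2-D centre of the segment
    landmarks: dict = field(default_factory=dict)  # landmark id -> 2-D position
    neighbors: list = field(default_factory=list)  # ids of overlapping neighbour segments


def select_segment(segments, preliminary_location):
    """Pick the segment whose centre is closest to the preliminary location."""
    return min(segments, key=lambda s: np.linalg.norm(s.center - np.asarray(preliminary_location)))


def fuse_with_neighbor(reconstruction, neighbor):
    """Shift a fresh reconstruction so its landmarks agree with a neighbouring segment.

    Landmarks present in both maps form the overlapping portion; the "fusion" here
    is simply the mean offset over that overlap, applied to the reconstruction.
    """
    shared = reconstruction.keys() & neighbor.landmarks.keys()
    if not shared:
        return reconstruction
    offset = np.mean([neighbor.landmarks[k] - reconstruction[k] for k in shared], axis=0)
    return {k: v + offset for k, v in reconstruction.items()}
```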
  • Publication number: 20240118083
    Abstract: The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real-time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
    Type: Application
    Filed: August 7, 2023
    Publication date: April 11, 2024
    Applicant: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky
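The localisation flow in the entry above (device sends sensor data to a server, receives localisation data back, and optionally combines it with its own estimate) can be illustrated with a short sketch. The server_localise stand-in and the plain averaging of the local and remote estimates are assumptions for the example, not the claimed method.

```python
# Minimal sketch, assuming a toy "server" and simple averaging for the optional
# combination step; all function names are hypothetical.
import numpy as np


def server_localise(sensor_data):
    """Stand-in for the remote service: return a coarse position from the sensor data."""
    return np.mean(np.asarray(sensor_data, dtype=float), axis=0)


def localise_device(sensor_data, local_estimate=None):
    """Send sensor data to the 'server', then refine with the local estimate if present."""
    server_position = server_localise(sensor_data)       # localisation data from the server
    if local_estimate is None:
        return server_position
    # Optional step from the abstract: use the device-side estimate together with
    # the received localisation data (here, a plain average as a placeholder).
    return 0.5 * (server_position + np.asarray(local_estimate, dtype=float))


readings = [[1.0, 2.0], [1.2, 2.1], [0.9, 1.9]]
print(localise_device(readings, local_estimate=[1.1, 2.0]))
```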
  • Patent number: 11954151
    Abstract: Embodiments are disclosed for using natural language processing (NLP) to manage security video data. A method of using NLP to search security video data includes receiving, by a surveillance video query system, a text query. A query embedding corresponding to the text query is obtained using a text query model. One or more matching frame embeddings that match the query embedding are identified in a vector database. Matching surveillance video data corresponding to the one or more matching frame embeddings is then obtained from a surveillance video data store. The matching surveillance video data is returned in response to receipt of the text query.
    Type: Grant
    Filed: September 6, 2023
    Date of Patent: April 9, 2024
    Assignee: Coram AI, Inc.
    Inventors: Ashesh Jain, Peter Ondruska, Yawei Ye, Qiangui Huang
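The search flow above (embed a text query, look up nearby frame embeddings, return the matching video) can be sketched in a few lines. The character-count embedder and the in-memory arrays below are toy stand-ins for the text query model and the vector database named in the abstract.

```python
# Minimal sketch of embedding-based retrieval; the embedder is a toy, not a real model.
import numpy as np


def embed_text(text, dim=8):
    """Toy embedder: fold character codes into a fixed-size unit vector."""
    v = np.zeros(dim)
    for i, ch in enumerate(text.lower()):
        v[i % dim] += ord(ch)
    norm = np.linalg.norm(v)
    return v / norm if norm else v


def search_frames(query, frame_embeddings, frame_to_clip, top_k=2):
    """Return the clips whose frame embeddings best match the query embedding."""
    q = embed_text(query)
    scores = frame_embeddings @ q                # cosine similarity (all vectors unit length)
    best = np.argsort(scores)[::-1][:top_k]      # indices of the best-matching frames
    return [frame_to_clip[i] for i in best]


clips = ["person at the front door", "empty parking lot", "delivery truck arriving"]
frame_embeddings = np.stack([embed_text(c) for c in clips])
print(search_frames("delivery truck", frame_embeddings, clips, top_k=1))
```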
  • Patent number: 11790548
    Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
    Type: Grant
    Filed: November 30, 2022
    Date of Patent: October 17, 2023
    Assignee: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
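A voting-based triangulation of the kind described above can be illustrated on a 2-D ground plane: each camera detection casts a ray, rays vote into a coarse grid, and cells collecting votes from several rays are taken as object positions. The grid size, ray-marching step, and vote threshold below are illustrative assumptions.

```python
# Minimal 2-D sketch of voting-based triangulation; parameters are illustrative.
import numpy as np


def vote_for_objects(rays, grid_size=50, cell=1.0, min_votes=2):
    """rays: iterable of (origin, direction) pairs in ground-plane coordinates."""
    grid = np.zeros((grid_size, grid_size), dtype=int)
    for origin, direction in rays:
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        visited = set()
        for t in np.arange(0.0, grid_size * cell, cell / 2.0):   # march along the ray
            x, y = np.asarray(origin, dtype=float) + t * d
            i, j = int(x // cell), int(y // cell)
            if 0 <= i < grid_size and 0 <= j < grid_size:
                visited.add((i, j))
        for i, j in visited:                                     # one vote per cell per ray
            grid[i, j] += 1
    # Cells seen by at least min_votes rays are treated as static object positions.
    return (np.argwhere(grid >= min_votes) + 0.5) * cell


rays = [((0.0, 0.0), (1.0, 1.0)), ((11.0, 0.0), (-1.0, 1.0))]    # two detections of one object
print(vote_for_objects(rays))
```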
  • Patent number: 11761766
    Abstract: The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real-time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
    Type: Grant
    Filed: December 21, 2017
    Date of Patent: September 19, 2023
    Assignee: Blue Vision Labs UK Limited
    Inventors: Peter Ondruska, Lukas Platinsky
  • Patent number: 11731652
    Abstract: Systems, methods, and non-transitory computer-readable media can determine contextual information associated with an environment including a vehicle and at least one agent for generating a computer simulation based on the environment. One or more behavior constraints for the at least one agent can be determined based on the contextual information. A set of trajectories can be generated based on the one or more behavior constraints. A trajectory can be selected from the set of trajectories based on determining that the trajectory satisfies one or more predetermined criteria. The computer simulation can be generated, wherein the computer simulation includes monitoring driving behavior of the vehicle in response to the vehicle interacting with the at least one agent based on the selected trajectory.
    Type: Grant
    Filed: December 15, 2020
    Date of Patent: August 22, 2023
    Assignee: Woven Planet North America, Inc.
    Inventors: David Andrew Dolben, Chih Chi Hu, Peter Ondruska
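The simulation setup above (derive behaviour constraints from context, generate candidate trajectories for an agent, and keep one that satisfies predetermined criteria) is sketched below. The straight-line candidates, the speed-limit constraint, and the clearance criterion are stand-ins chosen for the example.

```python
# Minimal sketch; constraints and selection criteria are illustrative assumptions.
import numpy as np


def generate_trajectories(start, max_speed, n_candidates=5, steps=10, dt=0.1, seed=0):
    """Straight-line candidates whose speed respects the behaviour constraint."""
    rng = np.random.default_rng(seed)
    candidates = []
    for _ in range(n_candidates):
        heading = rng.uniform(0.0, 2.0 * np.pi)
        speed = rng.uniform(0.0, max_speed)
        velocity = speed * np.array([np.cos(heading), np.sin(heading)])
        times = dt * np.arange(1, steps + 1)[:, None]
        candidates.append(np.asarray(start, dtype=float) + times * velocity)
    return candidates


def select_trajectory(candidates, vehicle_position, min_clearance=2.0):
    """Return the first candidate that keeps a minimum distance from the simulated vehicle."""
    for trajectory in candidates:
        clearance = np.min(np.linalg.norm(trajectory - np.asarray(vehicle_position), axis=1))
        if clearance >= min_clearance:
            return trajectory
    return None


agent_trajectory = select_trajectory(generate_trajectories((0.0, 0.0), max_speed=5.0), (10.0, 0.0))
print(agent_trajectory[-1])                  # final position of the selected agent trajectory
```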
  • Publication number: 20230186499
    Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
    Type: Application
    Filed: November 30, 2022
    Publication date: June 15, 2023
    Applicant: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
  • Publication number: 20230175862
    Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
    Type: Application
    Filed: December 9, 2022
    Publication date: June 8, 2023
    Applicant: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky
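The mapping pipeline above (differences between sequential sensor readings give relative device poses, landmark observations are expressed relative to the device, and repeated observations are correlated into global landmark positions) is sketched below in a translation-only form. Rotation, noise handling, and full pose-graph optimisation are deliberately omitted; names are hypothetical.

```python
# Translation-only sketch, assuming 2-D motion and perfect relative measurements.
import numpy as np


def global_landmark_positions(relative_motions, landmark_observations):
    """relative_motions: per-frame 2-D deltas between sequential sensor readings.
    landmark_observations: (frame_index, landmark_id, position relative to the device)."""
    # Device pose at each frame: cumulative composition of the sequential differences.
    poses = np.vstack([np.zeros(2), np.cumsum(np.asarray(relative_motions, dtype=float), axis=0)])
    sightings = {}
    for frame, landmark_id, relative_position in landmark_observations:
        world = poses[frame] + np.asarray(relative_position, dtype=float)
        sightings.setdefault(landmark_id, []).append(world)
    # Correlate repeated sightings of the same landmark by averaging them.
    return {landmark_id: np.mean(points, axis=0) for landmark_id, points in sightings.items()}


motions = [(1.0, 0.0), (1.0, 0.0), (0.0, 1.0)]               # odometry-style deltas
observations = [(0, "lamp", (2.0, 1.0)), (3, "lamp", (0.0, 0.0))]
print(global_landmark_positions(motions, observations))       # "lamp" near (2, 1)
```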
  • Patent number: 11634161
    Abstract: In one embodiment, a method includes determining an initial cost volume associated with a plurality of potential trajectories of a vehicle in an environment based on a set of movement restrictions of the vehicle, generating a delta cost volume using the initial cost volume and environment data associated with the environment, wherein the delta cost volume is generated by determining adjustments to the initial cost volume that incorporate observed driving behavior, and scoring a trajectory of the plurality of potential trajectories for the vehicle based on the initial cost volume and the delta cost volume.
    Type: Grant
    Filed: June 24, 2020
    Date of Patent: April 25, 2023
    Assignee: Woven Planet North America, Inc.
    Inventors: Tsung-Han Lin, Sammy Omari, Peter Ondruska, Matthew Swaner Vitelli
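The scoring step above can be shown with small grids: an initial cost volume encodes the movement restrictions, a delta cost volume encodes adjustments learned from observed driving, and a trajectory is scored against their combination. The grid shapes and the simple additive combination below are assumptions for the example.

```python
# Minimal sketch; the additive combination and the toy grids are illustrative.
import numpy as np


def score_trajectory(initial_cost, delta_cost, trajectory_cells):
    """Sum the combined cost over the (row, col) cells a candidate trajectory visits."""
    combined = initial_cost + delta_cost                 # adjusted cost volume
    return float(sum(combined[r, c] for r, c in trajectory_cells))


initial = np.full((5, 5), 1.0)
initial[2, :] = 10.0                                     # movement restriction: an expensive row
delta = np.zeros((5, 5))
delta[:, 4] -= 0.5                                       # observed behaviour favours the last column
print(score_trajectory(initial, delta, [(0, 0), (1, 1), (1, 2), (1, 3), (0, 4)]))
```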
  • Patent number: 11549817
    Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
    Type: Grant
    Filed: October 8, 2020
    Date of Patent: January 10, 2023
    Assignee: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky
  • Patent number: 11538182
    Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
    Type: Grant
    Filed: June 4, 2020
    Date of Patent: December 27, 2022
    Assignee: Blue Vision Labs UK Limited
    Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
  • Patent number: 11503428
    Abstract: Systems, methods, and non-transitory computer-readable media can receive a plurality of localization requests from a plurality of devices, each of the plurality of localization requests comprising sensor data captured by one or more sensors of the plurality of devices. Localization data can be sent to each device of the plurality of devices in response to receiving the plurality of localization requests. A plurality of pose data can be received from a first device and a second device of the plurality of devices. The plurality of pose data can include a position and orientation for each of the first and second devices based on the sensor data and the received localization data. At least one received pose data of the plurality of received pose data can be sent to at least the first device of the plurality of devices.
    Type: Grant
    Filed: October 2, 2020
    Date of Patent: November 15, 2022
    Assignee: Blue Vision Labs UK Limited
    Inventors: Peter Ondruska, Lukas Platinsky
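The server-side exchange above (answer localisation requests from many devices, collect the pose data each device reports back, and forward other devices' poses to a requesting device) is sketched as a small in-memory class. The dictionary store and the placeholder localisation response are assumptions for the example.

```python
# Minimal in-memory sketch; the localisation response here is a placeholder value.
class LocalizationServer:
    def __init__(self):
        self.poses = {}                      # device_id -> (position, orientation)

    def handle_localization_request(self, device_id, sensor_data):
        """Return localisation data for the request (placeholder: a fixed map anchor)."""
        return {"map_anchor": (0.0, 0.0), "for_device": device_id}

    def report_pose(self, device_id, position, orientation):
        """Store the pose the device derived from its sensors and the localisation data."""
        self.poses[device_id] = (position, orientation)

    def poses_for(self, device_id):
        """Send at least one other device's pose data back to the requesting device."""
        return {other: pose for other, pose in self.poses.items() if other != device_id}


server = LocalizationServer()
server.handle_localization_request("car-1", sensor_data=[0.1, 0.2, 0.3])
server.report_pose("car-1", position=(1.0, 2.0), orientation=90.0)
server.report_pose("car-2", position=(3.0, 4.0), orientation=180.0)
print(server.poses_for("car-1"))             # car-1 receives car-2's reported pose
```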
  • Publication number: 20220185323
    Abstract: Systems, methods, and non-transitory computer-readable media can determine contextual information associated with an environment including a vehicle and at least one agent for generating a computer simulation based on the environment. One or more behavior constraints for the at least one agent can be determined based on the contextual information. A set of trajectories can be generated based on the one or more behavior constraints. A trajectory can be selected from the set of trajectories based on determining that the trajectory satisfies one or more predetermined criteria. The computer simulation can be generated, wherein the computer simulation includes monitoring driving behavior of the vehicle in response to the vehicle interacting with the at least one agent based on the selected trajectory.
    Type: Application
    Filed: December 15, 2020
    Publication date: June 16, 2022
    Applicant: Woven Planet North America, Inc.
    Inventors: David Andrew Dolben, Chih Chi Hu, Peter Ondruska
  • Patent number: 11299151
    Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to substantially continuously tracking and estimating the future movements of an observed object. As an example, an observed object may be a moving vehicle, for example along a path or road. Aspects and/or embodiments seek to provide an end-to-end method and system for substantially continuously tracking and predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
    Type: Grant
    Filed: April 29, 2020
    Date of Patent: April 12, 2022
    Assignee: Woven Planet North America, Inc.
    Inventors: Peter Ondruska, Lukas Platinsky, Suraj Mannakunnel Surendran
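Prediction with a map-derived motion prior, as outlined above, can be sketched by blending a constant-velocity extrapolation with a prior path taken from map data (for example, a lane centreline). The blending weight and the hand-written prior below are assumptions, not the patented model.

```python
# Minimal sketch: constant-velocity extrapolation blended with a map prior.
import numpy as np


def predict_with_prior(position, velocity, prior_path, steps=5, dt=0.5, prior_weight=0.5):
    position = np.asarray(position, dtype=float)
    velocity = np.asarray(velocity, dtype=float)
    prior_path = np.asarray(prior_path, dtype=float)
    predictions = []
    for step in range(1, steps + 1):
        extrapolated = position + velocity * dt * step                  # where the motion says it goes
        prior_point = prior_path[min(step, len(prior_path) - 1)]        # where the map says traffic goes
        predictions.append((1.0 - prior_weight) * extrapolated + prior_weight * prior_point)
    return np.array(predictions)


lane_centreline = [(0.0, 0.0), (2.0, 0.5), (4.0, 2.0), (6.0, 4.5), (8.0, 8.0)]
print(predict_with_prior(position=(0.0, 0.0), velocity=(2.0, 0.0), prior_path=lane_centreline))
```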
  • Publication number: 20210407069
    Abstract: Examples disclosed herein may involve a computing system that is operable to (i) receive image data captured from one or more devices, (ii) based on the image data, generate at least two batches of data corresponding to an area of a global map, wherein each batch of data comprises (a) a respective group of images from the received image data, and (b) one or more common images comprising one or more common visual features, (iii) generate a respective reconstruction of the area of the global map for each of the at least two batches of data, and (iv) fuse the respective reconstructions of the area of the global map using the one or more common visual features from the one or more common images.
    Type: Application
    Filed: June 30, 2020
    Publication date: December 30, 2021
    Inventors: Peter Ondruska, Luca Del Pero, Ivan Katanic
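The batching scheme above can be illustrated directly: split the received images into batches and add the same few "common" images to every batch, so that the per-batch reconstructions share visual features and can later be fused through them. The batch size and the choice of shared images below are assumptions.

```python
# Minimal sketch of building batches that share common images; reconstruction and
# fusion themselves are out of scope here.
def make_batches(images, common_images, batch_size=100):
    """Each batch holds its own slice of images plus the shared common images."""
    batches = []
    for start in range(0, len(images), batch_size):
        batches.append(list(images[start:start + batch_size]) + list(common_images))
    return batches


frames = [f"frame_{i:04d}.jpg" for i in range(250)]
shared = ["common_a.jpg", "common_b.jpg"]            # visual features seen by every batch
for batch in make_batches(frames, shared):
    print(len(batch), batch[-2:])                    # every batch ends with the shared images
```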
  • Publication number: 20210407101
    Abstract: Examples disclosed herein may involve a computing system that is operable to (i) receive one or more images related to a global map having a plurality of overlapping map segments, wherein each of the plurality of overlapping map segments overlaps with one or more neighboring map segments, (ii) based on a preliminary location determination for the one or more images, identify at least a first overlapping map segment of the plurality of overlapping map segments that corresponds to the one or more images, (iii) generate a reconstruction of the first identified overlapping map segment based on the one or more images, and (iv) fuse the generated reconstruction of the first identified overlapping map segment together with the first identified overlapping map segment's one or more neighboring map segments based on overlapping map portions between the generated reconstruction and the first identified overlapping map segment's one or more neighboring map segments.
    Type: Application
    Filed: June 30, 2020
    Publication date: December 30, 2021
    Inventors: Peter Ondruska, Luca Del Pero, Ivan Katanic
  • Publication number: 20210403045
    Abstract: In one embodiment, a method includes determining an initial cost volume associated with a plurality of potential trajectories of a vehicle in an environment based on a set of movement restrictions of the vehicle, generating a delta cost volume using the initial cost volume and environment data associated with the environment, wherein the delta cost volume is generated by determining adjustments to the initial cost volume that incorporate observed driving behavior, and scoring a trajectory of the plurality of potential trajectories for the vehicle based on the initial cost volume and the delta cost volume.
    Type: Application
    Filed: June 24, 2020
    Publication date: December 30, 2021
    Applicant: Woven Planet North America, Inc.
    Inventors: Tsung-Han Lin, Sammy Omari, Peter Ondruska, Matthew Swaner Vitelli
  • Patent number: 11205298
    Abstract: There is provided a method for creating a voxel occupancy model. The voxel occupancy model is representative of a region of space which can be described using a three-dimensional voxel array. The region of space contains at least part of an object. The method comprises receiving first image data, the first image data being representative of a first view of the at least part of an object and comprising first image location data, and receiving second image data, the second image data being representative of a second view of the at least part of an object and comprising second image location data. The method also comprises determining a first descriptor, the first descriptor describing a property of a projection of a first voxel of the voxel array in the first image data, and determining a second descriptor, the second descriptor describing a property of a projection of the first voxel in the second image data.
    Type: Grant
    Filed: October 3, 2019
    Date of Patent: December 21, 2021
    Assignee: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky
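The per-voxel check described above (project a voxel into two located images, describe each projection, and compare the descriptors) is sketched below with the simplest possible choices: an axis-aligned pinhole projection and raw pixel intensity as the descriptor. Both are assumptions made for the example.

```python
# Minimal sketch; the projection model and the intensity 'descriptor' are toy choices.
import numpy as np


def project(voxel_xyz, camera_position, focal=100.0, image_size=64):
    """Project a 3-D point into pixel coordinates of an axis-aligned pinhole camera."""
    rel = np.asarray(voxel_xyz, dtype=float) - np.asarray(camera_position, dtype=float)
    u = focal * rel[0] / rel[2] + image_size / 2
    v = focal * rel[1] / rel[2] + image_size / 2
    return int(np.clip(v, 0, image_size - 1)), int(np.clip(u, 0, image_size - 1))


def voxel_occupied(voxel_xyz, image_a, pose_a, image_b, pose_b, threshold=0.1):
    """Compare the two projections' descriptors; agreement suggests the voxel is occupied."""
    row_a, col_a = project(voxel_xyz, pose_a)
    row_b, col_b = project(voxel_xyz, pose_b)
    descriptor_a = float(image_a[row_a, col_a])      # first descriptor: intensity in view A
    descriptor_b = float(image_b[row_b, col_b])      # second descriptor: intensity in view B
    return abs(descriptor_a - descriptor_b) < threshold


image_a = np.full((64, 64), 0.40)
image_b = np.full((64, 64), 0.45)
print(voxel_occupied((0.0, 0.0, 5.0), image_a, (0.0, 0.0, 0.0), image_b, (0.5, 0.0, 0.0)))
```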
  • Publication number: 20210140773
    Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
    Type: Application
    Filed: October 8, 2020
    Publication date: May 13, 2021
    Applicant: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky
  • Publication number: 20210144513
    Abstract: Systems, methods, and non-transitory computer-readable media can receive a plurality of localization requests from a plurality of devices, each of the plurality of localization requests comprising sensor data captured by one or more sensors of the plurality of devices. Localization data can be sent to each device of the plurality of devices in response to receiving the plurality of localization requests. A plurality of pose data can be received from a first device and a second device of the plurality of devices. The plurality of pose data can include a position and orientation for each of the first and second devices based on the sensor data and the received localization data. At least one received pose data of the plurality of received pose data can be sent to at least the first device of the plurality of devices.
    Type: Application
    Filed: October 2, 2020
    Publication date: May 13, 2021
    Applicant: BLUE VISION LABS UK LIMITED
    Inventors: Peter Ondruska, Lukas Platinsky