Patents by Inventor Lukas Platinsky
Lukas Platinsky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12139164
Abstract: Examples disclosed herein involve a computing system configured to (i) obtain first sensor data captured by a first sensor of a vehicle during a given period of operation of the vehicle, (ii) obtain second sensor data captured by a second sensor of the vehicle during the given period of operation of the vehicle, (iii) based on the first sensor data, localize the first sensor within a first coordinate frame of a first map layer, (iv) based on the second sensor data, localize the second sensor within a second coordinate frame of a second map layer, (v) based on a known transformation between the first coordinate frame and the second coordinate frame, determine respective poses for the first sensor and the second sensor in a common coordinate frame, and (vi) determine (a) a translation and (b) a rotation between the respective poses for the first and second sensors in the common coordinate frame.
Type: Grant
Filed: December 18, 2020
Date of Patent: November 12, 2024
Assignee: Lyft, Inc.
Inventors: Lei Zhang, Li Jiang, Lukas Platinsky, Karim Tarek Mahmoud Elsayed Ahmed Shaban
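The final step of this abstract, determining a translation and rotation between two sensor poses expressed in a common coordinate frame, reduces to composing homogeneous transforms. Below is a minimal sketch of that step only, assuming 4x4 pose matrices; the function names and example poses are illustrative, not part of the patent.

```python
# Sketch: given two sensor poses already expressed in a common coordinate frame,
# recover the translation and rotation between them. Pose representation
# (4x4 homogeneous matrices, sensor -> common frame) is an assumption.
import numpy as np

def relative_translation_rotation(pose_a: np.ndarray, pose_b: np.ndarray):
    """Return (translation vector, rotation matrix) taking sensor A to sensor B."""
    # Relative transform expressed in sensor A's frame.
    rel = np.linalg.inv(pose_a) @ pose_b
    rotation = rel[:3, :3]      # 3x3 rotation between the two sensor poses
    translation = rel[:3, 3]    # 3-vector offset between the two sensor poses
    return translation, rotation

# Example with made-up poses: sensor B sits 1.5 m ahead of sensor A,
# rotated 90 degrees about the vertical axis.
pose_a = np.eye(4)
theta = np.pi / 2
pose_b = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 1.5],
    [np.sin(theta),  np.cos(theta), 0.0, 0.0],
    [0.0,            0.0,           1.0, 0.0],
    [0.0,            0.0,           0.0, 1.0],
])
t, R = relative_translation_rotation(pose_a, pose_b)
print(t, np.degrees(np.arctan2(R[1, 0], R[0, 0])))   # -> [1.5 0. 0.] 90.0
```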
-
Patent number: 12111178
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Grant
Filed: December 9, 2022
Date of Patent: October 8, 2024
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
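The chaining described here, turning relative device poses derived from sequential sensor data into global landmark positions, can be illustrated with a short sketch. This assumes 2D poses as 3x3 homogeneous transforms and invented example data; it is not the patented mapping pipeline.

```python
# Sketch: compose relative device poses into global poses, then map landmark
# observations (given relative to the device) into the global map frame.
import numpy as np

def se2(x, y, theta):
    """3x3 homogeneous transform for a 2D pose."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

# Relative poses between consecutive frames (e.g. from differences in sequential sensor data).
relative_poses = [se2(1.0, 0.0, 0.0), se2(1.0, 0.0, np.pi / 2), se2(0.5, 0.0, 0.0)]

# Landmark observations: (frame index, position relative to the device at that frame).
observations = [(1, np.array([0.0, 2.0])), (3, np.array([1.0, 0.0]))]

# Chain relative poses into global device poses.
global_poses = [np.eye(3)]
for rel in relative_poses:
    global_poses.append(global_poses[-1] @ rel)

# Transform relative landmark observations into the global map frame.
for frame_idx, rel_pos in observations:
    p = global_poses[frame_idx] @ np.array([rel_pos[0], rel_pos[1], 1.0])
    print(f"landmark seen at frame {frame_idx} -> global position {p[:2]}")
```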
-
Publication number: 20240118083
Abstract: The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real-time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors, comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
Type: Application
Filed: August 7, 2023
Publication date: April 11, 2024
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
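A rough sketch of the request/response flow in this abstract follows, assuming a JSON-over-HTTP interface, which the abstract does not specify. The endpoint URL, field names, and the simple blend of server and local estimates are all hypothetical.

```python
# Sketch of the client-side flow: send a portion of sensor data to a server,
# receive localisation data back, and optionally refine it with a local estimate.
import json
from urllib import request

SERVER_URL = "https://localisation.example.com/localise"  # hypothetical endpoint

def localise(sensor_payload, local_estimate=None):
    """Send sensor data to the server and return an (x, y, z) location."""
    req = request.Request(
        SERVER_URL,
        data=json.dumps({"sensor_data": sensor_payload,
                         "prior_estimate": local_estimate}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        localisation_data = json.load(resp)   # assumed shape: {"pose": [x, y, z]}

    server_pose = tuple(localisation_data["pose"])
    if local_estimate is None:
        return server_pose
    # Crude weighted blend of server result and local estimate,
    # standing in for a real fusion filter.
    return tuple(0.8 * s + 0.2 * l for s, l in zip(server_pose, local_estimate))
```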
-
Patent number: 11790548
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Grant
Filed: November 30, 2022
Date of Patent: October 17, 2023
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
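A voting-based triangulation of noisy 2D detections can be sketched as follows: back-projected rays from posed cameras are intersected pairwise, candidate 3D points vote into a coarse spatial grid, and well-supported cells are kept. This is an illustrative approximation, not the patented method.

```python
# Sketch: pairwise ray intersection plus grid voting to locate static objects.
import itertools
from collections import Counter
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays (origin, unit direction)."""
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # near-parallel rays: skip this pair
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

def vote_for_objects(rays, cell_size=1.0, min_votes=3):
    """rays: list of (origin, unit direction) back-projected from 2D detections."""
    votes = Counter()
    for (o1, d1), (o2, d2) in itertools.combinations(rays, 2):
        p = closest_point_between_rays(o1, d1, o2, d2)
        if p is not None:
            votes[tuple(np.floor(p / cell_size).astype(int))] += 1
    # Keep grid cells supported by enough independent ray pairs.
    return [cell for cell, n in votes.items() if n >= min_votes]
```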
-
Patent number: 11761766
Abstract: The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real-time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors, comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
Type: Grant
Filed: December 21, 2017
Date of Patent: September 19, 2023
Assignee: Blue Vision Labs UK Limited
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20230186499
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Application
Filed: November 30, 2022
Publication date: June 15, 2023
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
-
Publication number: 20230175862
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Application
Filed: December 9, 2022
Publication date: June 8, 2023
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Patent number: 11549817
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Grant
Filed: October 8, 2020
Date of Patent: January 10, 2023
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Patent number: 11538182
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Grant
Filed: June 4, 2020
Date of Patent: December 27, 2022
Assignee: Blue Vision Labs UK Limited
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
-
Patent number: 11503428
Abstract: Systems, methods, and non-transitory computer-readable media can receive a plurality of localization requests from a plurality of devices, each of the plurality of localization requests comprising sensor data captured by one or more sensors of the plurality of devices. Localization data can be sent to each device of the plurality of devices in response to receiving the plurality of localization requests. A plurality of pose data can be received from a first device and a second device of the plurality of devices. The plurality of pose data can include a position and orientation for each of the first and second devices based on the sensor data and the received localization data. At least one received pose data of the plurality of received pose data can be sent to at least the first device of the plurality of devices.
Type: Grant
Filed: October 2, 2020
Date of Patent: November 15, 2022
Assignee: Blue Vision Labs UK Limited
Inventors: Peter Ondruska, Lukas Platinsky
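The server-side flow described in this abstract (collect localization requests, return localization data, gather reported poses, and share poses between devices) is sketched below as a single-process Python class. All names and data structures are hypothetical.

```python
# Sketch: an in-memory stand-in for the pose-sharing server described above.
from dataclasses import dataclass, field

@dataclass
class Pose:
    device_id: str
    position: tuple        # (x, y, z) in the shared map frame
    orientation: tuple     # quaternion (w, x, y, z)

@dataclass
class PoseSharingServer:
    poses: dict = field(default_factory=dict)   # latest reported pose per device

    def handle_localisation_request(self, device_id: str, sensor_data: bytes) -> dict:
        # Placeholder for matching the device's sensor data against a server-side map.
        return {"map_id": "city-centre", "reference_landmarks": []}

    def report_pose(self, pose: Pose) -> None:
        self.poses[pose.device_id] = pose

    def poses_for(self, device_id: str) -> list:
        # Send at least one received pose (from other devices) back to the requester.
        return [p for dev, p in self.poses.items() if dev != device_id]

server = PoseSharingServer()
server.report_pose(Pose("car-1", (0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0)))
server.report_pose(Pose("car-2", (5.0, 1.0, 0.0), (1.0, 0.0, 0.0, 0.0)))
print(server.poses_for("car-1"))   # car-1 receives car-2's pose
```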
-
Publication number: 20220194412
Abstract: Examples disclosed herein involve a computing system configured to (i) obtain first sensor data captured by a first sensor of a vehicle during a given period of operation of the vehicle, (ii) obtain second sensor data captured by a second sensor of the vehicle during the given period of operation of the vehicle, (iii) based on the first sensor data, localize the first sensor within a first coordinate frame of a first map layer, (iv) based on the second sensor data, localize the second sensor within a second coordinate frame of a second map layer, (v) based on a known transformation between the first coordinate frame and the second coordinate frame, determine respective poses for the first sensor and the second sensor in a common coordinate frame, and (vi) determine (a) a translation and (b) a rotation between the respective poses for the first and second sensors in the common coordinate frame.
Type: Application
Filed: December 18, 2020
Publication date: June 23, 2022
Inventors: Lei Zhang, Li Jiang, Lukas Platinsky, Karim Tarek Mahmoud Elsayed Ahmed Shaban
-
Patent number: 11299151
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to substantially continuously tracking and estimating the future movements of an observed object. As an example, an observed object may be a moving vehicle, for example along a path or road. Aspects and/or embodiments seek to provide an end-to-end method and system for substantially continuously tracking and predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Grant
Filed: April 29, 2020
Date of Patent: April 12, 2022
Assignee: Woven Planet North America, Inc.
Inventors: Peter Ondruska, Lukas Platinsky, Suraj Mannakunnel Surendran
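One way to picture prediction from motion priors: match a short observed track against typical paths extracted from map data and read the future movement off the best-matching prior. The sketch below uses invented priors and a naive alignment cost; it is illustrative only, not the patented prediction system.

```python
# Sketch: nearest-prior matching for trajectory prediction.
import numpy as np

def predict_from_priors(observed, priors, horizon=5):
    """observed: (T, 2) recent positions; priors: list of (N, 2) typical paths from the map."""
    best_prior, best_cost, best_idx = None, np.inf, 0
    for prior in priors:
        # Slide the observed track along the prior and keep the best alignment.
        for start in range(len(prior) - len(observed) - horizon):
            segment = prior[start:start + len(observed)]
            cost = np.linalg.norm(segment - observed, axis=1).mean()
            if cost < best_cost:
                best_prior, best_cost, best_idx = prior, cost, start
    if best_prior is None:
        return np.empty((0, 2))
    # The predicted future is the matched prior just past the observed portion.
    end = best_idx + len(observed)
    return best_prior[end:end + horizon]

observed = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1]])
straight = np.column_stack([np.arange(10.0), np.zeros(10)])      # prior: straight road
curving = np.column_stack([np.arange(10.0), 0.2 * np.arange(10.0) ** 2])  # prior: curve
print(predict_from_priors(observed, [straight, curving]))
```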
-
Patent number: 11205298
Abstract: There is provided a method for creating a voxel occupancy model. The voxel occupancy model is representative of a region of space which can be described using a three-dimensional voxel array. The region of space contains at least part of an object. The method comprises receiving first image data, the first image data being representative of a first view of the at least part of an object and comprising first image location data, and receiving second image data, the second image data being representative of a second view of the at least part of an object and comprising second image location data. The method also comprises determining a first descriptor, the first descriptor describing a property of a projection of a first voxel of the voxel array in the first image data, and determining a second descriptor, the second descriptor describing a property of a projection of the first voxel in the second image data.
Type: Grant
Filed: October 3, 2019
Date of Patent: December 21, 2021
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
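The descriptor comparison this abstract describes can be sketched as projecting a voxel centre into two posed images, taking a small normalised patch around each projection, and using descriptor agreement as occupancy evidence. The pinhole projection, patch descriptor, and threshold below are assumptions for illustration, not the patented model.

```python
# Sketch: per-voxel descriptor agreement across two posed views.
import numpy as np

def project(point_world, world_to_cam, K):
    """Project a 3D world point into pixel coordinates with a pinhole camera."""
    p_cam = world_to_cam @ np.append(point_world, 1.0)
    if p_cam[2] <= 0:
        return None                       # behind the camera
    uv = K @ (p_cam[:3] / p_cam[2])
    return uv[:2]

def patch_descriptor(image, uv, size=3):
    """Mean/std-normalised patch around the projected pixel (grayscale image assumed)."""
    u, v = int(round(uv[0])), int(round(uv[1]))
    patch = image[v - size:v + size + 1, u - size:u + size + 1].astype(float)
    return (patch - patch.mean()) / (patch.std() + 1e-6)

def voxel_occupancy_evidence(voxel_centre, views, K, threshold=0.7):
    """views: two (image, world-to-camera 4x4 pose) pairs. True if descriptors agree."""
    descriptors = []
    for image, pose in views:
        uv = project(voxel_centre, pose, K)
        if uv is None:
            return False
        descriptors.append(patch_descriptor(image, uv))
    first, second = descriptors[0].ravel(), descriptors[1].ravel()
    similarity = float(first @ second) / len(first)   # normalised cross-correlation
    return similarity > threshold
```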
-
Publication number: 20210144513
Abstract: Systems, methods, and non-transitory computer-readable media can receive a plurality of localization requests from a plurality of devices, each of the plurality of localization requests comprising sensor data captured by one or more sensors of the plurality of devices. Localization data can be sent to each device of the plurality of devices in response to receiving the plurality of localization requests. A plurality of pose data can be received from a first device and a second device of the plurality of devices. The plurality of pose data can include a position and orientation for each of the first and second devices based on the sensor data and the received localization data. At least one received pose data of the plurality of received pose data can be sent to at least the first device of the plurality of devices.
Type: Application
Filed: October 2, 2020
Publication date: May 13, 2021
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20210140773
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Application
Filed: October 8, 2020
Publication date: May 13, 2021
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20200388047
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Application
Filed: June 4, 2020
Publication date: December 10, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
-
Publication number: 20200377084
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to substantially continuously tracking and estimating the future movements of an observed object. As an example, an observed object may be a moving vehicle, for example along a path or road. Aspects and/or embodiments seek to provide an end-to-end method and system for substantially continuously tracking and predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Application
Filed: April 29, 2020
Publication date: December 3, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Suraj Mannakunnel Surendran
-
Patent number: 10845200
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Grant
Filed: October 15, 2019
Date of Patent: November 24, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Patent number: 10798526
Abstract: Systems, methods, and non-transitory computer-readable media can receive a plurality of localization requests from a plurality of devices, each of the plurality of localization requests comprising sensor data captured by one or more sensors of the plurality of devices. Localization data can be sent to each device of the plurality of devices in response to receiving the plurality of localization requests. A plurality of pose data can be received from a first device and a second device of the plurality of devices. The plurality of pose data can include a position and orientation for each of the first and second devices based on the sensor data and the received localization data. At least one received pose data of the plurality of received pose data can be sent to at least the first device of the plurality of devices.
Type: Grant
Filed: October 10, 2019
Date of Patent: October 6, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Patent number: 10706576
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large-scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Grant
Filed: July 1, 2019
Date of Patent: July 7, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias