Patents by Inventor Lukas Platinsky
Lukas Platinsky has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10706576
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Grant
Filed: July 1, 2019
Date of Patent: July 7, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Giacomo Dabisias
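The abstract describes the approach only at the level of a "voting-based triangulation technique". Purely as an illustration of how such voting can work in general, and not as the patented method, the sketch below triangulates pairs of noisy bearing rays from known positions and lets the resulting 3D points vote into a coarse spatial grid; every function name, threshold, and parameter here is an assumption.

```python
# Illustrative only: a generic voting-by-triangulation sketch, not the patented method.
import numpy as np
from collections import Counter

def triangulate_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two 3D rays (origins o*, unit directions d*)."""
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if denom < 1e-9:                       # near-parallel rays carry no depth information
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

def vote_for_objects(rays, cell=1.0, min_votes=2):
    """Accumulate pairwise triangulations in a coarse 3D grid and keep well-supported cells."""
    votes = Counter()
    for i in range(len(rays)):
        for j in range(i + 1, len(rays)):
            p = triangulate_rays(*rays[i], *rays[j])
            if p is not None:
                votes[tuple(np.floor(p / cell).astype(int))] += 1
    return [(np.array(c) + 0.5) * cell for c, n in votes.items() if n >= min_votes]

# Toy usage: three noisy sightings of a static object near (10.3, 2.2, 0.4).
rng = np.random.default_rng(0)
target = np.array([10.3, 2.2, 0.4])
rays = []
for origin in [np.zeros(3), np.array([2.0, -1.0, 0.0]), np.array([4.0, 3.0, 0.0])]:
    d = target - origin + rng.normal(0.0, 0.05, 3)   # noisy bearing towards the object
    rays.append((origin, d / np.linalg.norm(d)))
print(vote_for_objects(rays))
```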
-
Patent number: 10696300
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to the use of prior trajectories extracted from mapping data to estimate the future movement of an observed object. As an example, an observed object may be a moving vehicle. Aspects and/or embodiments seek to provide a method and system for predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Grant
Filed: July 1, 2019
Date of Patent: June 30, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Suraj Mannakunnel Surendran
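As a loose illustration of reusing prior trajectories from map data (not the patented method), the sketch below matches a short observed track against a bank of previously seen trajectories and returns the continuation of the best match as the prediction; the matching cost and all names are assumptions.

```python
# Illustrative only: a nearest-prior-trajectory lookup, not the published method.
import numpy as np

def predict_from_priors(observed, priors, horizon=5):
    """
    observed: (k, 2) array of recent positions of the tracked object.
    priors:   list of (n, 2) arrays of trajectories previously extracted from map data.
    Returns the next `horizon` points of the best-matching prior, or None.
    """
    k = len(observed)
    best, best_cost = None, np.inf
    for prior in priors:
        # Slide the observed track along each prior and score alignment by mean distance.
        for start in range(len(prior) - k - horizon + 1):
            cost = np.mean(np.linalg.norm(prior[start:start + k] - observed, axis=1))
            if cost < best_cost:
                best_cost, best = cost, prior[start + k: start + k + horizon]
    return best

# Toy usage with two hypothetical priors: one going straight on, one turning off.
straight = np.stack([np.arange(20.0), np.zeros(20)], axis=1)
turning = np.stack([np.arange(20.0), np.concatenate([np.zeros(10), np.arange(10.0)])], axis=1)
observed = np.array([[5.0, 0.1], [6.0, -0.1], [7.0, 0.0]])
print(predict_from_priors(observed, [straight, turning], horizon=3))
```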
-
Patent number: 10668921
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to substantially continuously tracking and estimating the future movements of an observed object. As an example, an observed object may be a moving vehicle, for example along a path or road. Aspects and/or embodiments seek to provide an end-to-end method and system for substantially continuously tracking and predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Grant
Filed: July 1, 2019
Date of Patent: June 2, 2020
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky, Suraj Mannakunnel Surendran
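To show the overall shape of an end-to-end track-then-predict loop, here is a deliberately simplified sketch: detections are greedily associated with existing tracks by nearest neighbour, and each track extrapolates with a constant-velocity model standing in for the map-based motion prior described in the abstract. Everything here is illustrative.

```python
# Illustrative only: greedy nearest-neighbour tracking with a constant-velocity
# predictor standing in for the map-based motion prior described in the abstract.
import numpy as np

class Track:
    """A single tracked object: an identifier plus its recent positions."""
    def __init__(self, tid, position):
        self.tid = tid
        self.history = [np.asarray(position, dtype=float)]

    def predict(self, steps=3):
        # Constant-velocity extrapolation from the last two observations;
        # a motion-prior lookup from map data could be slotted in here instead.
        if len(self.history) < 2:
            return [self.history[-1]] * steps
        v = self.history[-1] - self.history[-2]
        return [self.history[-1] + (i + 1) * v for i in range(steps)]

def update_tracks(tracks, detections, gate=2.0):
    """Associate each new detection with the nearest existing track, or start a new one."""
    for det in map(np.asarray, detections):
        dists = [np.linalg.norm(t.history[-1] - det) for t in tracks]
        if dists and min(dists) < gate:
            tracks[int(np.argmin(dists))].history.append(det)
        else:
            tracks.append(Track(len(tracks), det))
    return tracks

# Toy usage: one object moving right and one moving up, observed over three frames.
tracks = []
for frame in [[(0, 0), (10, 10)], [(1, 0), (10, 11)], [(2, 0), (10, 12)]]:
    tracks = update_tracks(tracks, frame)
for t in tracks:
    print(t.tid, [p.tolist() for p in t.predict(steps=2)])
```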
-
Publication number: 20200132461
Abstract: The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors, comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
Type: Application
Filed: December 21, 2017
Publication date: April 30, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
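The claimed steps (send sensor data, receive localisation data, optionally fuse with an on-device estimate) map naturally onto a small client helper. The sketch below is only a shape for such a client: the endpoint, field names, and fusion rule are invented for illustration and are not part of the publication.

```python
# Illustrative only: the endpoint, field names, and fusion rule are invented.
import requests  # third-party HTTP client, assumed installed

def localise(sensor_sample, local_estimate=None,
             endpoint="https://example.com/localise"):   # hypothetical server URL
    """Send a portion of sensor data to a localisation server and, optionally,
    fuse the reply with an estimate computed on the device itself."""
    payload = {"sensor_data": sensor_sample}
    if local_estimate is not None:
        payload["estimated_location"] = local_estimate    # lets the server narrow its search
    reply = requests.post(endpoint, json=payload, timeout=2.0).json()

    server_fix = reply["location"]                        # e.g. {"x": ..., "y": ..., "heading": ...}
    if local_estimate is None:
        return server_fix
    # Toy fusion: weight the server fix by the confidence it reports.
    w = reply.get("confidence", 0.5)
    return {k: w * server_fix[k] + (1 - w) * local_estimate[k] for k in server_fix}
```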
-
Publication number: 20200126289
Abstract: There is provided a method for creating a voxel occupancy model. The voxel occupancy model is representative of a region of space which can be described using a three-dimensional voxel array. The region of space contains at least part of an object. The method comprises receiving first image data, the first image data being representative of a first view of the at least part of an object and comprising first image location data, and receiving second image data, the second image data being representative of a second view of the at least part of an object and comprising second image location data. The method also comprises determining a first descriptor, the first descriptor describing a property of a projection of a first voxel of the voxel array in the first image data, and determining a second descriptor, the second descriptor describing a property of a projection of the first voxel in the second image data.
Type: Application
Filed: October 3, 2019
Publication date: April 23, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
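As a rough, illustrative reading of the two-descriptor comparison described above (not the published method), the sketch below projects a voxel centre into two views, computes a toy patch descriptor in each image, and marks the voxel occupied when the two descriptors agree; the camera model, descriptor, and threshold are all assumptions.

```python
# Illustrative only: a photo-consistency style reading of the two-descriptor test.
import numpy as np

def project(point_3d, camera):
    """Pinhole projection of a world point into pixel coordinates, given a 3x4
    pose matrix `Rt` and a 3x3 intrinsics matrix `K` (both assumed known)."""
    p = camera["K"] @ (camera["Rt"] @ np.append(point_3d, 1.0))
    return (p[:2] / p[2]).astype(int)

def patch_descriptor(image, pixel, radius=1):
    """A toy descriptor: mean intensity of a small patch around the pixel."""
    u, v = pixel
    h, w = image.shape[:2]
    if not (0 <= u < w and 0 <= v < h):
        return None
    patch = image[max(v - radius, 0):v + radius + 1, max(u - radius, 0):u + radius + 1]
    return float(patch.mean())

def voxel_occupancy(voxel_centres, view_a, view_b, agree_thresh=10.0):
    """Mark a voxel occupied when its projections into the two views look alike."""
    occupied = []
    for centre in voxel_centres:
        d_a = patch_descriptor(view_a["image"], project(centre, view_a["camera"]))
        d_b = patch_descriptor(view_b["image"], project(centre, view_b["camera"]))
        occupied.append(d_a is not None and d_b is not None and abs(d_a - d_b) < agree_thresh)
    return occupied
```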
-
Publication number: 20200080849
Abstract: The present invention relates to the efficient use of both local and remote computational resources and communication bandwidth to provide distributed environment mapping using a plurality of mobile sensor-equipped devices. According to a first aspect, there is provided a method of determining a global position of one or more landmarks on a global map, the method comprising the steps of: determining one or more differences between sequential sensor data captured by one or more moving devices; determining one or more relative localisation landmark positions with respect to the one or more moving devices; determining relative device poses based on one or more differences between sequential sensor data relative to the one or more relative localisation landmark positions; and determining a correlation between each device pose and the one or more relative localisation landmark positions.
Type: Application
Filed: October 15, 2019
Publication date: March 12, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
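The four determining steps above amount to combining per-device odometry with device-relative landmark observations. The sketch below is a naive stand-in (dead-reckoned 2D poses plus simple averaging of landmark observations across devices) rather than the joint estimation a real system would use; all structures and field names are assumptions.

```python
# Illustrative only: dead reckoning plus naive averaging, not the published joint estimation.
import numpy as np

def chain_poses(start_pose, odometry):
    """Integrate sequential relative motions (dx, dy, dtheta) into global 2D poses."""
    poses = [np.asarray(start_pose, dtype=float)]
    for dx, dy, dth in odometry:
        x, y, th = poses[-1]
        c, s = np.cos(th), np.sin(th)
        poses.append(np.array([x + c * dx - s * dy, y + s * dx + c * dy, th + dth]))
    return poses

def landmark_global_positions(device_tracks):
    """
    device_tracks: list of dicts (field names are assumptions) with keys
      'pose0'        - the device's starting pose (x, y, theta) in the global frame
      'odometry'     - per-step relative motions derived from sequential sensor data
      'observations' - (step_index, landmark_id, (dx, dy)) in the device frame
    Returns landmark_id -> averaged global (x, y).
    """
    sums, counts = {}, {}
    for track in device_tracks:
        poses = chain_poses(track["pose0"], track["odometry"])
        for step, lid, (dx, dy) in track["observations"]:
            x, y, th = poses[step]
            c, s = np.cos(th), np.sin(th)
            g = np.array([x + c * dx - s * dy, y + s * dx + c * dy])
            sums[lid] = sums.get(lid, 0.0) + g
            counts[lid] = counts.get(lid, 0) + 1
    return {lid: sums[lid] / counts[lid] for lid in sums}

# Toy usage: two devices observe the same landmark "L1" from different places.
tracks = [
    {"pose0": (0, 0, 0), "odometry": [(1, 0, 0), (1, 0, 0)],
     "observations": [(2, "L1", (3.0, 1.0))]},              # 3 m ahead, 1 m to the left
    {"pose0": (5, 5, -np.pi / 2), "odometry": [(1, 0, 0)],
     "observations": [(1, "L1", (3.0, 0.0))]},              # 3 m straight ahead
]
print(landmark_global_positions(tracks))                    # both agree on roughly (5, 1)
```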
-
Publication number: 20200068345
Abstract: The present invention relates to a method of co-localisation for multiple devices. More precisely, it relates to the use of both local and remote resources to provide substantially real time capability for individual devices to determine the relative position and/or pose of the or each individual device and other devices in three-dimensional space.
Type: Application
Filed: October 10, 2019
Publication date: February 27, 2020
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
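For the simplest case, where each device already has a pose in a shared map frame (for example from a localisation server), the relative position and pose between two devices is just a change of reference frame. Below is a minimal 2D sketch of that frame change, with all conventions assumed (the publication concerns three-dimensional space).

```python
# Illustrative only: assumes both devices are already localised in a shared map frame.
import numpy as np

def relative_pose(pose_a, pose_b):
    """Pose of device B expressed in device A's frame, given both devices'
    (x, y, theta) poses in a shared map frame; returns (dx, dy, dtheta)."""
    xa, ya, tha = pose_a
    xb, yb, thb = pose_b
    c, s = np.cos(-tha), np.sin(-tha)        # rotate the world offset into A's frame
    dx_w, dy_w = xb - xa, yb - ya
    return (c * dx_w - s * dy_w, s * dx_w + c * dy_w, thb - tha)

# Toy usage: A at the origin facing +x, B two metres to A's left facing +y.
print(relative_pose((0.0, 0.0, 0.0), (0.0, 2.0, np.pi / 2)))   # -> (0.0, 2.0, 1.57...)
```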
-
Patent number: 10460511
Abstract: There is provided a method for creating a voxel occupancy model. The voxel occupancy model is representative of a region of space which can be described using a three-dimensional voxel array. The region of space contains at least part of an object. The method comprises receiving first image data, the first image data being representative of a first view of the at least part of an object and comprising first image location data, and receiving second image data, the second image data being representative of a second view of the at least part of an object and comprising second image location data. The method also comprises determining a first descriptor, the first descriptor describing a property of a projection of a first voxel of the voxel array in the first image data, and determining a second descriptor, the second descriptor describing a property of a projection of the first voxel in the second image data.
Type: Grant
Filed: September 23, 2016
Date of Patent: October 29, 2019
Assignee: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20190323852
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to substantially continuously tracking and estimating the future movements of an observed object. As an example, an observed object may be a moving vehicle, for example along a path or road. Aspects and/or embodiments seek to provide an end-to-end method and system for substantially continuously tracking and predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Application
Filed: July 1, 2019
Publication date: October 24, 2019
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20190322275
Abstract: The present invention relates to a method and system for accurately predicting future trajectories of observed objects in dense and ever-changing city environments. More particularly, the present invention relates to the use of prior trajectories extracted from mapping data to estimate the future movement of an observed object. As an example, an observed object may be a moving vehicle. Aspects and/or embodiments seek to provide a method and system for predicting future movements of a newly observed object, such as a vehicle, using motion prior data extracted from map data.
Type: Application
Filed: July 1, 2019
Publication date: October 24, 2019
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20190325602
Abstract: The present invention relates to a method and system for automatic localisation of static objects in an urban environment. More particularly, the present invention relates to the use of noisy 2-Dimensional (2D) image data to identify and determine 3-Dimensional (3D) positions of objects in large scale urban or city environments. Aspects and/or embodiments seek to provide a method, system, and vehicle for automatically locating static 3D objects in urban environments by using a voting-based triangulation technique. Aspects and/or embodiments also provide a method for updating map data after automatically locating new 3D static objects in an environment.
Type: Application
Filed: July 1, 2019
Publication date: October 24, 2019
Applicant: BLUE VISION LABS UK LIMITED
Inventors: Peter Ondruska, Lukas Platinsky
-
Publication number: 20180357301
Abstract: A search system generates one or more groups of search results from a plurality of search results based at least in part on a calculated similarity between search results of the plurality of search results. The search system communicates instructions to a user computing device causing the user computing device to render a graphical user interface, the graphical user interface comprising one or more group results of the generated one or more groups of search results, wherein each of the one or more group results comprises a timeline comprising respective indicators for each result in the group result along the timeline. The search system receives, from the user computing device, an input via the graphical user interface selecting a particular indicator corresponding to a particular search result along a particular timeline, wherein the graphical user interface is configured to display details of the particular search result.
Type: Application
Filed: February 13, 2018
Publication date: December 13, 2018
Inventors: Ralf Metzger, Jiri Semecky, Peter Werner Balsiger, Piotr Maciej Buczek, Ally Gale, Lukas Platinsky
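The interface described above needs, per group, one indicator per result positioned along a timeline. Purely to illustrate that data shape (the abstract does not disclose the actual representation), the sketch below turns already-grouped results into timeline indicators with fractional positions derived from result dates; the field names are assumptions.

```python
# Illustrative only: building timeline-indicator data for already-grouped results.
from datetime import date

def build_timeline_groups(grouped_results):
    """For each group of results (dicts with illustrative 'id' and 'date' keys),
    produce one indicator per result, positioned along the group's timeline."""
    timelines = []
    for group in grouped_results:
        dates = [r["date"] for r in group]
        start, end = min(dates), max(dates)
        span = max((end - start).days, 1)
        indicators = [
            {"result_id": r["id"],
             # fractional position of the indicator along the group's timeline
             "position": (r["date"] - start).days / span}
            for r in sorted(group, key=lambda r: r["date"])
        ]
        timelines.append({"start": start, "end": end, "indicators": indicators})
    return timelines

# Toy usage: one group of three results spread over ten days.
group = [{"id": "a", "date": date(2018, 5, 1)},
         {"id": "b", "date": date(2018, 5, 6)},
         {"id": "c", "date": date(2018, 5, 11)}]
print(build_timeline_groups([group]))
```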
-
Publication number: 20160125320
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for information retrieval. In one aspect, a method includes receiving a flight search query, the flight search query identifying a plurality of flight parameters including a departure location, a destination location, and at least one date; obtaining a plurality of flight search results that satisfy the flight search query; grouping the plurality of flight search results into groups based at least in part on calculated similarity between flight search results of the plurality of flight search results, wherein the similarity is based on a combination of a plurality of features of the respective flight itineraries of the flight search results; and providing the plurality of flight search results for display in a flight search results interface, including providing at least one group result of the generated groups of flight search results.
Type: Application
Filed: October 31, 2014
Publication date: May 5, 2016
Inventors: Ralf Metzger, Jiri Semecky, Peter Werner Balsiger, Piotr Maciej Buczek, Ally Gale, Lukas Platinsky
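The abstract specifies similarity as a combination of several itinerary features but not which features or weights. The sketch below therefore uses invented features, weights, and a simple greedy threshold grouping, purely to illustrate the grouping step.

```python
# Illustrative only: invented itinerary features, weights, and grouping rule.
def itinerary_similarity(a, b):
    """Similarity between two flight results as a weighted combination of
    itinerary features (carrier, stops, departure hour, price band)."""
    score = 0.0
    score += 0.4 * (a["carrier"] == b["carrier"])
    score += 0.3 * (a["stops"] == b["stops"])
    score += 0.2 * (abs(a["departure_hour"] - b["departure_hour"]) <= 2)
    score += 0.1 * (abs(a["price"] - b["price"]) <= 50)
    return score

def group_results(results, threshold=0.6):
    """Greedy grouping: a result joins the first group whose representative it resembles."""
    groups = []
    for r in results:
        for g in groups:
            if itinerary_similarity(g[0], r) >= threshold:
                g.append(r)
                break
        else:
            groups.append([r])
    return groups

# Toy usage: two similar morning itineraries on the same carrier plus one outlier.
results = [
    {"carrier": "XX", "stops": 0, "departure_hour": 9,  "price": 120},
    {"carrier": "XX", "stops": 0, "departure_hour": 10, "price": 135},
    {"carrier": "YY", "stops": 1, "departure_hour": 22, "price": 300},
]
print([len(g) for g in group_results(results)])   # -> [2, 1]
```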