Patents by Inventor Simon Lynen
Simon Lynen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230360242
Abstract: An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose.
Type: Application
Filed: July 20, 2023
Publication date: November 9, 2023
Inventors: Esha Nerurkar, Simon Lynen, Sheng Zhao
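The loop-closure step this abstract describes (matching stored feature descriptors to observed ones, then minimizing the discrepancy between the matched features to recover a localized pose) can be sketched in miniature. The 2-D point alignment below, and every name in it, is an illustrative assumption rather than the patented method; a real system would align 6-DoF poses over 3-D landmarks.

```python
import math

def localize_pose(stored_pts, observed_pts):
    """Best-fit 2-D rotation and translation mapping observed points onto
    their stored (map) counterparts, minimizing the summed squared
    discrepancy: a toy stand-in for the loop-closure correction."""
    n = len(stored_pts)
    msx = sum(p[0] for p in stored_pts) / n
    msy = sum(p[1] for p in stored_pts) / n
    mox = sum(p[0] for p in observed_pts) / n
    moy = sum(p[1] for p in observed_pts) / n
    s_dot = s_cross = 0.0
    for (sx, sy), (ox, oy) in zip(stored_pts, observed_pts):
        ax, ay = ox - mox, oy - moy        # centred observed point
        bx, by = sx - msx, sy - msy        # centred stored point
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)     # closed-form optimal rotation
    c, s = math.cos(theta), math.sin(theta)
    return theta, (msx - (c * mox - s * moy), msy - (s * mox + c * moy))

# Drifted observations: the stored landmarks, rotated 90 degrees and shifted.
stored = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
observed = [(3.0, 1.0), (3.0, 2.0), (1.0, 1.0)]
theta, (tx, ty) = localize_pose(stored, observed)
c, s = math.cos(theta), math.sin(theta)
corrected = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in observed]
print(corrected)  # realigns with the stored landmarks
```

The closed-form rotation comes from maximizing the dot product between centred point sets, which is why only two accumulated sums are needed.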
-
Patent number: 11734846
Abstract: An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose.
Type: Grant
Filed: May 15, 2020
Date of Patent: August 22, 2023
Assignee: GOOGLE LLC
Inventors: Esha Nerurkar, Simon Lynen, Sheng Zhao
-
Patent number: 10937214
Abstract: An electronic device merges a plurality of maps, or area description files (ADFs), by representing relationships among ADFs in an undirected graph, with vertices representing maps and edges representing transformations between maps. As the electronic device generates new ADFs, the electronic device merges each new ADF into a stored collection of ADFs by adding each new ADF as a vertex and the transformations between the new ADF and the collection of ADFs as edges in the undirected graph. In this way, the map merger can use the undirected graph to more accurately represent the relations between any two maps, allowing more efficient merger of new maps into a previously stored collection of maps, and allowing for the development of more flexible and efficient algorithms for manipulating the merged maps.
Type: Grant
Filed: March 22, 2017
Date of Patent: March 2, 2021
Assignee: GOOGLE LLC
Inventors: Esha Nerurkar, Simon Lynen, Dongfang Zheng
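The undirected-graph bookkeeping described in this abstract (ADFs as vertices, relative transforms as edges, composed along a path to relate any two maps) can be sketched as follows. The transform is simplified to a 1-D offset so composition is plain addition; all names are illustrative, not from the patent.

```python
from collections import deque

class MapGraph:
    def __init__(self):
        self.edges = {}  # vertex -> {neighbour: transform from vertex to neighbour}

    def add_adf(self, adf_id, links=()):
        """Merge a new ADF: add it as a vertex plus transform edges."""
        self.edges.setdefault(adf_id, {})
        for other, transform in links:
            self.edges[adf_id][other] = transform
            # Undirected graph: store the inverse transform on the other side.
            self.edges.setdefault(other, {})[adf_id] = -transform

    def relate(self, src, dst):
        """Compose transforms along a BFS path to relate any two maps."""
        seen, queue = {src: 0.0}, deque([src])
        while queue:
            v = queue.popleft()
            if v == dst:
                return seen[v]
            for nb, t in self.edges[v].items():
                if nb not in seen:
                    seen[nb] = seen[v] + t
                    queue.append(nb)
        return None  # the two maps are not connected

g = MapGraph()
g.add_adf("A")
g.add_adf("B", links=[("A", 2.0)])   # B -> A offset is +2
g.add_adf("C", links=[("B", 1.5)])   # C -> B offset is +1.5
print(g.relate("C", "A"))            # composed along C -> B -> A: 3.5
```

Storing the inverse on the reverse edge is what makes the graph usable in either direction, which is the flexibility the abstract claims over a one-way merge chain.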
-
Patent number: 10937249
Abstract: The present disclosure provides systems and methods that allow a first user to anchor a virtual object to a physical location and a second user to view the virtual object at the physical location. In particular, aspects of the present disclosure are directed to workflows, application programming interfaces (“APIs”), and computing system infrastructure that enable a first computing device to generate an ad hoc mapping of its physical surroundings (e.g., based on sensor data collected by the first device) and to position and anchor a virtual object at a physical location within such mapping. Data descriptive of the mapping and virtual object can be stored in a cloud database and can then be used by or for a second device to render the virtual object at the location of the anchor.
Type: Grant
Filed: May 7, 2019
Date of Patent: March 2, 2021
Assignee: Google LLC
Inventors: Simon Lynen, Eric Thomas Andresen
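The host-then-resolve workflow in this abstract can be illustrated with a toy in-memory store standing in for the cloud database. Every name and the flat 2-D pose arithmetic here are assumptions made for the example, not the patented API.

```python
cloud_db = {}  # stands in for the cloud database of anchors

def host_anchor(anchor_id, mapping_id, pose_in_map):
    """First device: store the anchor's pose relative to its ad hoc mapping."""
    cloud_db[anchor_id] = {"map": mapping_id, "pose": pose_in_map}

def resolve_anchor(anchor_id, map_to_device):
    """Second device: look up the anchor and express it in the device frame.
    `map_to_device` is the offset of the shared mapping in this device's
    frame; a full system would estimate it by localizing against the mapping."""
    record = cloud_db[anchor_id]
    x, y = record["pose"]
    dx, dy = map_to_device
    return (x + dx, y + dy)

# Device one anchors a virtual object; device two renders it at that anchor.
host_anchor("anchor-42", "map-1", pose_in_map=(2.0, 0.5))
print(resolve_anchor("anchor-42", map_to_device=(1.0, -0.5)))  # (3.0, 0.0)
```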
-
Patent number: 10802147
Abstract: An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose.
Type: Grant
Filed: May 15, 2017
Date of Patent: October 13, 2020
Assignee: GOOGLE LLC
Inventors: Esha Nerurkar, Simon Lynen, Sheng Zhao
-
Publication number: 20200278449
Abstract: An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose.
Type: Application
Filed: May 15, 2020
Publication date: September 3, 2020
Inventors: Esha Nerurkar, Simon Lynen, Sheng Zhao
-
Publication number: 20190340836
Abstract: The present disclosure provides systems and methods that allow a first user to anchor a virtual object to a physical location and a second user to view the virtual object at the physical location. In particular, aspects of the present disclosure are directed to workflows, application programming interfaces (“APIs”), and computing system infrastructure that enable a first computing device to generate an ad hoc mapping of its physical surroundings (e.g., based on sensor data collected by the first device) and to position and anchor a virtual object at a physical location within such mapping. Data descriptive of the mapping and virtual object can be stored in a cloud database and can then be used by or for a second device to render the virtual object at the location of the anchor.
Type: Application
Filed: May 7, 2019
Publication date: November 7, 2019
Inventors: Simon Lynen, Eric Thomas Andresen
-
Patent number: 10339708
Abstract: An electronic device generates a summary map of a scene based on data representative of objects having a high utility for identifying the scene when estimating a current pose of the electronic device and localizes the estimated current pose with respect to the summary map. The electronic device identifies scenes based on groups of objects appearing together in consistent configurations over time, and identifies utility weights for objects appearing in scenes, wherein the utility weights indicate a predicted likelihood that the corresponding object will be persistently identifiable by the electronic device in the environment over time and are based at least in part on verification by one or more mobile devices. The electronic device generates a summary map of each scene based on data representative of objects having utility weights above a threshold.
Type: Grant
Filed: July 25, 2017
Date of Patent: July 2, 2019
Inventors: Simon Lynen, Bernhard Zeisl
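The thresholding step this abstract describes (score each object by how persistently mobile devices re-verify it, then keep only objects whose utility weight clears a threshold) can be sketched as follows. The scoring rule, the threshold value, and the object names are illustrative assumptions.

```python
def utility_weight(verified_visits, total_visits):
    """Assumed utility: fraction of device visits that re-verified the object."""
    return verified_visits / total_visits if total_visits else 0.0

def summarize_scene(objects, threshold=0.5):
    """objects: {name: (verified_visits, total_visits)} -> summary map of
    utility weights, keeping only objects above the threshold."""
    return {
        name: utility_weight(v, t)
        for name, (v, t) in objects.items()
        if utility_weight(v, t) > threshold
    }

scene = {
    "doorframe":  (9, 10),  # re-verified on almost every visit: persistent
    "poster":     (6, 10),
    "coffee_cup": (1, 10),  # transient clutter: low utility, dropped
}
print(summarize_scene(scene))  # {'doorframe': 0.9, 'poster': 0.6}
```

Dropping low-utility objects is what keeps the summary map small while preserving the objects most useful for recognizing the scene again.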
-
Publication number: 20180276863
Abstract: An electronic device merges a plurality of maps, or area description files (ADFs), by representing relationships among ADFs in an undirected graph, with vertices representing maps and edges representing transformations between maps. As the electronic device generates new ADFs, the electronic device merges each new ADF into a stored collection of ADFs by adding each new ADF as a vertex and the transformations between the new ADF and the collection of ADFs as edges in the undirected graph. In this way, the map merger can use the undirected graph to more accurately represent the relations between any two maps, allowing more efficient merger of new maps into a previously stored collection of maps, and allowing for the development of more flexible and efficient algorithms for manipulating the merged maps.
Type: Application
Filed: March 22, 2017
Publication date: September 27, 2018
Inventors: Esha Nerurkar, Simon Lynen, Dongfang Zheng
-
Publication number: 20180122136
Abstract: An electronic device generates a summary map of a scene based on data representative of objects having a high utility for identifying the scene when estimating a current pose of the electronic device and localizes the estimated current pose with respect to the summary map. The electronic device identifies scenes based on groups of objects appearing together in consistent configurations over time, and identifies utility weights for objects appearing in scenes, wherein the utility weights indicate a predicted likelihood that the corresponding object will be persistently identifiable by the electronic device in the environment over time and are based at least in part on verification by one or more mobile devices. The electronic device generates a summary map of each scene based on data representative of objects having utility weights above a threshold.
Type: Application
Filed: July 25, 2017
Publication date: May 3, 2018
Inventors: Simon Lynen, Bernhard Zeisl
-
Patent number: 9940542
Abstract: An electronic device reduces localization data based on feature characteristics identified from the data. Based on the feature characteristics, a quality value can be assigned to each identified feature, indicating the likelihood that the data associated with the feature will be useful in mapping a local environment of the electronic device. The localization data is reduced by removing data associated with features having a low quality value, and the reduced localization data is used to map the local environment of the device by locating features identified from the reduced localization data in a frame of reference for the electronic device.
Type: Grant
Filed: August 11, 2015
Date of Patent: April 10, 2018
Assignee: Google LLC
Inventors: Esha Nerurkar, Joel Hesch, Simon Lynen
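The reduction step this abstract describes can be sketched in a few lines: assign each feature a quality value and discard features that fall below a minimum. The quality heuristic (track length plus detector response) and all field names are assumptions for the example, not the patented scoring.

```python
def quality(feature):
    """Illustrative quality value: long, strongly detected feature tracks are
    assumed more likely to be useful for mapping the local environment."""
    return 0.7 * min(feature["track_len"] / 10.0, 1.0) + 0.3 * feature["response"]

def reduce_localization_data(features, min_quality=0.5):
    """Drop features whose quality value is below the minimum."""
    return [f for f in features if quality(f) >= min_quality]

features = [
    {"id": 1, "track_len": 12, "response": 0.9},  # stable corner, kept
    {"id": 2, "track_len": 2,  "response": 0.4},  # flickering feature, dropped
    {"id": 3, "track_len": 8,  "response": 0.8},  # kept
]
kept = reduce_localization_data(features)
print([f["id"] for f in kept])  # [1, 3]
```

Only the surviving features are then located in the device's frame of reference, which is where the storage and compute savings come from.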
-
Publication number: 20170336511
Abstract: An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose.
Type: Application
Filed: May 15, 2017
Publication date: November 23, 2017
Inventors: Esha Nerurkar, Simon Lynen, Sheng Zhao
-
Patent number: 9811734
Abstract: A computing system includes a network interface, a first datastore, a second datastore, and a merge module. The merge module is to receive a set of one or more area description files from a set of one or more first mobile devices. Each area description file represents a point cloud of spatial features detected by a corresponding first mobile device at an area. The computing system further includes a localization module and a query module. The localization module is to generate a localization area description file for the area from the set of one or more area description files and to store the localization area description file in the second datastore. The localization area description file represents a point cloud of spatial features for the area. The query module is to provide the localization area description file to a second mobile device via the network interface.
Type: Grant
Filed: May 11, 2015
Date of Patent: November 7, 2017
Assignee: Google Inc.
Inventors: Brian Patrick Williams, Ryan Michael Hickman, Laurent Tu, Esha Nerurkar, Simon Lynen
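The server-side flow in this abstract has two halves: merge the point clouds uploaded by several first devices into one localization area description file, then serve that file to a second device on request. The data layout and function names below are assumptions made for the sketch.

```python
def merge_adfs(device_adfs):
    """Union the spatial-feature point clouds from several devices into a
    single localization ADF, de-duplicating points shared between uploads."""
    merged = set()
    for adf in device_adfs:
        merged.update(adf["points"])
    return {"area": device_adfs[0]["area"], "points": sorted(merged)}

def query_localization_adf(store, area):
    """Serve the stored localization ADF for an area to a requesting device."""
    return store.get(area)

# Two first devices mapped overlapping parts of the same area.
adf_a = {"area": "lobby", "points": {(0, 0), (1, 2), (3, 1)}}
adf_b = {"area": "lobby", "points": {(1, 2), (4, 4)}}  # (1, 2) overlaps adf_a
store = {"lobby": merge_adfs([adf_a, adf_b])}
print(query_localization_adf(store, "lobby")["points"])
# [(0, 0), (1, 2), (3, 1), (4, 4)]
```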
-
Publication number: 20170046594
Abstract: An electronic device reduces localization data based on feature characteristics identified from the data. Based on the feature characteristics, a quality value can be assigned to each identified feature, indicating the likelihood that the data associated with the feature will be useful in mapping a local environment of the electronic device. The localization data is reduced by removing data associated with features having a low quality value, and the reduced localization data is used to map the local environment of the device by locating features identified from the reduced localization data in a frame of reference for the electronic device.
Type: Application
Filed: August 11, 2015
Publication date: February 16, 2017
Inventors: Esha Nerurkar, Joel Hesch, Simon Lynen
-
Publication number: 20160335497
Abstract: A computing system includes a network interface, a first datastore, a second datastore, and a merge module. The merge module is to receive a set of one or more area description files from a set of one or more first mobile devices. Each area description file represents a point cloud of spatial features detected by a corresponding first mobile device at an area. The computing system further includes a localization module and a query module. The localization module is to generate a localization area description file for the area from the set of one or more area description files and to store the localization area description file in the second datastore. The localization area description file represents a point cloud of spatial features for the area. The query module is to provide the localization area description file to a second mobile device via the network interface.
Type: Application
Filed: May 11, 2015
Publication date: November 17, 2016
Inventors: Brian Patrick Williams, Ryan Michael Hickman, Laurent Tu, Esha Nerurkar, Simon Lynen
-
Publication number: 20160335275
Abstract: A computing system includes a datastore, a network interface, and a query module. The datastore stores a plurality of localization area description files. The network interface is to receive a request for a localization area description file from a mobile device, the request comprising a set of spatial features and at least one non-image location indicator. The query module includes a query interface to identify one or more candidate localization area description files based on one of the set of spatial features of the request and the at least one location indicator of the request, and includes a selection module to select a localization area description file from the candidate localization area description files based on the other of the set of spatial features of the request and the at least one location indicator. The query module is to provide the selected localization area description file to the mobile device.
Type: Application
Filed: May 11, 2015
Publication date: November 17, 2016
Inventors: Brian Patrick Williams, Laurent Tu, Esha Nerurkar, Simon Lynen
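The two-stage selection this abstract describes (narrow candidates with one criterion, then pick the winner with the other) can be sketched with the location indicator used first. The Wi-Fi network name as the non-image indicator, and all field names, are assumed for the example; the patent allows the two criteria in either order.

```python
def candidates_by_location(store, indicator):
    """Stage 1: filter ADFs by a non-image location indicator."""
    return [adf for adf in store if indicator in adf["indicators"]]

def select_adf(store, request_features, indicator):
    """Stage 2: among the candidates, pick the ADF whose stored spatial
    features overlap the request's features the most."""
    cands = candidates_by_location(store, indicator)
    if not cands:
        return None
    return max(cands, key=lambda adf: len(adf["features"] & request_features))

store = [
    {"name": "floor-1", "indicators": {"wifi:HQ-Guest"}, "features": {"f1", "f2", "f9"}},
    {"name": "floor-2", "indicators": {"wifi:HQ-Guest"}, "features": {"f1", "f5", "f6"}},
    {"name": "cafe",    "indicators": {"wifi:Cafe"},     "features": {"f1", "f2"}},
]
best = select_adf(store, request_features={"f1", "f2", "f3"}, indicator="wifi:HQ-Guest")
print(best["name"])  # floor-1: passes the indicator filter and overlaps most
```

Filtering on the cheap indicator first keeps the expensive feature comparison confined to a handful of candidates, which is the point of the two-stage design.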
-
Patent number: 9303999
Abstract: Methods and systems for determining estimation of motion of a device are provided. An example method includes receiving data from an inertial measurement unit (IMU) of a device and receiving images from a camera of the device for a sliding time window. The method also includes determining an IMU estimation of motion of the device based on the data from the IMU, and a camera estimation of motion of the device based on feature tracking in the images. The method includes, based on the IMU estimation and the camera estimation having a difference more than a threshold amount, determining one or more of a position or a velocity of the device for the sliding time window, and determining an overall estimation of motion of the device as supported by the data from the IMU and the position or velocity of the device.
Type: Grant
Filed: December 30, 2013
Date of Patent: April 5, 2016
Assignee: Google Technology Holdings LLC
Inventors: Joel Hesch, Johnny Lee, Simon Lynen
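The consistency check this abstract describes can be reduced to a sketch: integrate the IMU data over the sliding window, compare against the camera's displacement estimate, and re-estimate when they disagree by more than a threshold. The 1-D kinematics and the averaging fallback are illustrative simplifications, not the patented estimator.

```python
def imu_displacement(accels, dt, v0=0.0):
    """Integrate acceleration samples over the window (constant timestep)."""
    disp, v = 0.0, v0
    for a in accels:
        disp += v * dt + 0.5 * a * dt * dt
        v += a * dt
    return disp

def fuse_motion(accels, dt, camera_disp, threshold=0.5):
    """Compare the IMU and camera displacement estimates for the window;
    fall back to a combined re-estimate when they diverge too far."""
    imu_disp = imu_displacement(accels, dt)
    if abs(imu_disp - camera_disp) > threshold:
        # Estimates diverge: re-solve using both sources (a plain average here).
        return 0.5 * (imu_disp + camera_disp), True
    return imu_disp, False

accels = [1.0] * 10                      # 1 m/s^2 held for one second
est, diverged = fuse_motion(accels, dt=0.1, camera_disp=0.48)
print(round(est, 3), diverged)           # close agreement: IMU estimate kept
```

A production system would weight the two estimates by their uncertainties rather than average them, but the structure (estimate twice, compare, reconcile) is the same.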
-
Publication number: 20150193971
Abstract: Methods and systems for map generation are described. A computing device may receive outputs from a plurality of sensors at a position of the device in an environment, which may include data corresponding to visual features of the environment at that position. Based on correspondence in the outputs from the plurality of sensors, the computing device may generate a map of the environment comprising sparse mapping data, and the sparse mapping data comprises the data corresponding to the visual features. The device may receive additional outputs at other positions of the device in the environment and may modify the map based on the additional outputs. In addition, the device may modify the map based on receiving dense mapping information from sensors, which may include data corresponding to objects in the environment in a manner that represents the structure of those objects in the environment.
Type: Application
Filed: January 3, 2014
Publication date: July 9, 2015
Applicant: Motorola Mobility LLC
Inventors: Ivan Dryanovski, Simon Lynen, Joel Hesch
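The mapping flow in this abstract (keep sparse visual features where sensor outputs agree, grow the map as the device moves, and later enrich it with dense object structure) can be sketched as follows. The agreement test and the data layout are assumptions for the example, not the patented design.

```python
class EnvironmentMap:
    def __init__(self):
        self.sparse = {}   # position -> set of visual feature ids
        self.dense = {}    # object name -> structure (e.g. a point list)

    def add_observation(self, position, features_a, features_b):
        """Keep only features both sensors corroborate (sparse mapping data)."""
        agreed = set(features_a) & set(features_b)
        if agreed:
            self.sparse.setdefault(position, set()).update(agreed)

    def add_dense_object(self, name, structure):
        """Later pass: attach dense structure describing an object's shape."""
        self.dense[name] = structure

m = EnvironmentMap()
m.add_observation((0, 0), ["door", "lamp", "glare"], ["door", "lamp"])
m.add_observation((1, 0), ["lamp", "window"], ["window"])      # map grows
m.add_dense_object("table", [(0.5, 0.2, 0.0), (0.5, 0.2, 0.7)])
print(sorted(m.sparse[(0, 0)]), sorted(m.sparse[(1, 0)]))
# ['door', 'lamp'] ['window']
```

Requiring cross-sensor agreement is what keeps the sparse layer small and reliable; the dense layer is added only where object structure is actually needed.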
-
Publication number: 20150185018
Abstract: Methods and systems for determining estimation of motion of a device are provided. An example method includes receiving data from an inertial measurement unit (IMU) of a device and receiving images from a camera of the device for a sliding time window. The method also includes determining an IMU estimation of motion of the device based on the data from the IMU, and a camera estimation of motion of the device based on feature tracking in the images. The method includes, based on the IMU estimation and the camera estimation having a difference more than a threshold amount, determining one or more of a position or a velocity of the device for the sliding time window, and determining an overall estimation of motion of the device as supported by the data from the IMU and the position or velocity of the device.
Type: Application
Filed: December 30, 2013
Publication date: July 2, 2015
Applicant: Motorola Mobility LLC
Inventors: Joel Hesch, Johnny Lee, Simon Lynen