Patents by Inventor Greg Coombe
Greg Coombe has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10527417
Abstract: A high-definition map system receives sensor data from vehicles travelling along routes and combines the data to generate a high-definition map for use in driving vehicles, for example, for guiding autonomous vehicles. A pose graph is built from the collected data, each pose representing the location and orientation of a vehicle. The pose graph is optimized to minimize the error in the constraints between poses. Points associated with a surface are assigned a confidence measure determined using a measure of the hardness/softness of the surface. A machine-learning-based result filter detects bad alignment results and prevents them from entering the subsequent global pose optimization. The alignment framework is parallelizable for execution on a parallel/distributed architecture. Alignment hot spots are detected for further verification and improvement. The system supports incremental updates, allowing refinements of subgraphs to incrementally improve the high-definition map and keep it up to date.
Type: Grant
Filed: December 28, 2017
Date of Patent: January 7, 2020
Assignee: DeepMap, Inc.
Inventors: Chen Chen, Greg Coombe
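The pose-graph optimization this abstract describes can be illustrated with a minimal sketch. This toy version assumes translation-only 2-D poses, synthetic edge measurements, and plain gradient descent; the patent's actual formulation (full vehicle poses with orientation, confidence weighting, distributed execution) is not reproduced here, and all names and numbers are illustrative:

```python
import numpy as np

def optimize_pose_graph(num_poses, constraints, iters=500, lr=0.1):
    """Minimize the squared error of relative-pose constraints.

    constraints: list of (i, j, delta) where delta is the measured
    2-D offset from pose i to pose j. Pose 0 is held fixed as the
    anchor (removing the gauge freedom).
    """
    poses = np.zeros((num_poses, 2))
    for _ in range(iters):
        grad = np.zeros_like(poses)
        for i, j, delta in constraints:
            r = poses[j] - poses[i] - delta   # residual of this edge
            grad[j] += 2 * r
            grad[i] -= 2 * r
        grad[0] = 0.0                         # keep the anchor fixed
        poses -= lr * grad
    return poses

# Odometry says each hop is 1 m east, but a loop-closure edge
# (0 -> 3) measured 2.7 m, so the optimizer spreads the error.
constraints = [
    (0, 1, np.array([1.0, 0.0])),
    (1, 2, np.array([1.0, 0.0])),
    (2, 3, np.array([1.0, 0.0])),
    (0, 3, np.array([2.7, 0.0])),
]
poses = optimize_pose_graph(4, constraints)
```

The loop-closure measurement disagrees with the chained odometry, and the least-squares solution distributes that disagreement across all three hops rather than dumping it on one edge.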
-
Publication number: 20190250283
Abstract: Systems, computer-implemented methods, apparatus and/or computer program products are provided that facilitate improving the accuracy of global positioning system (GPS) coordinates of indoor photos. The disclosed subject matter further provides systems, computer-implemented methods, apparatus and/or computer program products that facilitate generating exterior photos of structures based on the GPS coordinates of indoor photos.
Type: Application
Filed: February 8, 2019
Publication date: August 15, 2019
Inventors: Dorra Larnaout, Greg Coombe
-
Publication number: 20190219700
Abstract: A system aligns point clouds obtained by sensors of a vehicle using kinematic iterative closest point with integrated motion estimates. The system receives lidar scans from a lidar mounted on the vehicle and derives point clouds from the scan data. The system iteratively determines velocity parameters that minimize an aggregate measure of distance between corresponding points in pairs of point clouds, improving the velocity parameters with each iteration. The system uses the velocity parameters for various purposes, including building the high-definition maps used for navigating the vehicle.
Type: Application
Filed: November 16, 2018
Publication date: July 18, 2019
Inventors: Greg Coombe, Chen Chen, Derik Schroeter, Jeffrey Minoru Adachi, Mark Damon Wheeler
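The velocity-estimating alignment loop can be sketched in a toy form. This assumes translation-only constant velocity, brute-force nearest-neighbour matching, and synthetic data; the function and parameter names are illustrative, not from the patent, which handles full rigid-body motion:

```python
import numpy as np

def kinematic_icp_velocity(source, times, target, iters=15):
    """Estimate a constant translational velocity that de-skews a
    moving-sensor point cloud onto a reference cloud.

    source: (N, 2) raw points, each captured at times[n] within one sweep.
    target: (M, 2) reference points.
    """
    v = np.zeros(2)
    for _ in range(iters):
        corrected = source + times[:, None] * v
        # nearest-neighbour correspondences (brute force)
        d = np.linalg.norm(corrected[:, None, :] - target[None, :, :], axis=2)
        q = target[np.argmin(d, axis=1)]
        # closed-form least squares: min_v sum ||q - p - v*t||^2
        num = (times[:, None] * (q - source)).sum(axis=0)
        den = (times ** 2).sum()
        v = num / den
    return v

# Reference cloud, plus a sweep of it recorded while moving at (2, 0) m/s.
rng = np.random.default_rng(0)
target = rng.uniform(0, 10, size=(50, 2))
times = np.linspace(0.0, 0.1, 50)          # timestamps within one sweep
true_v = np.array([2.0, 0.0])
source = target - times[:, None] * true_v  # raw (motion-skewed) points
v_est = kinematic_icp_velocity(source, times, target)
```

Each iteration refreshes the correspondences using the current velocity estimate and then re-solves for the velocity, mirroring the abstract's "iteratively improves the velocity parameters" loop.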
-
Patent number: 10267634
Abstract: A high-definition map system receives sensor data from vehicles travelling along routes and combines the data to generate a high-definition map for use in driving vehicles, for example, for guiding autonomous vehicles. A pose graph is built from the collected data, each pose representing the location and orientation of a vehicle. The pose graph is optimized to minimize the error in the constraints between poses. Points associated with a surface are assigned a confidence measure determined using a measure of the hardness/softness of the surface. A machine-learning-based result filter detects bad alignment results and prevents them from entering the subsequent global pose optimization. The alignment framework is parallelizable for execution on a parallel/distributed architecture. Alignment hot spots are detected for further verification and improvement. The system supports incremental updates, allowing refinements of subgraphs to incrementally improve the high-definition map and keep it up to date.
Type: Grant
Filed: December 28, 2017
Date of Patent: April 23, 2019
Assignee: DeepMap Inc.
Inventors: Chen Chen, Greg Coombe, Derik Schroeter
-
Patent number: 10140765
Abstract: A staged camera traversal for navigating a virtual camera in a three dimensional environment is provided. The staged camera traversal can include a launch stage and an approach stage. During the launch stage, the tilt angle can be decreased towards zero tilt (i.e., straight down) with respect to the vertical. During an approach stage, the tilt angle of the virtual camera can be increased from about zero tilt towards the tilt angle associated with a target location. In certain implementations, the staged camera traversal can further include a traversal stage occurring between the launch stage and the approach stage. The tilt angle of the virtual camera can be maintained at about zero tilt during the traversal stage. The approach path of the virtual camera can be aligned along a view direction associated with the target destination during the approach stage.
Type: Grant
Filed: February 25, 2013
Date of Patent: November 27, 2018
Assignee: Google LLC
Inventors: Greg Coombe, Francois Bailly
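The launch/traversal/approach tilt schedule described here can be sketched as a simple piecewise function. The linear easing and the stage boundaries (25% and 75% of the path) are assumptions for illustration; the patent does not fix them:

```python
def staged_tilt(t, start_tilt, target_tilt,
                launch_end=0.25, approach_start=0.75):
    """Tilt angle (degrees from vertical) along a three-stage traversal.

    t in [0, 1] is normalized progress along the camera path.
    Launch:    tilt eases from start_tilt down to 0 (straight down).
    Traversal: tilt is held at 0.
    Approach:  tilt eases from 0 up to target_tilt.
    """
    if t < launch_end:                       # launch stage
        frac = t / launch_end
        return start_tilt * (1.0 - frac)
    if t < approach_start:                   # traversal stage
        return 0.0
    frac = (t - approach_start) / (1.0 - approach_start)
    return target_tilt * frac                # approach stage

# Tilt profile sampled at 11 points along the path.
profile = [round(staged_tilt(t / 10, 45.0, 60.0), 1) for t in range(11)]
```

The profile descends from the starting tilt to straight-down, cruises at zero tilt, then rises to the tilt associated with the target view.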
-
Publication number: 20180188043
Abstract: A high-definition map system receives sensor data from vehicles travelling along routes and combines the data to generate a high-definition map for use in driving vehicles, for example, for guiding autonomous vehicles. A pose graph is built from the collected data, each pose representing the location and orientation of a vehicle. The pose graph is optimized to minimize the error in the constraints between poses. Points associated with a surface are assigned a confidence measure determined using a measure of the hardness/softness of the surface. A machine-learning-based result filter detects bad alignment results and prevents them from entering the subsequent global pose optimization. The alignment framework is parallelizable for execution on a parallel/distributed architecture. Alignment hot spots are detected for further verification and improvement.
Type: Application
Filed: December 28, 2017
Publication date: July 5, 2018
Inventors: Chen Chen, Greg Coombe
-
Publication number: 20180188040
Abstract: A high-definition map system receives sensor data from vehicles travelling along routes and combines the data to generate a high-definition map for use in driving vehicles, for example, for guiding autonomous vehicles. A pose graph is built from the collected data, each pose representing the location and orientation of a vehicle. The pose graph is optimized to minimize the error in the constraints between poses. Points associated with a surface are assigned a confidence measure determined using a measure of the hardness/softness of the surface. A machine-learning-based result filter detects bad alignment results and prevents them from entering the subsequent global pose optimization. The alignment framework is parallelizable for execution on a parallel/distributed architecture. Alignment hot spots are detected for further verification and improvement.
Type: Application
Filed: December 28, 2017
Publication date: July 5, 2018
Inventors: Chen Chen, Greg Coombe, Derik Schroeter
-
Patent number: 9092900
Abstract: Embodiments alter the swoop trajectory depending on the terrain within the view of the virtual camera. To swoop into a target, a virtual camera may be positioned at an angle relative to the upward normal vector from the target. That angle may be referred to as a tilt angle. According to embodiments, the tilt angle may increase more quickly in areas of high terrain variance (e.g., mountains or cities with tall buildings) than in areas with less terrain variance (e.g., flat plains). To determine the level of terrain variance in an area, embodiments may weigh terrain data having higher detail more heavily than terrain data having less detail.
Type: Grant
Filed: June 27, 2012
Date of Patent: July 28, 2015
Assignee: Google Inc.
Inventor: Greg Coombe
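One way to read the variance-weighted tilt rate is as a scalar schedule driven by a detail-weighted terrain statistic. This sketch is an assumption-heavy illustration: the `2 ** d` detail weighting, the `gain` mapping from roughness to tilt rate, and all names are invented for the example, not taken from the patent:

```python
def tilt_rate(terrain_samples, base_rate=1.0, gain=0.5):
    """Tilt-angle change per step for a swoop, scaled by terrain variance.

    terrain_samples: list of (elevation, detail_level) pairs. Samples at
    higher detail levels are weighted more heavily when estimating how
    rough the terrain is; rougher terrain -> faster tilt increase.
    """
    weights = [2 ** d for _, d in terrain_samples]   # favour high detail
    total = sum(weights)
    mean = sum(w * e for (e, _), w in zip(terrain_samples, weights)) / total
    var = sum(w * (e - mean) ** 2
              for (e, _), w in zip(terrain_samples, weights)) / total
    return base_rate * (1.0 + gain * var ** 0.5)     # std dev -> rate

flat = [(10.0, 3), (10.5, 3), (10.2, 2)]     # plains: near-constant elevation
rough = [(10.0, 3), (400.0, 3), (90.0, 2)]   # mountains: large variance
```

With these samples, `tilt_rate(rough)` exceeds `tilt_rate(flat)`, so the camera tilts up faster over mountainous terrain, as the abstract describes.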
-
Patent number: 9024947
Abstract: The capability to render and navigate three-dimensional panoramic images in a virtual three-dimensional environment so as to create an immersive three-dimensional experience is provided. Such a capability can present a three-dimensional photographic experience of the real world that is seamlessly integrated with the virtual three-dimensional environment. Depth values associated with the panoramic images may be used to create three-dimensional geometry, which can be rendered as part of the virtual three-dimensional environment. Further, such a capability can enable a user to roam freely through the environment while providing a more natural free-form exploration of the environment than existing systems.
Type: Grant
Filed: January 30, 2014
Date of Patent: May 5, 2015
Assignee: Google Inc.
Inventors: Greg Coombe, Daniel Barcay, Gokul Varadhan, Francois Bailly
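The step of turning per-pixel depth values into 3-D geometry can be sketched for the common equirectangular panorama layout. This is a minimal illustration under assumed conventions (latitude/longitude mapping, y-up axes); the patent does not specify a projection, and the function name is hypothetical:

```python
import math

def unproject_panorama(depths, width, height):
    """Lift an equirectangular depth panorama into 3-D points.

    depths[v][u] is the range (metres) along the ray through pixel
    (u, v). Longitude spans [-pi, pi), latitude [-pi/2, pi/2]; the
    resulting points can be meshed and rendered inside the virtual
    three-dimensional environment.
    """
    points = []
    for v in range(height):
        lat = math.pi * (0.5 - (v + 0.5) / height)       # +pi/2 at top row
        for u in range(width):
            lon = 2 * math.pi * ((u + 0.5) / width - 0.5)
            d = depths[v][u]
            x = d * math.cos(lat) * math.sin(lon)
            y = d * math.sin(lat)
            z = d * math.cos(lat) * math.cos(lon)
            points.append((x, y, z))
    return points

# A tiny 4x2 panorama where every pixel is 5 m away: the unprojected
# points all lie on a sphere of radius 5 around the camera.
pts = unproject_panorama([[5.0] * 4] * 2, width=4, height=2)
```

Once unprojected, the points (or a mesh built over them) live in the same coordinate frame as the rest of the virtual environment, which is what makes the seamless integration possible.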
-
Publication number: 20140240318
Abstract: A staged camera traversal for navigating a virtual camera in a three dimensional environment is provided. The staged camera traversal can include a launch stage and an approach stage. During the launch stage, the tilt angle can be decreased towards zero tilt (i.e., straight down) with respect to the vertical. During an approach stage, the tilt angle of the virtual camera can be increased from about zero tilt towards the tilt angle associated with a target location. In certain implementations, the staged camera traversal can further include a traversal stage occurring between the launch stage and the approach stage. The tilt angle of the virtual camera can be maintained at about zero tilt during the traversal stage. The approach path of the virtual camera can be aligned along a view direction associated with the target destination during the approach stage.
Type: Application
Filed: February 25, 2013
Publication date: August 28, 2014
Applicant: Google Inc.
Inventors: Greg Coombe, Francois Bailly
-
Publication number: 20140146046
Abstract: The capability to render and navigate three-dimensional panoramic images in a virtual three-dimensional environment so as to create an immersive three-dimensional experience is provided. Such a capability can present a three-dimensional photographic experience of the real world that is seamlessly integrated with the virtual three-dimensional environment. Depth values associated with the panoramic images may be used to create three-dimensional geometry, which can be rendered as part of the virtual three-dimensional environment. Further, such a capability can enable a user to roam freely through the environment while providing a more natural free-form exploration of the environment than existing systems.
Type: Application
Filed: January 30, 2014
Publication date: May 29, 2014
Applicant: Google Inc.
Inventors: Greg Coombe, Daniel Barcay, Gokul Varadhan, Francois Bailly
-
Publication number: 20140118495
Abstract: Embodiments alter the swoop trajectory depending on the terrain within the view of the virtual camera. To swoop into a target, a virtual camera may be positioned at an angle relative to the upward normal vector from the target. That angle may be referred to as a tilt angle. According to embodiments, the tilt angle may increase more quickly in areas of high terrain variance (e.g., mountains or cities with tall buildings) than in areas with less terrain variance (e.g., flat plains). To determine the level of terrain variance in an area, embodiments may weigh terrain data having higher detail more heavily than terrain data having less detail.
Type: Application
Filed: June 27, 2012
Publication date: May 1, 2014
Applicant: Google Inc.
Inventor: Greg Coombe
-
Patent number: 8681151
Abstract: The capability to render and navigate three-dimensional panoramic images in a virtual three-dimensional environment so as to create an immersive three-dimensional experience is provided. Such a capability can present a three-dimensional photographic experience of the real world that is seamlessly integrated with the virtual three-dimensional environment. Depth values associated with the panoramic images may be used to create three-dimensional geometry, which can be rendered as part of the virtual three-dimensional environment. Further, such a capability can enable a user to roam freely through the environment while providing a more natural free-form exploration of the environment than existing systems.
Type: Grant
Filed: November 22, 2011
Date of Patent: March 25, 2014
Assignee: Google Inc.
Inventors: Greg Coombe, Daniel Barcay, Gokul Varadhan, Francois Bailly
-
Publication number: 20120299920
Abstract: The capability to render and navigate three-dimensional panoramic images in a virtual three-dimensional environment so as to create an immersive three-dimensional experience is provided. Such a capability can present a three-dimensional photographic experience of the real world that is seamlessly integrated with the virtual three-dimensional environment. Depth values associated with the panoramic images may be used to create three-dimensional geometry, which can be rendered as part of the virtual three-dimensional environment. Further, such a capability can enable a user to roam freely through the environment while providing a more natural free-form exploration of the environment than existing systems.
Type: Application
Filed: November 22, 2011
Publication date: November 29, 2012
Applicant: Google Inc.
Inventors: Greg Coombe, Daniel Barcay, Gokul Varadhan, Francois Bailly