Intermediate waypoint generator
A method for generating intermediate waypoints for a navigation system of a robot includes receiving a navigation route. The navigation route includes a series of high-level waypoints that begin at a starting location, end at a destination location, and are based on high-level navigation data. The high-level navigation data is representative of locations of static obstacles in an area the robot is to navigate. The method also includes receiving image data of an environment about the robot from an image sensor and generating at least one intermediate waypoint based on the image data. The method also includes adding the at least one intermediate waypoint to the series of high-level waypoints of the navigation route and navigating the robot from the starting location along the series of high-level waypoints and the at least one intermediate waypoint toward the destination location.
This U.S. patent application is a continuation of, and claims priority under 35 U.S.C. § 120 from U.S. patent application Ser. No. 16/569,885, filed on Sep. 13, 2019, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 62/883,438, filed on Aug. 6, 2019. The disclosures of these prior applications are considered part of the disclosure of this application and are hereby incorporated by reference in their entireties.
TECHNICAL FIELD
This disclosure relates to generating intermediate waypoints in the presence of constraints, especially those imposed by terrain.
BACKGROUND
Robotic devices are increasingly being used to navigate constrained environments to perform a variety of tasks or functions. These robotic devices often need to navigate through these constrained environments without contacting obstacles or becoming stuck or trapped. As these robotic devices become more prevalent, there is a need for real-time navigation and route planning that avoids contact with obstacles while successfully navigating to the destination.
SUMMARY
One aspect of the disclosure provides a method for generating intermediate waypoints for a navigation system of a robot. The method includes receiving, at data processing hardware of a robot, a navigation route. The navigation route includes a series of high-level waypoints that begin at a starting location and end at a destination location and are based on high-level navigation data. The high-level navigation data is representative of locations of static obstacles in an area the robot is to navigate. The method also includes receiving, at the data processing hardware, image data of an environment about the robot from an image sensor. The method also includes generating, by the data processing hardware, at least one intermediate waypoint based on the image data. The method also includes adding, by the data processing hardware, the at least one intermediate waypoint to the series of high-level waypoints of the navigation route and navigating, by the data processing hardware, the robot from the starting location along the series of high-level waypoints and the at least one intermediate waypoint toward the destination location.
Implementations of the disclosure may include one or more of the following optional features. In some implementations, each high-level waypoint and each intermediate waypoint include two coordinates indicating a position on a plane, a yaw value, and a time value. The time value indicates an estimated amount of time for the robot to navigate to the respective waypoint. In some examples, the method further includes maintaining, by the data processing hardware, each of the series of high-level waypoints on the navigation route. The method may also include receiving, at the data processing hardware, a body obstacle map. The body obstacle map includes a map of obstacles that a body of the robot cannot traverse. Additionally, generating the at least one intermediate waypoint may include generating a sparse graph based on the body obstacle map. The sparse graph includes a list of nodes and edges. The nodes and edges are representative of paths the robot may travel in the environment. The method may also include planning a coarse path from a first node from the list of nodes to a second node from the list of nodes. The first node and the second node are each representative of a space in the environment. The method may also include determining a point along the coarse path where line of sight by the image sensor is lost and generating one of the at least one intermediate waypoint at the point where line of sight is lost.
In some implementations, generating the sparse graph includes generating a full configuration space map based on the body obstacle map. The full configuration space map includes a two-dimensional grid of elements, where each element of the grid represents a space of the environment and each element of the full configuration space map includes a respective set of yaw configurations. Each yaw configuration may be classified as valid or invalid, where a valid yaw configuration represents a yaw configuration of the robot that is safe from contacting obstacles at the space associated with the respective element and an invalid yaw configuration represents a yaw configuration of the robot that is not safe from contacting obstacles at the space associated with the respective element. The method may also include generating a compressed configuration space map from the full configuration space map. The compressed configuration space map includes a second two-dimensional grid of elements and each element of the second grid represents a space of the environment. Each element of the second grid is also categorized as one of (i) a yaw collision zone, (ii) a yaw free zone, or (iii) a yaw constrained zone. The method, in some examples, includes generating the sparse graph from the compressed configuration space map.
Optionally, planning the coarse path from the first node to the second node includes generating a dense graph from the compressed configuration space map. The dense graph includes elements categorized as yaw free zone. The method may also include linking edges from the sparse graph with elements from the dense graph. Linking the edges with elements from the dense graph, in some implementations, includes combining the sparse graph and the dense graph to generate a final graph. In some examples, the method also includes executing an A* search algorithm on the final graph.
In some examples, generating the sparse graph further includes overlaying a plurality of Voronoi cells onto the compressed configuration space map, where each Voronoi cell is categorized as yaw constrained zone on the sparse graph. Each Voronoi cell is equidistant from at least two elements categorized as yaw collision zone. Optionally, generating the sparse graph further includes classifying each Voronoi cell as either an edge or a node. Classifying each Voronoi cell may include executing a flood fill algorithm.
In some implementations, planning the coarse path from the first node to the second node includes pruning edges, and each pruned edge includes elements under a threshold length and categorized as either yaw collision zone or yaw constrained zone. Determining the point along the coarse path where line of sight by the image sensor is lost may include determining a minimum allowable yaw and a maximum allowable yaw at each element along the planned coarse path and determining a smallest envelope based on the minimum allowable yaw and the maximum allowable yaw. The method may also include determining that a required yaw at a point on the coarse path is outside the smallest envelope.
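One way to picture this smallest-envelope test is as a running intersection of the allowable yaw intervals along the path. The Python sketch below is a minimal, hypothetical rendering of that reading: the field names (min_yaw, max_yaw, required_yaw) are assumptions, and angle wrap-around is ignored for brevity.

```python
def find_line_of_sight_loss(path):
    """Return the index of the first path element whose required yaw
    falls outside the running (smallest) envelope of allowable yaws,
    or None if the whole path stays inside the envelope.

    `path` is a list of dicts with hypothetical keys 'min_yaw' and
    'max_yaw' (the allowable yaw interval at that element) and
    'required_yaw' (the heading the robot must hold there)."""
    env_min, env_max = float("-inf"), float("inf")
    for i, elem in enumerate(path):
        # Shrink the envelope to the tightest interval seen so far.
        env_min = max(env_min, elem["min_yaw"])
        env_max = min(env_max, elem["max_yaw"])
        if not env_min <= elem["required_yaw"] <= env_max:
            return i  # line of sight lost here
    return None
```

Under this reading, an intermediate waypoint would be generated at the returned point, where line of sight is lost.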
In some examples, the method further includes sending, by the data processing hardware, the navigation route with the high-level waypoints and the at least one intermediate waypoint to a low-level path generator. The method may also include determining, by the data processing hardware, whether to add the at least one intermediate waypoint to the navigation route and, in response to determining not to add the at least one intermediate waypoint to the navigation route, passing, by the data processing hardware, the navigation route unaltered to the low-level path generator.
Another aspect of the disclosure provides a robot that includes a body and legs coupled to the body and configured to maneuver the robot about an environment. The robot also includes data processing hardware in communication with the legs and memory hardware in communication with the data processing hardware. The memory hardware stores instructions that when executed on the data processing hardware cause the data processing hardware to perform operations. The operations include receiving a navigation route. The navigation route includes a series of high-level waypoints that begin at a starting location and end at a destination location and are based on high-level navigation data. The high-level navigation data is representative of locations of static obstacles in an area the robot is to navigate. The operations also include receiving image data of an environment about the robot from an image sensor. The operations also include generating at least one intermediate waypoint based on the image data. The operations also include adding the at least one intermediate waypoint to the series of high-level waypoints of the navigation route and navigating the robot from the starting location along the series of high-level waypoints and the at least one intermediate waypoint toward the destination location.
This aspect may include one or more of the following optional features. In some implementations, each high-level waypoint and each intermediate waypoint include two coordinates indicating a position on a plane, a yaw value, and a time value. The time value indicates an estimated amount of time for the robot to navigate to the respective waypoint. In some examples, the operations further include maintaining each of the series of high-level waypoints on the navigation route. The operations may also include receiving a body obstacle map. The body obstacle map includes a map of obstacles that a body of the robot cannot traverse. Additionally, generating the at least one intermediate waypoint may include generating a sparse graph based on the body obstacle map. The sparse graph includes a list of nodes and edges. The nodes and edges are representative of paths the robot may travel in the environment. The operations may also include planning a coarse path from a first node from the list of nodes to a second node from the list of nodes. The first node and the second node are each representative of a space in the environment. The operations may also include determining a point along the coarse path where line of sight by the image sensor is lost and generating one of the at least one intermediate waypoint at the point where line of sight is lost.
In some implementations, generating the sparse graph includes generating a full configuration space map based on the body obstacle map. The full configuration space map includes a two-dimensional grid of elements, where each element of the grid represents a space of the environment and each element of the full configuration space map includes a respective set of yaw configurations. Each yaw configuration may be classified as valid or invalid, where a valid yaw configuration represents a yaw configuration of the robot that is safe from contacting obstacles at the space associated with the respective element and an invalid yaw configuration represents a yaw configuration of the robot that is not safe from contacting obstacles at the space associated with the respective element. The operations may also include generating a compressed configuration space map from the full configuration space map. The compressed configuration space map includes a second two-dimensional grid of elements and each element of the second grid represents a space of the environment. Each element of the second grid is also categorized as one of (i) a yaw collision zone, (ii) a yaw free zone, or (iii) a yaw constrained zone. The operations, in some examples, include generating the sparse graph from the compressed configuration space map.
Optionally, planning the coarse path from the first node to the second node includes generating a dense graph from the compressed configuration space map. The dense graph includes elements categorized as yaw free zone. The operations may also include linking edges from the sparse graph with elements from the dense graph. Linking the edges with elements from the dense graph, in some implementations, includes combining the sparse graph and the dense graph to generate a final graph. In some examples, the operations also include executing an A* search algorithm on the final graph.
In some examples, generating the sparse graph further includes overlaying a plurality of Voronoi cells onto the compressed configuration space map, where each Voronoi cell is categorized as yaw constrained zone on the sparse graph. Each Voronoi cell is equidistant from at least two elements categorized as yaw collision zone. Optionally, generating the sparse graph further includes classifying each Voronoi cell as either an edge or a node. Classifying each Voronoi cell may include executing a flood fill algorithm.
In some implementations, planning the coarse path from the first node to the second node includes pruning edges, and each pruned edge includes elements under a threshold length and categorized as either yaw collision zone or yaw constrained zone. Determining the point along the coarse path where line of sight by the image sensor is lost may include determining a minimum allowable yaw and a maximum allowable yaw at each element along the planned coarse path and determining a smallest envelope based on the minimum allowable yaw and the maximum allowable yaw. The operations may also include determining that a required yaw at a point on the coarse path is outside the smallest envelope.
In some examples, the operations further include sending the navigation route with the high-level waypoints and the at least one intermediate waypoint to a low-level path generator. The operations may also include determining whether to add the at least one intermediate waypoint to the navigation route and, in response to determining not to add the at least one intermediate waypoint to the navigation route, passing the navigation route unaltered to the low-level path generator.
The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION
As legged robotic devices (also referred to as “robots”) become more prevalent, there is an increasing need for the robots to autonomously navigate environments that are constrained in a number of ways. Often, the robots rely on high-level map data that stores information relating to large and/or static objects (e.g., walls, doors, etc.). When given a destination goal, the robot will often first plot a path or course from this high-level map to navigate the static obstacles and then rely on a perception system that gathers low-level navigation data to navigate around small and dynamic objects encountered while traveling. Frequently, when navigating these sorts of environments, the robot encounters an obstacle that is not accounted for in the high-level map but is too large for the low-level system to navigate around. In these situations, it is not uncommon for the robot to become stuck or trapped. Implementations herein are directed toward systems and methods for an intermediate waypoint generator that adds intermediate waypoints to a high-level route in real time, thus allowing a legged robotic device to efficiently navigate around larger dynamic objects.
In some implementations, the robot 10 further includes one or more appendages, such as an articulated arm 20 disposed on the body 11 and configured to move relative to the body 11. The articulated arm 20 may have five or more degrees of freedom. Moreover, the articulated arm 20 may be interchangeably referred to as a manipulator arm or simply an appendage. In the example shown, the articulated arm 20 includes two portions 22, 24 rotatable relative to one another and also the body 11; however, the articulated arm 20 may include more or fewer portions without departing from the scope of the present disclosure. The first portion 22 may be separated from the second portion 24 by an articulated arm joint 26. An end effector 28, which may be interchangeably referred to as a manipulator head 28, may be coupled to a distal end of the second portion 24 of the articulated arm 20 and may include one or more actuators 29 for gripping/grasping objects.
The robot 10 also includes a vision system 30 with at least one imaging sensor or camera 31, each sensor or camera 31 capturing image data or sensor data 17 of the environment 8 surrounding the robot 10 with an angle of view 32 and within a field of view 34. The vision system 30 may be configured to move the field of view 34 by adjusting the angle of view 32 or by panning and/or tilting (either independently or via the robot 10) the camera 31 to move the field of view 34 in any direction. Alternatively, the vision system 30 may include multiple sensors or cameras 31 such that the vision system 30 captures a generally 360-degree field of view around the robot 10.
The camera(s) 31 of the vision system 30, in some implementations, include one or more stereo cameras (e.g., one or more RGBD stereo cameras). In other examples, the vision system 30 includes one or more radar sensors such as a scanning light-detection and ranging (LIDAR) sensor, or a scanning laser-detection and ranging (LADAR) sensor, a light scanner, a time-of-flight sensor, or any other three-dimensional (3D) volumetric image sensor (or any such combination of sensors). In some implementations, the navigation system 100 includes a perception system 102 that receives and processes the sensor data 17 and passes the processed sensor data 17 to the intermediate waypoint generator 120 and low-level path generator 130.
The vision system 30 provides image data or sensor data 17 derived from image data captured by the cameras or sensors 31 to data processing hardware 36 of the robot 10. The data processing hardware 36 is in digital communication with memory hardware 38 and, in some implementations, may be a remote system. The remote system may be a single computer, multiple computers, or a distributed system (e.g., a cloud environment) having scalable/elastic computing resources and/or storage resources. A navigation system 100 of the robot 10 executes on the data processing hardware 36.
In the example shown, the navigation system 100 includes a high-level waypoint generator 110 that receives map data 50 (i.e., high-level navigation data representative of locations of static obstacles in an area the robot 10 is to navigate) and generates one or more high-level waypoints 210.
In some implementations, at least a portion of the navigation system 100 executes on a remote device in communication with the robot 10. For instance, the high-level waypoint generator 110 may execute on a remote device to generate one or more of the high-level waypoints 210, and the intermediate waypoint generator 120 executing on the robot 10 may receive the high-level waypoints 210 from the remote device. Optionally, the entire navigation system 100 may execute on a remote device, and the remote device may control/instruct the robot 10 to maneuver about the environment 8 based on the step plan 142.
The sparse graph generator 410 passes the sparse graph 416S to the coarse path planner 420. The coarse path planner 420 plots a coarse path 1310.
In some implementations, the sparse graph generator 410 initially generates a full configuration space map 414 based on the received body obstacle map 412. In some examples, the full configuration space map 414 includes a two-dimensional (2D) grid or array of elements 510, 510a-n, where each element 510 represents an area of the environment 8 (e.g., a three centimeter by three centimeter square) that indicates whether or not an obstacle prevents the robot 10 from entering the particular element 510. Optionally, the full configuration space map 414 is an array of 128 elements by 128 elements. In some examples, each element 510 may be associated with a respective set of yaw configurations 512, 512a-n, and the full configuration space map 414 may indicate a status for each of the respective yaw configurations 512 for each element 510. For example, each element 510 may be associated with 64 respective yaw configurations. In this example, each yaw configuration would be rotated 5.625 degrees from the previous configuration, as 360 degrees (i.e., a full circle) divided by 64 (i.e., the number of yaw configurations) equals 5.625 degrees. The full configuration space map 414 may indicate, for each respective yaw configuration 512 associated with a respective element 510, whether or not the respective yaw configuration 512 will result in a collision with a body obstacle if a select point of the robot 10 occupies the space represented by the element 510. That is, the full configuration space map 414 may indicate whether each yaw configuration 512 is valid (i.e., the robot 10, in that yaw configuration 512, would not collide with an obstacle) or invalid (i.e., the robot 10, in that yaw configuration 512, would collide with an obstacle). The select point may, for example, include a center point of the body 11 of the robot 10 (e.g., equidistant from the top of each leg 12).
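As a rough illustration of how such a map could be computed, the following Python sketch tests a rectangular robot footprint at each of the 64 yaw configurations against the body obstacle grid; the footprint dimensions, sample density, cell size, and the treatment of off-grid samples are illustrative assumptions, not details from the disclosure.

```python
import numpy as np

def full_configuration_space(obstacles, num_yaws=64,
                             body_len=1.1, body_width=0.5, cell=0.03):
    """Build a (rows x cols x num_yaws) boolean validity grid from a 2D
    boolean body obstacle grid (True = obstacle). A yaw is valid at an
    element if a rectangular footprint centered there, rotated to that
    yaw, overlaps no obstacle element."""
    h, w = obstacles.shape
    valid = np.zeros((h, w, num_yaws), dtype=bool)
    # Sample points covering the footprint, in robot-frame meters.
    xs = np.linspace(-body_len / 2, body_len / 2, 9)
    ys = np.linspace(-body_width / 2, body_width / 2, 5)
    px, py = np.meshgrid(xs, ys)
    pts = np.stack([px.ravel(), py.ravel()])              # 2 x N
    for k in range(num_yaws):
        yaw = 2 * np.pi * k / num_yaws                    # 5.625-degree steps
        c, s = np.cos(yaw), np.sin(yaw)
        rot = (np.array([[c, -s], [s, c]]) @ pts) / cell  # grid units
        for r in range(h):
            for q in range(w):
                rr = np.round(rot[1] + r).astype(int)
                cc = np.round(rot[0] + q).astype(int)
                ok = (rr >= 0) & (rr < h) & (cc >= 0) & (cc < w)
                # Off-grid samples are treated as free for simplicity.
                valid[r, q, k] = not obstacles[rr[ok], cc[ok]].any()
    return valid
```

A production version would vectorize the inner loops, but the mapping from each element and yaw configuration to a valid/invalid flag is the same.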
In some implementations, the sparse graph generator 410 generates a compressed configuration space map 500 from the full configuration space map 414. The compressed configuration space map 500 may categorize each element 510 of the configuration space map 500 instead of associating each element with a number of yaw configurations 512. For example, each element 510 may be categorized as a body obstacle element 510A, a yaw collision element 510B, a yaw free element 510C, or a yaw constrained element 510D.
The yaw free elements 510C are the opposite of the yaw collision elements 510B in that no yaw configurations 512 will result in a collision with a body obstacle. Put another way, a yaw free element 510C represents an area in the environment 8 where the robot 10 may spin through all yaw configurations 512 (i.e., spin 360 degrees with a center of axis at the select point) without colliding with a body obstacle. The yaw constrained elements 510D are elements where some yaw configurations 512 are safe or valid (i.e., the robot 10 will not collide with an obstacle) and some yaw configurations 512 are not safe (i.e., invalid yaw configurations 512 where the robot 10, in those yaw configurations 512, will collide with an obstacle). For example, the robot 10 may be able to travel next to and parallel to a wall, but turning 90 degrees may cause a collision with the wall. In this example, some yaw configurations 512 (i.e., the yaw configurations 512 where the robot 10 is parallel with the wall) are safe while others (i.e., the yaw configurations 512 where the robot 10 is perpendicular with the wall) are not.
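Given a per-element, per-yaw validity grid like the one sketched earlier, this categorization reduces to a few array reductions. The sketch below is a minimal version; the numeric category codes are assumptions for illustration.

```python
import numpy as np

# Hypothetical category codes for the compressed configuration space map.
BODY_OBSTACLE, YAW_COLLISION, YAW_FREE, YAW_CONSTRAINED = 0, 1, 2, 3

def compress_configuration_space(valid, obstacles):
    """Collapse a (rows x cols x num_yaws) validity grid into one
    category per element: all yaws valid -> yaw free; no yaw valid ->
    yaw collision (or body obstacle if the element itself holds an
    obstacle); a mix of valid and invalid yaws -> yaw constrained."""
    any_valid = valid.any(axis=2)
    all_valid = valid.all(axis=2)
    out = np.full(obstacles.shape, YAW_CONSTRAINED, dtype=np.uint8)
    out[all_valid] = YAW_FREE
    out[~any_valid] = YAW_COLLISION
    out[obstacles] = BODY_OBSTACLE
    return out
```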
The sparse graph generator 410, in some examples, determines valid paths through yaw constrained zones 510DZ (i.e., zones of yaw constrained elements 510D) of the compressed configuration space map 500.
To determine the elements 510 to mark as Voronoi cells 610, the sparse graph generator 410 determines the elements 510 (of the configuration space map 500) that are equidistant from obstacles (i.e., body obstacle elements 510A) on the configuration space map 500. That is, each element 510 inherently has a distance and direction to a nearest obstacle boundary.
In the example shown, when the sparse graph generator 410 analyzes the element 510 labeled ‘A’, the sparse graph generator 410 determines that the nearest yaw collision element 510B (or body obstacle element 510A) to element ‘A’ is element 510a. The sparse graph generator 410 then determines the nearest yaw collision element 510B to the right-hand neighbor of the element (labeled ‘B’), which in this case is element 510b. The sparse graph generator 410 may then determine the nearest yaw collision element 510B to the bottom neighbor of the element (labeled ‘C’), which in this case is again element 510a. The sparse graph generator 410 may then determine, in some examples, whether two of the three distances between those nearest obstacle boundaries (i.e., the distances between the boundaries nearest ‘A’ and ‘B’, ‘A’ and ‘C’, and ‘B’ and ‘C’) exceed a threshold distance. When an element is equidistant from two obstacle boundaries, the sparse graph generator 410 may determine the distance between those two boundary elements.
In this example, the distance between the nearest obstacle boundaries of ‘A’ and ‘B’ is ten elements, the distance between the nearest obstacle boundaries of ‘A’ and ‘C’ is zero elements (i.e., they share the same nearest obstacle boundary) and the distance between the nearest obstacle boundaries of ‘B’ and ‘C’ is again ten elements. When the threshold distance is less than ten elements, element ‘A’ will be marked as a Voronoi cell 610.
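A sketch of this neighbor-comparison test follows, using a Euclidean distance transform to obtain each element's nearest obstacle; the use of scipy and the default threshold are implementation choices, not details from the disclosure.

```python
import numpy as np
from scipy import ndimage

def mark_voronoi_cells(collision, threshold=10):
    """Mark elements that sit between distinct obstacle boundaries.

    `collision` is a 2D boolean grid of yaw collision / body obstacle
    elements. For each free element ('A'), compare its nearest obstacle
    with those of its right-hand ('B') and bottom ('C') neighbors; if
    two of the three pairwise gaps between those nearest obstacles meet
    the threshold, mark the element as a Voronoi cell."""
    # Indices of the nearest obstacle element for every grid element.
    idx = ndimage.distance_transform_edt(
        ~collision, return_distances=False, return_indices=True)
    nearest = np.stack(idx, axis=-1).astype(float)   # rows x cols x 2
    h, w = collision.shape
    voronoi = np.zeros((h, w), dtype=bool)
    for r in range(h - 1):
        for c in range(w - 1):
            if collision[r, c]:
                continue
            a = nearest[r, c]        # nearest obstacle to 'A'
            b = nearest[r, c + 1]    # nearest obstacle to 'B'
            d = nearest[r + 1, c]    # nearest obstacle to 'C'
            gaps = (np.linalg.norm(a - b), np.linalg.norm(a - d),
                    np.linalg.norm(b - d))
            if sum(g >= threshold for g in gaps) >= 2:
                voronoi[r, c] = True
    return voronoi
```

With the worked example above (gaps of ten, zero, and ten elements) and any threshold below ten elements, element ‘A’ is marked.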
As previously discussed, the sparse graph generator 410 may make a similar analysis for each element of the configuration space map 500.
After each element marked as a Voronoi cell 610 is assigned a distance, the sparse graph generator 410, in some examples, determines an end or end node 812 of each branch of Voronoi cells 610. The end node 812 of each branch has locally the largest distance from the arbitrarily selected element 510ff. From the end node 812 of a branch, the sparse graph generator 410, in some examples, marks each element 510 while traveling back toward the selected element 510ff. Each time the sparse graph generator 410 encounters an element 510 already previously marked, the sparse graph generator 410 may attach a unique label to that element.
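A minimal sketch of this flood fill and end-node detection follows; the breadth-first traversal and the local-maximum test are straightforward readings of the text, while the helper names are hypothetical.

```python
from collections import deque

def flood_fill_distances(voronoi, seed):
    """Breadth-first flood fill over marked Voronoi cells, assigning
    each cell its hop count from an arbitrarily selected seed element.
    Returns a dict mapping (row, col) -> distance."""
    h, w = len(voronoi), len(voronoi[0])
    dist = {seed: 0}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and voronoi[nr][nc]
                    and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    return dist

def branch_end_nodes(dist):
    """End nodes: cells whose distance is a local maximum among their
    marked neighbors (no neighbor lies farther from the seed)."""
    ends = []
    for (r, c), d in dist.items():
        nbrs = ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
        if all(dist.get(p, -1) <= d for p in nbrs):
            ends.append((r, c))
    return ends
```

Walking back from each end node toward the seed, labeling cells and splitting wherever an already-labeled cell is met, then yields the branches that become the edges 810 and nodes 812 of the sparse graph.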
To finalize the SE2 map 1100, in some implementations, the sparse graph generator 410 determines an entry and exit point from the map 1100 for the robot 10.
Once the new nodes 1220 have been placed, the sparse graph generator 410 determines if a starting position of the robot 10 (e.g., the current location of the robot) is connected to the forward yaw configuration node 1220, the backward yaw configuration node 1220, or both. That is, the sparse graph generator 410 determines if the robot 10 is capable of connecting to the nodes 1220 with the respective yaw configuration. The sparse graph generator 410, in some examples, determines an angle 1222 between the original or true goal 1210 and the two yaw configurations of the edge 810 along the new node 1220 (i.e., forward and backward).
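The figure-specific discussion of this test is not reproduced above, but one plausible reading is an angular-tolerance check between the true goal orientation and each candidate yaw configuration at the new node. The sketch below is hypothetical on that basis; max_turn in particular is an assumed tolerance, not a value from the disclosure.

```python
import math

def angle_diff(a, b):
    """Smallest signed difference between two angles, in radians."""
    return (a - b + math.pi) % (2 * math.pi) - math.pi

def reachable_goal_configs(goal_yaw, edge_yaw, max_turn=math.radians(60)):
    """Return which of the forward/backward yaw configuration nodes on
    an edge can serve the true goal, judging by the angle between the
    goal orientation and each configuration."""
    configs = {"forward": edge_yaw, "backward": edge_yaw + math.pi}
    return [name for name, yaw in configs.items()
            if abs(angle_diff(goal_yaw, yaw)) <= max_turn]
```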
In some examples, the sparse graph generator 410 connects nodes 812 even where the sparse graph 416S indicates the robot 10 will collide with an obstacle. For example, the sparse graph generator 410 may always assume that the robot turns by rotating about a center point, which may make some turns impossible. However, in some situations, the low-level path generator 130 may still be able to navigate the path by turning about a pivot point that is not at the center of the robot 10, and in these situations, the sparse graph generator 410 may create edges 810 between nodes 812 that the generator 120 may otherwise determine the robot incapable of traversing.
To generate the path 1310, the coarse path planner 420 may implement a search algorithm (e.g., an A* search algorithm). The coarse path planner 420 may use the search algorithm to search a dense grid or graph 416D of the 2D yaw free zones 510CZ to plot the path 1310. The sparse graph generator 410 may also generate the dense graph 416D and combine it with the sparse graph 416S and send the combined graph to the coarse path planner 420. Alternatively, the coarse path planner 420 may generate the dense graph 416D. The dense graph 416D is a list of interconnected nodes of yaw free zones 510CZ determined from the compressed configuration space map 500. That is, just as the sparse graph 416S includes a list of interconnected nodes 812 of elements 510 within yaw constrained zones 510DZ, the dense graph 416D includes a list of interconnected nodes of elements 510 within yaw free zones 510CZ. Because the robot 10 may spin freely in yaw free zones 510CZ, the yaw free space needs no orientation (i.e., yaw) information, and the coarse path planner 420 may search the dense graph 416D (dense relative to the sparse graph 416S). The coarse path planner 420 may combine the sparse graph 416S and the dense graph 416D to form a final graph 416F. The final graph 416F includes all interconnected nodes within both the yaw free zones 510CZ and the yaw constrained zones 510DZ. The coarse path planner 420 may also use the search algorithm to search the final graph 416F for the Voronoi cells 610 in yaw constrained zones 510DZ with the included orientation information and the yaw free zones 510CZ. The coarse path planner 420 may link the yaw free zone 510CZ with the yaw constrained zone 510DZ and traverse the yaw constrained zone 510DZ with the coarse path 1310.
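The A* search itself is standard. For concreteness, a self-contained Python version over an adjacency callable might look as follows; the final graph 416F would be exposed through the neighbors function, with the heuristic an admissible estimate such as straight-line distance to the goal.

```python
import heapq
import itertools
import math

def a_star(neighbors, start, goal, heuristic):
    """Textbook A* over a graph given as a callable
    `neighbors(node) -> [(next_node, edge_cost), ...]`. The heuristic
    must never overestimate the true remaining cost."""
    tie = itertools.count()  # breaks ties between equal-cost entries
    open_set = [(heuristic(start), next(tie), 0.0, start)]
    best = {start: 0.0}
    came_from = {}
    while open_set:
        _, _, g, node = heapq.heappop(open_set)
        if node == goal:
            path = [node]  # rebuild the path start -> goal
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if g > best.get(node, math.inf):
            continue  # stale heap entry; a cheaper route was found
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best.get(nxt, math.inf):
                best[nxt] = ng
                came_from[nxt] = node
                heapq.heappush(
                    open_set, (ng + heuristic(nxt), next(tie), ng, nxt))
    return None  # goal unreachable
```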
Because the dense graph 416D lacks orientation information (and thus the final graph 416F lacks orientation information in the yaw free zones 510CZ), the coarse path planner 420, in some implementations, determines whether any orientation (i.e., forward or backward) for any edges 810 should be reversed. Optionally, the coarse path planner 420 is biased to walk forward facing along an edge 810. The coarse path planner 420 may identify segments of the path 1310 that are capable of having the yaw orientation reversed (i.e., both forward and backward orientations are valid). The coarse path planner 420 may then calculate a cost (i.e., an amount of rotation required) of traversing the path 1310.
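As a rough sketch of that cost comparison, assume the cost of traversing a segment is the accumulated yaw change, including the turn into the segment from the robot's entry yaw (an assumption; the disclosure says only that the cost is an amount of rotation required):

```python
import math

def angle_diff(a, b):
    """Smallest signed difference between two angles, in radians."""
    return (a - b + math.pi) % (2 * math.pi) - math.pi

def traversal_rotation(entry_yaw, segment_yaws, reverse):
    """Total rotation needed to turn into and then follow a segment,
    optionally traversed with the body reversed (yaw + pi)."""
    total, prev = 0.0, entry_yaw
    for y in segment_yaws:
        y = y + math.pi if reverse else y
        total += abs(angle_diff(y, prev))
        prev = y
    return total

def pick_orientation(entry_yaw, segment_yaws, reversible):
    """Choose the cheaper orientation for a segment; biased to walking
    forward facing on ties, as described above."""
    fwd = traversal_rotation(entry_yaw, segment_yaws, reverse=False)
    if not reversible:
        return "forward", fwd
    bwd = traversal_rotation(entry_yaw, segment_yaws, reverse=True)
    return ("backward", bwd) if bwd < fwd else ("forward", fwd)
```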
In some examples, the waypoint placer 430 also adds an intermediate waypoint 310 when moving from a yaw free zone 510CZ to a yaw constrained zone 510DZ if orientation changes by more than a threshold amount (e.g., 60 degrees). Optionally, the waypoint placer 430 may add an intermediate waypoint 310 at the end of the path 1310 when the orientation at the goal (i.e., a goal pose) is off the current map (i.e., too far from the robot's current location) and line-of-sight to the goal is lost. Additionally, the waypoint placer 430 may shift waypoints away from junctions in narrow spaces. For example, an intermediate waypoint 310 may be projected down the path 1310 until outside a threshold radius around the junction.
High-level waypoints 210 may include two coordinates indicating a position on a plane (i.e., x and y coordinates), a yaw value, and a time value. The time value may indicate an estimated amount of time for the robot to navigate to the respective waypoint 310. Along with the x, y, and yaw values, the waypoint placer 430 may similarly add an estimated time value to the intermediate waypoints 310. The estimated time value may be determined by taking the maximum amount of time from the starting location to the waypoint in x, y, and yaw using any speed limits received. The waypoint placer 430 may offset the time for each high-level waypoint 210 on the planned trajectory after the intermediate waypoint 310.
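A minimal sketch of that time estimate follows, taking the maximum of per-axis travel times under per-axis speed limits; the limit values shown are illustrative assumptions.

```python
import math

def waypoint_time(start, waypoint, v_x=1.0, v_y=0.5, v_yaw=1.0):
    """Estimate the time to reach a waypoint as the maximum of the
    times needed along x, y, and yaw given speed limits (m/s, m/s,
    rad/s). `start` and `waypoint` are (x, y, yaw) tuples."""
    dx = abs(waypoint[0] - start[0])
    dy = abs(waypoint[1] - start[1])
    dyaw = abs((waypoint[2] - start[2] + math.pi) % (2 * math.pi) - math.pi)
    return max(dx / v_x, dy / v_y, dyaw / v_yaw)
```

Times for subsequent high-level waypoints on the planned trajectory would then be offset by the intermediate waypoint's estimate, as noted above.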
The method 1600 includes, at operation 1602, receiving, at data processing hardware 36 of the robot 10, a navigation route 112. The navigation route 112 includes a series of high-level waypoints 210 that begin at a starting location 113 and end at a destination location 114 and are based on high-level navigation data 50 representative of locations of static obstacles in the area the robot 10 is to navigate.
The method 1600 includes, at operation 1604, receiving, at the data processing hardware 36, image data 17 of an environment 8 about the robot 10 from an image sensor 31. At operation 1606, the method 1600 includes generating, by the data processing hardware 36, at least one intermediate waypoint 310 based on the image data 17 and, at operation 1608, adding, by the data processing hardware 36, the at least one intermediate waypoint 310 to the series of high-level waypoints 210 of the navigation route 112. At operation 1610, the method 1600 includes navigating, by the data processing hardware 36, the robot 10 from the starting location 113 along the series of high-level waypoints 210 and the at least one intermediate waypoint 310 toward the destination location 114.
The computing device 1700 includes a processor 1710 (e.g., data processing hardware 36), memory 1720 (e.g., memory hardware 38), a storage device 1730, a high-speed interface/controller 1740 connecting to the memory 1720 and high-speed expansion ports 1750, and a low-speed interface/controller 1760 connecting to a low-speed bus 1770 and the storage device 1730. Each of the components 1710, 1720, 1730, 1740, 1750, and 1760 is interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 1710 (e.g., data processing hardware) can process instructions for execution within the computing device 1700, including instructions stored in the memory 1720 or on the storage device 1730 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as a display 1780 coupled to the high-speed interface 1740. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 1700 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system). The processor 1710 may execute the intermediate waypoint generator 120.
The memory 1720 stores information non-transitorily within the computing device 1700. The memory 1720 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 1720 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the computing device 1700. Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM)/programmable read-only memory (PROM)/erasable programmable read-only memory (EPROM)/electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM) as well as disks or tapes.
The storage device 1730 is capable of providing mass storage for the computing device 1700. In some implementations, the storage device 1730 is a computer-readable medium. In various different implementations, the storage device 1730 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 1720, the storage device 1730, or memory on processor 1710.
The high speed controller 1740 manages bandwidth-intensive operations for the computing device 1700, while the low speed controller 1760 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only. In some implementations, the high-speed controller 1740 is coupled to the memory 1720 and to the high-speed expansion ports 1750, which may accept various expansion cards (not shown). In some implementations, the low-speed controller 1760 is coupled to the storage device 1730 and a low-speed expansion port 1790. The low-speed expansion port 1790, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
Various implementations of the systems and techniques described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, non-transitory computer readable medium, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
The processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.
Claims
1. A computer-implemented method when executed by data processing hardware of a robot causes the data processing hardware to perform operations comprising:
- receiving image data of an environment of the robot from an image sensor;
- receiving a body obstacle map comprising a map of obstacles that a body of the robot cannot traverse;
- generating a sparse graph based on the body obstacle map, the sparse graph comprising a set of nodes and edges that begin at a starting location and end at a destination, the nodes and edges representative of paths the robot may travel in the environment;
- planning a coarse path from the starting location to the destination;
- determining a point along the coarse path between a first node of the set of nodes and a second node of the set of nodes where line of sight by the image sensor is lost;
- generating an intermediate waypoint at the point along the coarse path where line of sight by the image sensor is lost; and
- navigating the robot from the starting location along the coarse path and the intermediate waypoint toward the destination.
2. The method of claim 1, wherein planning the coarse path comprises determining a route of valid interconnections between respective nodes and edges of the set of nodes and edges.
3. The method of claim 1, wherein generating the sparse graph comprises:
- generating a full configuration space map based on the body obstacle map, the full configuration space map comprising a two-dimensional grid of elements, each element of the grid representing a space of the environment, and each element of the full configuration space map comprising a respective set of yaw configurations; and
- generating the sparse graph from the full configuration space map.
4. The method of claim 3, wherein:
- each yaw configuration is classified as valid or invalid;
- a valid yaw configuration represents a yaw configuration of the robot that is safe from contacting obstacles at the space associated with the respective element; and
- an invalid yaw configuration represents a yaw configuration of the robot that is not safe from contacting obstacles at the space associated with the respective element.
5. The method of claim 3, wherein generating the sparse graph comprises:
- generating, from the full configuration space map, a compressed configuration space map comprising a second two-dimensional grid of elements, each element of the second grid representing a space of the environment, and each element of the second grid categorized as one of: a yaw collision zone, a yaw free zone, or a yaw constrained zone; and
- generating the sparse graph from the compressed configuration space map.
6. The method of claim 5, wherein planning the coarse path from the starting location to the destination comprises:
- generating a dense graph from the compressed configuration space map, the dense graph comprising elements categorized as yaw free zone; and
- linking edges from the sparse graph with elements from the dense graph.
7. The method of claim 6, wherein linking the edges with elements from the dense graph comprises:
- combining the sparse graph and the dense graph to generate a final graph; and
- executing an A* search algorithm on the final graph.
8. The method of claim 1, wherein determining the point along the coarse path where line of sight by the image sensor is lost comprises:
- determining a minimum allowable yaw and a maximum allowable yaw at each element along the planned coarse path;
- determining a smallest envelope based on the minimum allowable yaw and the maximum allowable yaw; and
- determining that a required yaw at a point on the coarse path is outside the smallest envelope.
9. The method of claim 1, wherein the destination represents a location in the environment of the robot.
10. The method of claim 1, wherein the destination comprises a respective node of the set of nodes of the sparse graph.
11. A robot comprising:
- a body;
- legs coupled to the body and configured to maneuver the robot about an environment;
- data processing hardware in communication with the legs; and
- memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations comprising: receiving image data of an environment of the robot from an image sensor; receiving a body obstacle map comprising a map of obstacles that the body of the robot cannot traverse; generating a sparse graph based on the body obstacle map, the sparse graph comprising a set of nodes and edges that begin at a starting location and end at a destination, the nodes and edges representative of paths the robot may travel in the environment; planning a coarse path from the starting location to the destination; determining a point along the coarse path between a first node of the set of nodes and a second node of the set of nodes where line of sight by the image sensor is lost; generating an intermediate waypoint at the point along the coarse path where line of sight by the image sensor is lost; and navigating the robot from the starting location along the coarse path and the intermediate waypoint toward the destination.
12. The robot of claim 11, wherein planning the coarse path comprises determining a route of valid interconnections between respective nodes and edges of the set of nodes and edges.
13. The robot of claim 11, wherein generating the sparse graph comprises:
- generating a full configuration space map based on the body obstacle map, the full configuration space map comprising a two-dimensional grid of elements, each element of the grid representing a space of the environment, and each element of the full configuration space map comprising a respective set of yaw configurations; and
- generating the sparse graph from the full configuration space map.
14. The robot of claim 13, wherein:
- each yaw configuration is classified as valid or invalid;
- a valid yaw configuration represents a yaw configuration of the robot that is safe from contacting obstacles at the space associated with the respective element; and
- an invalid yaw configuration represents a yaw configuration of the robot that is not safe from contacting obstacles at the space associated with the respective element.
15. The robot of claim 13, wherein generating the sparse graph comprises:
- generating, from the full configuration space map, a compressed configuration space map comprising a second two-dimensional grid of elements, each element of the second grid representing a space of the environment, and each element of the second grid categorized as one of: a yaw collision zone, a yaw free zone, or a yaw constrained zone; and
- generating the sparse graph from the compressed configuration space map.
16. The robot of claim 15, wherein planning the coarse path from the starting location to the destination comprises:
- generating a dense graph from the compressed configuration space map, the dense graph comprising elements categorized as yaw free zone; and
- linking edges from the sparse graph with elements from the dense graph.
17. The robot of claim 16, wherein linking the edges with elements from the dense graph comprises:
- combining the sparse graph and the dense graph to generate a final graph; and
- executing an A* search algorithm on the final graph.
18. The robot of claim 11, wherein determining the point along the coarse path where line of sight by the image sensor is lost comprises:
- determining a minimum allowable yaw and a maximum allowable yaw at each element along the planned coarse path;
- determining a smallest envelope based on the minimum allowable yaw and the maximum allowable yaw; and
- determining that a required yaw at a point on the coarse path is outside the smallest envelope.
19. The robot of claim 11, wherein the destination represents a location in the environment of the robot.
20. The robot of claim 11, wherein the destination comprises a respective node of the set of nodes of the sparse graph.
Type: Grant
Filed: Feb 1, 2022
Date of Patent: Oct 3, 2023
Patent Publication Number: 20220155078
Assignee: Boston Dynamics, Inc. (Waltham, MA)
Inventors: Gina Christine Fay (Lexington, MA), Alfred Rizzi (Cambridge, MA)
Primary Examiner: Richard M Camby
Application Number: 17/649,620
International Classification: G01C 21/20 (20060101); G05D 1/00 (20060101); G05D 1/02 (20200101);