Distributed Robotic Guidance
The different illustrative embodiments provide an apparatus that includes a computer system, a number of structured light generators, and a number of mobile robotic devices. The computer system is configured to generate a path plan. The number of structured light generators is configured to project the path plan. The number of mobile robotic devices is configured to detect and follow the path plan.
This application is related to commonly assigned and co-pending U.S. patent application Ser. No. ______ (Attorney Docket No. 18444-US) entitled “Modular and Scalable Positioning and Navigation System”; and U.S. patent application Ser. No. ______ (Attorney Docket No. 18445-US) entitled “Asymmetric Stereo Vision System”, both of which are hereby incorporated by reference.
FIELD OF THE INVENTION

The present invention relates generally to systems and methods for robotic navigation and more particularly to systems and methods for guiding mobile robotic devices. Still more specifically, the present disclosure relates to a method and system utilizing a structured light output for guiding mobile robotic devices.
BACKGROUND OF THE INVENTION

The use of robotic devices to perform physical tasks has increased in recent years. Mobile robotic devices can be used to perform a variety of different tasks. These mobile devices may operate in semi-autonomous or fully autonomous modes. Some robotic devices are constrained to operate in a contained area, using different methods to obtain coverage within the contained area. Mobile robotic devices often rely on dead reckoning or use of a global positioning system to achieve area coverage. These systems tend to be either inefficient or cost-prohibitive.
SUMMARY

One or more of the different illustrative embodiments provide a method for providing a path plan. A guidance projection using a three-dimensional map is generated. The guidance projection is then projected onto a contained area.
The different illustrative embodiments further provide a method for navigating a path. A guidance projection in a contained area is detected. A path within the guidance projection is identified. The path is then followed.
The different illustrative embodiments further provide an apparatus that includes a computer system, a number of structured light generators, and a number of mobile robotic devices. The computer system is configured to generate a path plan. The number of structured light generators is configured to project the path plan. The number of mobile robotic devices is configured to detect and follow the path plan.
The features, functions, and advantages can be achieved independently in various embodiments of the present invention or may be combined in yet other embodiments in which further details can be seen with reference to the following description and drawings.
The novel features believed characteristic of the illustrative embodiments are set forth in the appended claims. The illustrative embodiments, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment of the present invention when read in conjunction with the accompanying drawings, wherein:
With reference to the figures and in particular with reference to FIG. 1, an illustration of a robotic service environment is depicted in which illustrative embodiments may be implemented.
Robotic service environment 100 contains worksite 102. Worksite 102 includes guidance system 104, number of mobile robotic devices 106, and contained area 108. As used herein, a number refers to one or more items. Guidance system 104 provides a navigational path for number of mobile robotic devices 106. Contained area 108 includes guidance projection 110, number of objects 112, and number of umbral areas 116. Guidance projection 110 is the navigational path as projected by guidance system 104 onto contained area 108. Contained area 108 is a bounded area in which mobile robotic devices 106 work. Contained area 108 may be determined by human input, e.g., a human may input property boundaries, or by physical limitations. Physical limitations may include, without limitation, fences, walls, landscaping, water, or any other physical limitation that bounds contained area 108. Guidance projection 110 is in the form of a structured light emission and contains instructions for number of mobile robotic devices 106. Number of mobile robotic devices 106 performs tasks using the instructions included in guidance projection 110.
Number of objects 112 may include objects such as, without limitation, trees, fences, playground equipment, light poles, fire hydrants, walls, furniture, railings, fixtures and/or any other object that may be present in contained area 108. Number of umbral areas 116 occurs when number of objects 112 in worksite 102 block guidance projection 110 from being projected onto portions of contained area 108. An umbral area is a shadowed area. In an illustrative example, when guidance system 104 emits guidance projection 110 onto contained area 108, number of objects 112 in worksite 102 may create a shadow on a portion of contained area 108. This shadow is a result of structured light emitted by guidance projection 110 inadequately reaching every portion of contained area 108. This shadow is an example of an umbral area in number of umbral areas 116.
In an illustrative embodiment, number of mobile robotic devices 106 may operate within worksite 102. Guidance system 104 may project guidance projection 110 onto contained area 108. Number of mobile robotic devices 106 may identify and follow a path, such as path 118, within guidance projection 110.
In an illustrative embodiment, number of mobile robotic devices 106 is stored in storage location 120. Operator 122 may utilize wireless control device 124 to guide number of mobile robotic devices 106 between storage location 120 and contained area 108. Once number of mobile robotic devices 106 is within contained area 108, number of mobile robotic devices 106 operates using guidance projection 110.
In an illustrative embodiment, in portions of contained area 108 that are farthest from guidance system 104, guidance projection 110 may be faint and wide due to geometric expansion of the guidance projection. Methods known in the art for analyzing the intensity of the reflected light across the guidance projection may be used to better define any line projections in the guidance projection.
The illustration of robotic service environment 100 in FIG. 1 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
The different illustrative embodiments recognize and take into account that currently used methods for robotic navigation rely on dead reckoning or localization and path planning. Systems which rely on dead reckoning are inexpensive but also inefficient. Given enough time, a system relying on dead reckoning will probably cover the work area. However, the amount of time required for 99.99% certainty that the total work area is covered may be very long. Additionally, systems which rely on localization and path planning are more accurate and efficient in area coverage but often cost-prohibitive. One example of a localization and path planning system is a global positioning system using satellites. Precise area coverage can be achieved using this localization system. However, the main drawbacks are cost, energy consumption (e.g., reduced battery operation), and accuracy issues due to satellite signal obstructions. Signals from satellites may be replaced with acoustic or electromagnetic signals from ground sources, which are then used by mobile robotic devices to triangulate position. However, overcoming accuracy issues from signal topology, multi-path, and attenuation, as well as power supply needs for beacons, increases the system costs.
Thus, one or more of the different illustrative embodiments provide an apparatus that includes a computer system, a number of structured light generators, and a number of mobile robotic devices. The computer system is configured to generate a path plan. The number of structured light generators is configured to project the path plan. The number of mobile robotic devices is configured to detect and follow the path plan.
An illustrative embodiment further provides a method for providing a path plan. A guidance projection using a three-dimensional map is generated. The guidance projection is then projected onto a contained area.
An additional illustrative embodiment provides a method and system for navigating a path. A guidance projection in a contained area is detected. A path within the guidance projection is identified. The path is then followed.
Illustrative embodiments provide a guidance system for a mobile robotic device which enables safety, low cost, extended battery life or a smaller battery, and high quality area coverage. The illustrative embodiments contribute to a low robot weight which enhances safety, low cost, extended battery life or a smaller battery.
The features, functions, and advantages can be achieved independently in various embodiments of the present invention or may be combined in yet other embodiments in which further details can be seen with reference to the following description and drawings.
With reference now to FIG. 2, a block diagram of a data processing system is depicted in accordance with an illustrative embodiment.
In this illustrative example, data processing system 200 includes communications fabric 202, which provides communications between processor unit 204, memory 206, persistent storage 208, communications unit 210, input/output (I/O) unit 212, and display 214. Depending on the particular implementation, different architectures and/or configurations of data processing system 200 may be used.
Processor unit 204 serves to execute instructions for software that may be loaded into memory 206. Processor unit 204 may be a set of one or more processors or may be a multi-processor core, depending on the particular implementation. Further, processor unit 204 may be implemented using one or more heterogeneous processor systems in which a main processor is present with secondary processors on a single chip. As another illustrative example, processor unit 204 may be a symmetric multi-processor system containing multiple processors of the same type.
Memory 206 and persistent storage 208 are examples of storage devices 216. A storage device is any piece of hardware that is capable of storing information, such as, for example without limitation, data, program code in functional form, and/or other suitable information either on a temporary basis and/or a permanent basis. Memory 206, in these examples, may be, for example, a random access memory or any other suitable volatile or non-volatile storage device. Persistent storage 208 may take various forms depending on the particular implementation. For example, persistent storage 208 may contain one or more components or devices. For example, persistent storage 208 may be a hard drive, a flash memory, a rewritable optical disk, a rewritable magnetic tape, or some combination of the above. The media used by persistent storage 208 also may be removable. For example, a removable hard drive may be used for persistent storage 208.
Communications unit 210, in these examples, provides for communications with other data processing systems or devices. In these examples, communications unit 210 is a network interface card. Communications unit 210 may provide communications through the use of either or both physical and wireless communications links.
Input/output unit 212 allows for input and output of data with other devices that may be connected to data processing system 200. For example, input/output unit 212 may provide a connection for user input through a keyboard, a mouse, and/or some other suitable input device. Further, input/output unit 212 may send output to a printer. Display 214 provides a mechanism to display information to a user.
Instructions for the operating system, applications and/or programs may be located in storage devices 216, which are in communication with processor unit 204 through communications fabric 202. In these illustrative examples, the instructions are in a functional form on persistent storage 208. These instructions may be loaded into memory 206 for execution by processor unit 204. The processes of the different embodiments may be performed by processor unit 204 using computer implemented instructions, which may be located in a memory, such as memory 206.
These instructions are referred to as program code, computer usable program code, or computer readable program code that may be read and executed by a processor in processor unit 204. The program code in the different embodiments may be embodied on different physical or tangible computer readable media, such as memory 206 or persistent storage 208.
Program code 218 is located in a functional form on computer readable media 220 that is selectively removable and may be loaded onto or transferred to data processing system 200 for execution by processor unit 204. Program code 218 and computer readable media 220 form computer program product 222 in these examples. In one example, computer readable media 220 may be in a tangible form, such as, for example, an optical or magnetic disc that is inserted or placed into a drive or other device that is part of persistent storage 208 for transfer onto a storage device, such as a hard drive that is part of persistent storage 208. In a tangible form, computer readable media 220 also may take the form of a persistent storage, such as a hard drive, a thumb drive, or a flash memory that is connected to data processing system 200. The tangible form of computer readable media 220 is also referred to as computer recordable storage media. In some instances, computer recordable media 220 may not be removable.
Alternatively, program code 218 may be transferred to data processing system 200 from computer readable media 220 through a communications link to communications unit 210 and/or through a connection to input/output unit 212. The communications link and/or the connection may be physical or wireless in the illustrative examples. The computer readable media also may take the form of non-tangible media, such as communications links or wireless transmissions containing the program code.
In some illustrative embodiments, program code 218 may be downloaded over a network to persistent storage 208 from another device or data processing system for use within data processing system 200. For instance, program code stored in a computer readable storage medium in a server data processing system may be downloaded over a network from the server to data processing system 200. The data processing system providing program code 218 may be a server computer, a client computer, or some other device capable of storing and transmitting program code 218.
The different components illustrated for data processing system 200 are not meant to provide architectural limitations to the manner in which different embodiments may be implemented. The different illustrative embodiments may be implemented in a data processing system including components in addition to or in place of those illustrated for data processing system 200. Other components shown in FIG. 2 can be varied from the illustrative examples shown.
As another example, a storage device in data processing system 200 is any hardware apparatus that may store data. Memory 206, persistent storage 208, and computer readable media 220 are examples of storage devices in a tangible form.
In another example, a bus system may be used to implement communications fabric 202 and may be comprised of one or more buses, such as a system bus or an input/output bus. Of course, the bus system may be implemented using any suitable type of architecture that provides for a transfer of data between different components or devices attached to the bus system. Additionally, a communications unit may include one or more devices used to transmit and receive data, such as a modem or a network adapter. Further, a memory may be, for example, memory 206 or a cache, such as found in an interface and memory controller hub that may be present in communications fabric 202.
With reference now to FIG. 3, a block diagram of a mobile guidance system is depicted in accordance with an illustrative embodiment. Mobile guidance system 300 includes computer system 302 and number of structured light generators 304.
Computer system 302 is configured to execute mapping, planning, and execution process 306, behavioral codes process 308, and monitoring process 310. Additionally, computer system 302 includes user interface 312 and segment codings 313. Mapping, planning, and execution process 306 may include projected path planning algorithms 314 to generate path plan 316. Projected path planning algorithms 314 may include algorithms such as a boustrophedon cellular decomposition algorithm.
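For illustration only, the following Python sketch shows the back-and-forth coverage pattern that a boustrophedon-style algorithm produces over a single rectangular, obstacle-free cell. The function name, dimensions, and swath width are assumptions; a full boustrophedon cellular decomposition would first split contained area 108 into such cells around number of objects 112.

```python
def boustrophedon_path(width_m, height_m, swath_m):
    """Generate back-and-forth ("as the ox plows") coverage waypoints
    for one rectangular, obstacle-free cell. A full boustrophedon
    cellular decomposition would first split the contained area into
    such cells; this sketch covers a single cell."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_m:
        if left_to_right:
            waypoints.append((0.0, y))
            waypoints.append((width_m, y))
        else:
            waypoints.append((width_m, y))
            waypoints.append((0.0, y))
        left_to_right = not left_to_right
        y += swath_m
    return waypoints

# Example: a 10 m x 6 m cell covered with a 1.5 m swath.
for waypoint in boustrophedon_path(10.0, 6.0, 1.5):
    print(waypoint)
```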
Behavioral codes process 308 provides navigational instructions to a mobile robotic device. Behavioral codes process 308 may provide a navigational instruction implemented as a solid line, which could mean “follow the line.” Additionally, behavioral codes process 308 may provide a navigational instruction implemented as a pattern such as “0101 0010,” where “0” is no light projection and “1” is a light projection of a certain length. This pattern may be, for example, an encoding of the letter “R” based on the American Standard Code for Information Interchange (ASCII). ASCII is a character-encoding scheme based on the ordering of the English alphabet. In an illustrative example, when this type of pattern or encoding is detected, a mobile robotic device could interpret the encoded letter “R” as an instruction to go tightly around an object detected on the mobile robotic device's “right” side until it encounters a solid line. This behavior continues even when the mobile robotic device is in the umbra of the given object. Behavioral codes process 308 may also provide instructions that continue to guide the mobile robotic device even when the light projection or path plan is no longer visible.
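As a minimal sketch of the pattern interpretation described above, the following Python function converts a detected eight-segment light pattern into its ASCII character; the function name is hypothetical and error handling is reduced to the essentials.

```python
def decode_behavior_code(bits):
    """Interpret a detected light pattern as one ASCII character.

    bits is a string of '0' (no light projection) and '1' (a light
    projection of a certain length), most significant bit first."""
    if len(bits) != 8 or set(bits) - {"0", "1"}:
        raise ValueError("expected an 8-bit pattern of 0s and 1s")
    return chr(int(bits, 2))

# "0101 0010" without the spacer decodes to the letter "R", which a
# mobile robotic device may interpret as "keep the detected object on
# the right until a solid line is encountered."
assert decode_behavior_code("01010010") == "R"
```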
Behavioral codes process 308 includes umbral guide process 318 and stop process 319. Umbral guide process 318 generates umbral guides. An umbral guide is a type of behavioral code. An umbral guide may be embedded within a guidance projection and provides guidance to a mobile robotic device in traversing a number of umbral areas, such as number of umbral areas 116 in FIG. 1.
Monitoring process 310 may monitor contained area 108 in FIG. 1 during execution of path plan 316 by a number of mobile robotic devices.
Number of structured light generators 304 contains number of processor units 322, number of cameras 324, number of communications units 326, and number of projectors 328. Number of processor units 322 may be an example of one implementation of data processing system 200 in FIG. 2.
Number of structured light generators 304 may be used in concert to emit path plan 316. A path plan is a set of navigational instructions that define the path a mobile robotic device may follow. Path plan 316 may include, for example, without limitation, instructions for guiding a number of mobile robotic devices, projecting decorative displays on a contained area, navigational instructions for the number of mobile robotic devices to follow even when the path plan or structured light is no longer visible, and/or any other suitable instructions. In the illustrative example of navigational instructions for after a path plan or structured light is no longer visible, path plan 316 may include navigational instructions to continue guidance of the mobile robotic device despite visibility limitations, for example. Structured light generator 330 is an example of one implementation of a structured light generator in number of structured light generators 304. Structured light generator 330 includes processor unit 332, number of cameras 334, communications unit 336, and projector 338. Processor unit 332 is configured to execute structured light generator process 340 and monitoring process 342. Structured light generator process 340 may cause projector 338 to project guidance projection 344.
Guidance projection 344 is an emission of light at a selected frequency. The frequency emitted as guidance projection 344 may depend on a number of factors, including, without limitation, cost and performance of emitters and detectors, reflective properties in the environment, ambient light in the environment, eye safety for animals and humans, and/or any other factor. For example, guidance projection 344 may be an emission of near infrared light. Guidance projection 344 may be emitted in a continuous or pulsed pattern.
In one illustrative embodiment, a pulsed pattern may allow a higher light intensity to be used while maintaining eye safety. In the illustrative example of pulsed light, there may be a need for additional synchronization of structured light generator 330 and number of mobile robotic devices 346. A number of ways to synchronize the structured light generator and the number of mobile robotic devices are known in the art. One way is to synchronize clocks in structured light generator 330 and number of mobile robotic devices 346. The pulsed pattern could then be sent at the start of each new 0.1 second interval for whatever duration is desired.
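A minimal sketch of this clock-based synchronization, assuming both clocks are already synchronized and that the pattern begins at each new 0.1 second boundary; the names and the camera call are hypothetical.

```python
import time

PULSE_INTERVAL_S = 0.1  # pattern starts at each new 0.1 second interval

def seconds_until_next_pulse(now=None):
    """Return how long to wait before the next pulse boundary, so a
    mobile robotic device can open its detection window exactly when
    the structured light generator begins emitting the pattern."""
    now = time.time() if now is None else now
    return PULSE_INTERVAL_S - (now % PULSE_INTERVAL_S)

# Example: wait for the next boundary, then sample the camera
# (the capture call itself is outside this sketch).
time.sleep(seconds_until_next_pulse())
```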
In another illustrative embodiment, additional information could be encoded in guidance projection 344 using color, such as yellow=slow speed, green=medium speed, blue=maximum speed, for example. Monitoring process 342 may monitor contained area 108 in FIG. 1 using number of images 345 taken by number of cameras 334.
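As a sketch of how such color coding might be decoded on a robot, the following snippet maps a sampled projection pixel to a speed command by nearest reference color; the RGB values and speeds are illustrative assumptions, not values from this disclosure.

```python
REFERENCE_COLORS = {
    "slow":    (255, 255, 0),  # yellow
    "medium":  (0, 255, 0),    # green
    "maximum": (0, 0, 255),    # blue
}
SPEEDS_M_S = {"slow": 0.3, "medium": 0.8, "maximum": 1.5}

def speed_from_pixel(rgb):
    """Map a sampled projection pixel to a speed command by choosing
    the nearest reference color (squared Euclidean distance in RGB)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(REFERENCE_COLORS, key=lambda k: dist2(rgb, REFERENCE_COLORS[k]))
    return SPEEDS_M_S[nearest]

print(speed_from_pixel((20, 30, 240)))  # near blue -> 1.5 (maximum)
```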
In an illustrative example, monitoring process 342 may use number of images 345 to track the positions of number of mobile robotic devices 346 relative to guidance projection 344. The positions of number of mobile robotic devices 346 may then be used to ensure guidance projection 344 is available to number of mobile robotic devices 346 and/or to discontinue aspects of line following process 504 in FIG. 5. For example, one aspect of line following process 504 may be to direct a mobile robotic device down a line path contained in guidance projection 344. Based on the tracked position of number of mobile robotic devices 346, this aspect may no longer be needed.
Monitoring process 342 may also detect changes in contained area 108 in FIG. 1.
Additionally, number of images 345 may be used to implement a security system. For example, number of images 345 could be used to detect a moving object, roughly the size and shape of a human, in a worksite, such as worksite 102 in FIG. 1.
Once mapping, planning, and execution process 306 generates path plan 316, path plan 316 is communicated to structured light generator process 340. Structured light generator process 340 then causes projector 338 to project guidance projection 344. Guidance projection 344 includes path plan 316. Guidance projection 344 is an example of one implementation of guidance projection 110 in FIG. 1.
Number of mobile robotic devices 346 may detect guidance projection 344 and identify path plan 316 within guidance projection 344. Guidance projection 344 may be projected in a number of sections onto a contained area. Projecting in a number of sections includes projecting simultaneous sections and consecutive sections. Projecting simultaneous sections may be achieved by implementing, for example, a number of projectors, each projector simultaneously projecting a section of guidance projection 344 onto worksite 102 in FIG. 1. Projecting consecutive sections may be achieved by projecting one section of guidance projection 344 at a time.
In an illustrative embodiment, mapping, planning, and execution process 306 may further include means known in the art for vehicular navigation based on site-specific sensor quality data to increase the pass-to-pass overlap to compensate for increased uncertainty in position, thus reducing the likelihood of skips in the area coverage between passes.
The illustration of mobile guidance system 300 in FIG. 3 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
Additionally, structured light generator process 340 may cause projector 338 to project decorative displays. Decorative displays may include, without limitation, holiday displays and game displays such as bounds for a badminton court or a four-square grid.
With reference now to FIG. 4, a block diagram of a mobile robotic device is depicted in accordance with an illustrative embodiment. Mobile robotic device 400 is an example of one implementation of a mobile robotic device in number of mobile robotic devices 106 in FIG. 1.
As illustrated, mobile robotic device 400 includes machine controller 402, behavior library 404, sensor system 406, mobility system 408, and communications unit 410.
Machine controller 402 includes, for example, control software 412. Machine controller 402 may be, for example, a data processing system, such as data processing system 200 in FIG. 2.
Machine controller 402 may execute processes using control software 412 to control mobility system 408. Mobility system 408 includes, for example, propulsion system 416, steering system 418, braking system 420, and number of mobility components 422. Machine controller 402 may execute processes using control software 412 to access information within behavior library 404 and sensor system 406. Machine controller 402 may send various commands 428 to these components to operate the mobile robotic device in different modes of operation. Commands 428 may take various forms depending on the implementation. For example, the commands may be analog electrical signals in which a voltage and/or current change is used to control these systems. In other implementations, the commands may take the form of data sent to the systems to initiate the desired actions.
In these examples, propulsion system 416 may propel or move mobile robotic device 400 in response to commands from machine controller 402. Propulsion system 416 may maintain or increase the speed at which a mobile robotic device moves in response to instructions received from machine controller 402. Propulsion system 416 may be an electrically controlled propulsion system. Propulsion system 416 may be, for example, an internal combustion engine, an internal combustion engine/electric hybrid system, an electric engine, or some other suitable propulsion system.
Steering system 418 may control the direction or steering of mobile robotic device 400 in response to commands received from machine controller 402. Steering system 418 may be, for example, an electrically controlled hydraulic steering system, an electrically driven rack and pinion steering system, an Ackerman steering system, a skid-steer steering system, a differential steering system, or some other suitable steering system.
Braking system 420 may slow down and/or stop mobile robotic device 400 in response to commands from machine controller 402. Braking system 420 may be an electrically controlled braking system. This braking system may be, for example, a hydraulic braking system, a friction braking system, a regenerative braking system, or some other suitable braking system that may be electrically controlled.
Number of mobility components 422 may provide the means for transporting mobile robotic device 400. Number of mobility components 422 includes, without limitation, wheels 424 and tracks 426.
Sensor system 406 may be a set of sensors used to collect information about the environment around a mobile robotic device. In these examples, the information is sent to machine controller 402 or computer system 302 in FIG. 3.
Behavior library 404 contains various behavioral processes specific to mobile robotic device coordination that can be called and executed by machine controller 402. Control software 412 may access behavior library 404 to identify and/or select behavioral processes to be executed by machine controller 402.
Communications unit 410 may provide multiple communications links to machine controller 402 to receive information. This information includes, for example, data, commands, and/or instructions.
Communications unit 410 may take various forms. For example, communications unit 410 may include a wireless communications system, such as a cellular phone system, a Wi-Fi wireless system, a Bluetooth wireless system, and/or some other suitable wireless communications system. Further, communications unit 410 also may include a communications port, such as, for example, a universal serial bus port, a serial interface, a parallel port interface, a network interface, and/or some other suitable port to provide a physical communications link.
The illustration of mobile robotic device 400 in FIG. 4 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 5, a block diagram of a behavior library is depicted in accordance with an illustrative embodiment. Behavior library 500 is an example of one implementation of behavior library 404 in FIG. 4.
As illustrated, behavior library 500 includes various behavioral processes for the mobile robotic device that can be called and executed by a machine controller, such as machine controller 402 in FIG. 4.
Behavior library 500 includes, for example, tele-operation process 502, line following process 504, umbral process 506, and object avoidance process 508.
Tele-operation process 502 allows an operator, such as operator 122 in FIG. 1, to remotely control a mobile robotic device, for example with wireless control device 124 in FIG. 1.
Line following process 504 utilizes a number of images from a number of cameras associated with a sensor system, such as sensor system 406 in FIG. 4.
In an illustrative example, a range of pixels may be defined so the left side of the range is lined up roughly with the line path of the guidance projection, such as guidance projection 344 in FIG. 3.
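In the spirit of this pixel-range alignment, a proportional correction can keep the detected line path at a target column in the image. The gain and geometry in this sketch are purely illustrative assumptions.

```python
def steering_correction(line_column_px, target_column_px, gain=0.005):
    """Return a steering command proportional to how far the detected
    line path has drifted from the target column; a positive output
    steers toward a line sitting right of the target."""
    return gain * (line_column_px - target_column_px)

# Example: line detected at column 400 while the range is aligned at 320.
print(steering_correction(400, 320))  # 0.4 -> steer right
```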
The line path may be found by filtering all colors other than the guidance projection color out of the image and then skeletonizing the remaining pixel group(s). Skeletonizing the remaining pixel group(s) is the process of removing as many pixels as possible without affecting the general shape of the line path. Line following process 504 may further utilize a number of images to identify the length of line segments in the line path and then compare the identified line segment lengths to a reference line segment length. Similarly, the distance between segments in the line path can be measured and compared to a reference segment length. The lengths and gaps can be interpreted using a coding scheme. Coding schemes may include, but are not limited to, ASCII, Morse code, and bar codes. The reference line segment length may be obtained using a segment codings database, such as segment codings 313 or segment codings 341 in FIG. 3.
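A condensed sketch of the filtering, skeletonizing, and segment-measuring steps, using NumPy and scikit-image as an assumed toolchain; the color thresholds stand in for the actual guidance projection color.

```python
import numpy as np
from skimage.measure import label, regionprops
from skimage.morphology import skeletonize

def extract_line_path(rgb_image, lo=(180, 0, 0), hi=(255, 80, 80)):
    """Filter out all colors other than the guidance projection color
    (a placeholder red band here), then skeletonize the remaining
    pixel group(s) to a one-pixel-wide line path."""
    mask = np.all((rgb_image >= lo) & (rgb_image <= hi), axis=-1)
    return skeletonize(mask)

def segment_lengths(skeleton):
    """Label the connected pixel groups on the skeleton and return
    their pixel counts, which can then be compared against reference
    lengths from a segment codings database to recover the encoded
    instruction."""
    labeled = label(skeleton, connectivity=2)
    return sorted(region.area for region in regionprops(labeled))
```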
Umbral process 506 may be used when a mobile robotic device receives behavioral codes which include umbral guides. Object avoidance process 508 may be used in conjunction with one or more of the other behavior processes in behavior library 500 to direct the vehicle movement around a detected object.
Elements of behavior library 500, such as tele-operation process 502, line following process 504, umbral process 506, and object avoidance process 508 may be used independently or in any combination.
The illustration of behavior library 500 in FIG. 5 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 6, a block diagram of a sensor system is depicted in accordance with an illustrative embodiment. Sensor system 600 is an example of one implementation of sensor system 406 in FIG. 4.
As illustrated, sensor system 600 includes, for example, number of cameras 602, number of contact sensors 604, odometry sensor 606, global positioning system 608, mobility sensors 610, ambient light sensor 612, structured light sensor 614, visible light sensor 616, and proximity sensor 618. These different sensors may be used to identify the operating environment around a mobile robotic device, such as robotic service environment 100 in FIG. 1.
Number of cameras 602 may include a standard still-image camera, which may be used alone for color information or with a second camera to generate stereoscopic, or three-dimensional, images. When two or more cameras are used to generate stereoscopic images, the cameras may be set with different exposure settings to provide improved performance over a range of lighting conditions. Number of cameras 602 may also include a video camera that captures and records moving images. Number of cameras 602 may be capable of taking images of the environment near or around a mobile robotic device, such as mobile robotic device 400 in FIG. 4.
The images from number of cameras 602 may be processed using means known in the art to identify a guidance projection or objects in an environment. For example, images from number of cameras 602 may be processed by monitoring process 342 or monitoring process 310 in FIG. 3.
Global positioning system 608 may identify the location of the mobile robotic device with respect to other objects in the environment. Global positioning system 608 may be any type of radio frequency triangulation scheme based on signal strength and/or time of flight. Examples include, without limitation, the Global Positioning System, Glonass, Galileo, and cell phone tower relative signal strength. Position is typically reported as latitude and longitude with an error that depends on factors, such as ionospheric conditions, satellite constellation, and signal attenuation from vegetation. Signals from satellites may be replaced with acoustic or electromagnetic signals from ground sources, which are then used by a mobile robotic device to triangulate position.
Mobility sensors 610 are used to safely and efficiently move a mobile robotic device, such as a mobile robotic device in number of mobile robotic devices 106 in FIG. 1.
Ambient light sensor 612 measures the amount of ambient light in an environment. Structured light sensor 614 reads structured light projections of a guidance projection, through a camera, and interprets the detected structured light. Structured light sensor 614 may be used to detect objects in an environment. Visible light sensor 616 measures the amount of visible light in an environment. The data collected from ambient light sensor 612 and visible light sensor 616 may be used in selecting an image processing algorithm, such as image processing algorithm 350 in FIG. 3.
In an illustrative embodiment, sensor system 600 communicates with machine controller 402 in FIG. 4.
In another illustrative embodiment, sensor system 600 communicates with machine controller 402 in FIG. 4 through communications unit 410 in FIG. 4.
The illustration of sensor system 600 in FIG. 6 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 7, a block diagram of a guidance projection is depicted in accordance with an illustrative embodiment. Guidance projection 700 is an example of one implementation of guidance projection 344 in FIG. 3.
Guidance projection 700 includes projected structured light 702. Projected structured light 702 includes number of behavior codes 704 and line path 705. Number of behavior codes 704 includes, for example, without limitation, at least one of number of bar codes 706, number of line patterns 708, and pulse modulation 710. The frequency, duty cycle, and intensity of projected structured light 702 are selected to be easily visible to a mobile robotic device and eye-safe for people and pets in the area. Frequencies, duty cycles, and intensities of projected structured light that meet both criteria are known in the art.
In an illustrative embodiment, projected structured light 702 is projected as line path 705. A mobile robotic device follows line path 705. The mobile robotic device first locates line path 705 using a number of cameras, such as number of cameras 602 in FIG. 6.
In another illustrative embodiment, which may be implemented independently or in conjunction with the previous illustrative embodiment, projected structured light 702 is projected as a number of behavior codes 704. Number of behavior codes 704 may include umbral guides created by umbral guide process 318 in FIG. 3.
The illustration of guidance projection 700 in FIG. 7 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 8, an illustration of a worksite is depicted in accordance with an illustrative embodiment.
Worksite 800 includes guidance system 802 and contained area 804. Guidance system 802 emits structured light 806 onto contained area 804. Structured light 806 is an example of one implementation of projected structured light 702 in FIG. 7. Structured light 806 forms projected path 808.
In an illustrative embodiment, guidance system 802 projects projected path 808 in its entirety. Projecting a projected path in its entirety means the projected path is projected in one piece by one projector rather than in a number of segments and/or by a number of projectors.
The illustration of worksite 800 in FIG. 8 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 9, an illustration of a worksite is depicted in accordance with an illustrative embodiment.
Worksite 900 includes guidance system 902 and contained area 904. Guidance system 902 emits structured light 906 onto a portion of contained area 904. Structured light 906 is an example of one implementation of projected structured light 702 in FIG. 7.
The illustration of worksite 900 in FIG. 9 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 10, an illustration of a worksite is depicted in accordance with an illustrative embodiment.
Worksite 1000 includes guidance system 1002 and contained area 1004. Guidance system 1002 emits structured light 1006 onto a portion of contained area 1004. Structured light 1006 is an example of one implementation of projected structured light 702 in FIG. 7.
In an illustrative embodiment, a guidance projection, such as guidance projection 344 in FIG. 3, may be projected in a number of consecutive sections onto contained area 1004.
The illustration of worksite 1000 in FIG. 10 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 11, an illustration of a worksite with an umbral area is depicted in accordance with an illustrative embodiment.
Worksite 1100 includes guidance system 1102 and contained area 1104. Guidance system 1102 emits structured light 1106 onto contained area 1104. Structured light 1106 is an example of one implementation of projected structured light 702 in FIG. 7. Structured light 1106 forms projected path 1108 for mobile robotic device 1110. Object 1112 blocks a portion of structured light 1106, creating umbra 1114.
Mobile robotic device 1110 may follow projected path 1108 until it ends at point 1116. Because object 1112 creates umbra 1114 that precludes structured light 1106 from projecting projected path 1108 into umbra 1114, umbral guides 1118 are used to traverse umbra 1114. Mobile robotic device 1110 uses umbral guides 1118 as landmarks for localization and guidance. Mobile robotic device 1110 may then access a number of umbral behaviors located in behavior library 500 in FIG. 5.
In an illustrative embodiment, mobile robotic device 1110 follows projected path 1108 until it ends at point 1116. Object 1112 may be avoided and umbra 1114 may be traversed with a sequence of two umbral behaviors. The first umbral behavior would be to circle the object once. The second umbral behavior would be to lock onto semi-circle 1120 on the lower portion of umbral guides 1118 and follow semi-circle 1120 to make a first turn. The mobile robotic device would then lock onto semi-circle 1122, directly across umbra 1114 on the upper portion of umbral guides 1118, to traverse umbra 1114, and follow semi-circle 1122 to make a second turn. Next, it would lock onto semi-circle 1124, directly across umbra 1114 on the lower portion of umbral guides 1118, to traverse umbra 1114, and finally follow semi-circle 1124 to make a third turn.
The illustration of worksite 1100 in FIG. 11 is not meant to imply physical or architectural limitations to the manner in which different illustrative embodiments may be implemented.
With reference now to FIG. 12, a flowchart of a process for providing a path plan is depicted in accordance with an illustrative embodiment.
The process begins by creating a three-dimensional map of a contained area, using a guidance system such as mobile guidance system 300 in FIG. 3 (step 1202).
The process then generates a guidance projection, using the three-dimensional map (step 1204). The guidance projection includes a path plan, such as path plan 316 in FIG. 3. The process then projects the guidance projection onto the contained area (step 1206), with the process terminating thereafter.
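For illustration, the sketch below rasterizes a planned path (for example, the boustrophedon waypoints shown earlier) into a binary frame a projector could emit. The resolution, scale, and the omission of keystone correction for the projector's viewing angle onto the ground are simplifying assumptions.

```python
import numpy as np

def rasterize_path(waypoints, width_px=800, height_px=600, px_per_m=50):
    """Draw the path plan's straight segments into a projector frame;
    lit pixels (255) form the line path a robot would follow."""
    frame = np.zeros((height_px, width_px), dtype=np.uint8)
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        steps = int(max(abs(x1 - x0), abs(y1 - y0)) * px_per_m) + 1
        for t in np.linspace(0.0, 1.0, steps):
            col = int((x0 + t * (x1 - x0)) * px_per_m)
            row = int((y0 + t * (y1 - y0)) * px_per_m)
            if 0 <= row < height_px and 0 <= col < width_px:
                frame[row, col] = 255
    return frame
```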
With reference now to FIG. 13, a flowchart of a process for creating a three-dimensional map is depicted in accordance with an illustrative embodiment.
The process begins by creating a first three-dimensional map of a contained area (step 1302). A first three-dimensional map may be created prior to an occluding season. An occluding season is a time of year in which nature greatly occludes the surface. Examples of occluding seasons include, without limitation, growing seasons, such as late spring and summer when vegetation is most active, and snow seasons. A growing season is the time of year when climatic conditions are favorable for plant growth. Thus, the best times to create the first three-dimensional map are prior to a growing season, such as very early spring when vegetation is minimal, and very early winter before snow has fallen.
The process then annotates the first three-dimensional map to form a second three-dimensional map (step 1304). Annotations may be made by a human using a user interface, such as user interface 312 in FIG. 3.
Next, the process creates a third three-dimensional map using the second three-dimensional map and current data (step 1306). Current data may include, without limitation, a current three-dimensional map that may include increased vegetation and new objects. In an illustrative embodiment, a structured light generator projects dots or lines of light onto worksite 102 in FIG. 1.
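The standard structured-light triangulation relation suggests how such projected dots can yield three-dimensional data: a dot's apparent shift in the camera image is inversely proportional to range. The sketch below applies z = f·b/d under assumed calibration values; none of these numbers come from the disclosure.

```python
def range_from_dot_shift(shift_px, focal_px=700.0, baseline_m=0.5):
    """Triangulate range to a projected dot: with the projector and
    camera separated by baseline_m, a dot on a nearer surface appears
    shifted by more pixels (z = f * b / d)."""
    if shift_px <= 0:
        raise ValueError("dot shift must be positive to triangulate")
    return focal_px * baseline_m / shift_px

# Example: a 35-pixel shift under the assumed calibration -> 10 m range.
print(range_from_dot_shift(35.0))
```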
With reference now to FIG. 14, a flowchart of a process for generating a guidance projection with umbral guides is depicted in accordance with an illustrative embodiment.
The process begins by creating a three-dimensional map (step 1402). The three-dimensional map may comprise a contained area. The process identifies umbral areas (step 1404). The process creates a number of umbral guides (step 1406) for the umbral areas identified. The process generates a guidance projection using the three-dimensional map and the number of umbral guides (step 1408). The process then projects the guidance projection onto a contained area (step 1410), with the process terminating thereafter.
With reference now to FIG. 15, a flowchart of a process for navigating a path is depicted in accordance with an illustrative embodiment.
The process starts by detecting a guidance projection in a contained area (step 1502). The process identifies a path within the guidance projection (step 1504). A path may be a line path, such as line path 705 in FIG. 7. The process then follows the path (step 1506), with the process terminating thereafter.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Additionally, as used herein, the phrase “at least one of”, when used with a list of items, means that different combinations of one or more of the listed items may be used and only one of each item in the list may be needed. For example, “at least one of item A, item B, and item C” may include, for example, without limitation, item A or item A and item B. This example also may include item A, item B, and item C or item B and item C. In other examples, “at least one of” may be, for example, without limitation, two of item A, one of item B, and ten of item C; four of item B and seven of item C; and other suitable combinations. As used herein, a number of items means one or more items.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
The description of the different advantageous embodiments has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. Further, different embodiments may provide different advantages as compared to other embodiments. The embodiment or embodiments selected are chosen and described in order to best explain the principles of the invention, the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
Claims
1-18. (canceled)
19. An apparatus comprising:
- a computer system configured to generate a path plan; and
- a number of structured light generators configured to be controlled by the computer system and project the path plan.
20. The apparatus of claim 19, wherein the computer system further comprises:
- a user interface;
- a mapping, planning, and execution process configured to generate a path plan;
- a behavioral codes process configured to be accessed by the mapping, planning, and execution process; and
- a monitoring process configured to monitor execution of the path plan by a number of mobile robotic devices.
21. The apparatus of claim 19, wherein each structured light generator in the number of structured light generators further comprises:
- a processor unit having a structured light generator process and a monitoring process;
- a number of cameras configured to take a number of images of a contained area, wherein the number of images are sent to the monitoring process;
- a communications unit; and
- a projector configured to project structured light.
22. The apparatus of claim 20, wherein the mapping, planning, and execution process controls the number of structured light generators to project the path plan to control the number of mobile robotic devices.
23. The apparatus of claim 20, wherein the number of mobile robotic devices follow the path plan projected by the number of structured light generators.
24. The apparatus of claim 19, wherein the path plan is at least one of a line path and a number of behavior codes.
25. The apparatus of claim 19, wherein the path plan is used for at least one of guiding a number of mobile robotic devices, projecting decorative displays on a contained area, and providing instructions for the number of mobile robotic devices to continue guidance when the path plan is no longer visible.
26-29. (canceled)
30. A computer program product comprising:
- a computer recordable media; and
- computer usable program code, stored on the computer recordable media, for controlling a structured light generator to project a path plan for a mobile robotic device to follow.
31. The computer program product of claim 30, further comprising:
- computer usable program code, stored on the computer recordable media, for generating a map for the structured light generator to use in projecting the path plan to control the mobile robotic device as the mobile robotic device follows the path plan.
32. The computer program product of claim 31, wherein the map is a three-dimensional map.
33. The computer program product of claim 31, wherein the step of generating the map further comprises:
- computer usable program code, stored on the computer recordable media, for identifying a number of umbral areas and creating a number of umbral guides for the number of umbral areas, wherein the path plan includes the number of umbral guides.
34. The computer program product of claim 30, wherein the path plan is projected onto a contained area.
35. A computer program product comprising:
- a computer recordable media; and
- computer usable program code, stored on the computer recordable media, for controlling a structured light generator to project a path plan and generating a projected code within the path plan that provides guidance instructions to a mobile robotic device.
36. The computer program product of claim 35, wherein the projected code is generated and projected while the path plan to guide the mobile robotic device is visible.
37. The computer program product of claim 36, wherein the mobile robotic device continues to follow the guidance instructions after the path plan is no longer visible.
Type: Application
Filed: Feb 13, 2012
Publication Date: Jun 7, 2012
Applicant: Deere & Company (Moline, IL)
Inventor: Noel Wayne Anderson (Fargo, ND)
Application Number: 13/372,059
International Classification: G05D 1/00 (20060101); G01C 21/00 (20060101);