Intelligent trajectory adviser system for unmanned aerial vehicles in complex environments
Systems and methods are provided for improving the flight safety of fixed- and rotary-wing unmanned aerial systems (UAS) operating in complex dynamic environments, including urban cityscapes. Sensors and computations are integrated to predict local winds and promote safe operations in dynamic urban regions where GNSS and other network communications may be unavailable. The system can be implemented onboard a UAS and does not require in-flight communication with external networks. Predictions of local winds (speed and direction) are created using inputs from sensors that scan the local environment. These predictions are then used by the UAS guidance, navigation, and control (GNC) system to determine safe trajectories for operations in urban environments.
This application claims the benefit of U.S. Provisional Application No. 62/486,216, filed Apr. 17, 2017, and U.S. Provisional Application No. 62/487,283, filed Apr. 19, 2017, both of which are herein incorporated by reference in their entireties.
ORIGIN OF THE INVENTION

The invention described herein was made by an employee of the United States Government and may be manufactured and used by or for the Government of the United States of America for governmental purposes without the payment of any royalties thereon or therefor.
FIELD OF THE INVENTION

The present embodiments relate to unmanned aerial systems, and more particularly, to operating unmanned aerial systems without in-flight communication with external networks.
BACKGROUND OF THE INVENTION

Unmanned Aerial Systems (UAS) include fixed-wing and rotary-wing unmanned aerial vehicles (UAVs), commonly known as drones, in communication with a controller. Remote-controlled drones rely on an operator on the ground. Truly unmanned systems are controlled autonomously by onboard computers. Unmanned Aerial Systems are useful for purposes such as package delivery, mapping, surveying, and surveillance. However, a number of challenges are faced by current systems operating in complex environments, such as urban areas.
Position calculation typically uses navigation signals, such as signals from a global navigation satellite system (GNSS), WiFi, cellular, or similar networks. However, there are a number of terrains where the signal from GNSS or other networks is weak, shadowed, unreliable, or unavailable, including canyons, mountainous areas, and urban canyons, i.e., the lower altitudes within urban cityscapes between buildings. Unavailable navigation signals hamper the use of UAS in urban environments. Prior approaches to this position-calculation problem include using dead reckoning to estimate position. However, such systems rely on both accurate inertial measurement systems and a known initial state. It would be desirable to have a UAS that overcomes the above challenges without the limitations of previous approaches.
SUMMARY OF THE INVENTION

It is a feature of illustrative embodiments of the present invention to provide a method and a system for an intelligent trajectory adviser system for unmanned aerial vehicles (UAVs) in complex environments. Determining location without the use of GNSS, and allowing a UAS to predict wind without a real-time connection to navigation or wind data from an external source, are desired for creating an accurate trajectory adviser system for a UAS. In view of the foregoing, it will be appreciated that providing such a system would be a significant advancement in the art. Because of the difficulty of receiving clear, reliable GNSS signals while traveling through complex environments, current UAS are incapable of aerial navigation through urban cityscapes while maintaining precise knowledge of their longitude, latitude, and altitude. Current approaches assume that robust connections will exist between the UAS and control stations, and that GNSS and other location services will be uninterrupted. However, as noted above, such communication is often unavailable within urban canyons and in other areas.
Winds hamper autonomous UAS navigation without external location services by disturbing the motion of the UAS, among other challenges. Urban environments are dynamic, and building locations and layouts can accelerate and decelerate winds, while also causing winds to come from non-intuitive directions. Furthermore, real-time data and exact knowledge of building sizes, shapes, and positions are often unavailable for real-time navigation. A system that can predict which routes would involve dangerous winds would greatly add to the safety and operability of a UAS within an urban environment, and could also be used for a variety of other purposes.
In some embodiments of the invention, these two main problems for UAS autonomy can be approached using a machine-learning process together with training data obtained from scans of the area, such as wind field data or location data. These problems can be addressed by a method for providing real-time trajectory information, the method comprising:
- (A) training a neural network with pre-existing landscape data along an intended flight route of the unmanned aerial system;
- (B) uploading the neural network that is trained with the pre-existing landscape data to a computer onboard the unmanned aerial system;
- (C) while being flown along the intended flight route: scanning a peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings; determining actual positions of the unmanned aerial system using the scanned imagery, the neural network, and the pre-existing landscape data, wherein the actual positions are determined without the use of GNSS, cellular data, WiFi, or other network communication external to the unmanned aerial system; and correcting the position of the unmanned aerial system when the actual position is not along the intended flight route.
Embodiments of the present invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements.
Before the present methods and systems for an intelligent trajectory adviser system for unmanned aerial vehicles (UAVs) in complex environments are disclosed and described, it is to be understood that this invention is not limited to the particular configurations, process steps, and materials disclosed herein, and as such, configurations, process steps, and materials may vary somewhat without departing from the scope and spirit of the invention. It is also to be understood that the terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting since the scope of the present invention will be limited only by the appended claims and equivalents thereof. The publications and other reference materials referred to herein to describe the background of the invention and to provide additional detail regarding its practice are hereby incorporated by reference. The references discussed herein are provided solely for their disclosure prior to the filing date of the present application. Nothing herein is to be construed as an admission that the inventors are not entitled to distinguish the presently claimed inventions from such disclosures.
It must be noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a sensor” includes configurations that involve multiple sensors, or multiple types of sensors (such as visual sensors, infrared, LIDAR, radar, sonar, etc.). Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art.
In describing and claiming the present invention, the following terminology will be used in accordance with the definitions set out below.
As used herein, “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional unrecited elements or method steps.
Providing GNSS-Free Navigation
The intelligent trajectory adviser system includes one or more neural networks loaded onto the computer of a UAS. These one or more neural networks are trained to operate and provide needed information to the UAS, such as location and wind field predictions. The information provided by these neural networks can be fed into an additional system that can use the data to calculate a safe and accurate trajectory for the UAS to follow in reaching its destination, even if GNSS or other external communication is unavailable or fails.
GNSS reception in an urban environment is often unavailable or inaccurate in urban canyons due to (1) restricted satellite view, shadowing, or multi-path reflections; and (2) unstable network communication, or limited or restricted bandwidth. Further, urban cityscape geometry is dynamic, imprecise, and complex. To overcome these challenges, it would be advantageous to have a system for determining location of a UAS that avoids the need for GNSS location determination. Embodiments of the present invention have this advantage.
Components of the system include onboard sensors that can scan the periphery of the UAS to produce imagery of the current surroundings; one or more machine learning algorithms, i.e., neural networks, that take the scan information and produce kinematics predictions, such as predictions of the position, velocity, acceleration, and orientation of the UAS; and computing hardware to perform the real-time kinematics calculations onboard the UAS.
At least one neural network on the UAS is dedicated to providing accurate information on location. To create an accurate trajectory adviser, it is important for the UAS to have accurate information on its location in order to reach its final destination. Previously, this information was provided by a GNSS system, or by another network connection with a dead-reckoning component if the GNSS failed. Present embodiments train a neural network (NN) using sensor imagery of the environs and correlated positions captured during previous UAS flights or traverses of synthetic city/urban geometries. Illustrative sensors that may be used according to present embodiments include cameras and LIDAR sensors. In some illustrative embodiments, the sensors are LIDAR sensors that capture a 360-degree view around the UAS with a 30-degree angle above and below the horizontal, but it will be appreciated that a number of sensor configurations can be used so long as the training input for the neural network is adjusted to fit the appropriate sensor configuration.
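By way of a hedged illustration, pairing scan imagery with correlated positions along a traverse of a synthetic city geometry might be sketched as follows. The 2D occupancy grid, the ray-marched LIDAR model, the beam count, and the straight-line route are all simplified stand-ins invented for this sketch; none of these specifics appear in the disclosure.

```python
import numpy as np

# Synthetic 100 m x 100 m city as a 1 m occupancy grid; the two rectangles
# below are placeholder "buildings" (illustrative assumption).
GRID = np.zeros((100, 100), dtype=bool)
GRID[20:40, 20:40] = True
GRID[60:80, 50:90] = True

def lidar_scan(pos, n_beams=36, max_range=60.0, step=0.5):
    """Ray-march n_beams rays from pos; return range to first occupied cell."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_beams, endpoint=False)
    ranges = np.full(n_beams, max_range)
    for i, a in enumerate(angles):
        direction = np.array([np.cos(a), np.sin(a)])
        r = step
        while r < max_range:
            x, y = pos + r * direction
            if 0 <= int(x) < 100 and 0 <= int(y) < 100 and GRID[int(x), int(y)]:
                ranges[i] = r
                break
            r += step
    return ranges

# Traverse a synthetic route and record correlated (scan, position) pairs,
# i.e., training data of the kind described above.
route = [np.array([50.0, 10.0 + 5.0 * k]) for k in range(10)]
pairs = [(lidar_scan(p), p) for p in route]
```

Each pair couples one simulated scan with the position from which it was taken, which is the form of supervision a position network would be trained on.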
Providing Onboard Wind Prediction
In addition to the neural network providing location information, one or more of the neural networks on the UAS may be dedicated to providing accurate information on wind fields. Strong winds can disturb the motion of the UAS and complicate obstacle avoidance and collision avoidance. Therefore, providing wind predictions may be an important part of any algorithm for trajectory decision-making. As accurate wind field data, GNSS, and network connections are often unavailable in a complex urban environment, a method and system are needed to provide estimates of the wind field onboard the UAS without the need to access these data from a network during flight. In some embodiments, an untrained neural network is trained using sensor imagery of the environs and correlated wind fields. In initial stages, the wind fields are provided via a simulation based on the urban geometry and on known wind fields of the location where the UAS is to operate. In later stages, as actual wind field data become more available, such data can be provided by sensors located throughout the city. These sensors could be used to generate more accurate wind predictions, which could then be used to fine-tune and further train the neural network. In some embodiments, the sensors are LIDAR sensors that capture a 360-degree view around the UAS with a 30-degree angle above and below the horizontal, but it will be appreciated that a number of sensor configurations can be used so long as the training input for the neural network is adjusted to fit the appropriate sensor configuration.
Components of some embodiments of the invention include onboard sensors that can scan the periphery of the UAS to produce imagery of the current surroundings, a machine learning module that takes the scan information and produces predictions of the wind field based on prior calculations, and a wind field module to perform the real-time wind field calculations onboard the UAS.
Once the sensor data and wind field data are provided, the neural network 403 is then trained by machine learning module 404. Machine learning module 404 can use a number of effective machine learning frameworks, including but not limited to TensorFlow, Theano, Torch, Caffe, CNTK, or CuDNN. In some embodiments, machine learning module 404 establishes the relationship between the images and the corresponding wind field prediction around the UAS by establishing the proper weights and biases for each LIDAR image and each wind field. The trained neural network 405 is then uploaded onto the UAS before flight.
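As a framework-free sketch of what a machine learning module of this kind might do, the toy example below fits weights and biases that map flattened LIDAR imagery to a two-component wind vector by plain gradient descent. The network size, learning rate, and synthetic data are assumptions made for illustration; a production system would use one of the frameworks named above.

```python
import numpy as np

rng = np.random.default_rng(0)
scans = rng.normal(size=(200, 36))                # synthetic LIDAR images
winds = (scans @ rng.normal(size=(36, 2))) * 0.1  # target (u, v) wind vectors

# One hidden layer of weights and biases, the quantities the module tunes.
W1, b1 = rng.normal(size=(36, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 2)) * 0.1, np.zeros(2)

losses = []
for _ in range(300):                              # plain batch gradient descent
    h = np.tanh(scans @ W1 + b1)                  # hidden activations
    err = (h @ W2 + b2) - winds                   # prediction error
    losses.append(float(np.mean(err ** 2)))
    gW2 = h.T @ err / len(scans)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)            # backprop through tanh
    gW1 = scans.T @ dh / len(scans)
    gb1 = dh.mean(axis=0)
    W2 -= 0.1 * gW2; b2 -= 0.1 * gb2
    W1 -= 0.1 * gW1; b1 -= 0.1 * gb1
```

The declining loss is the "relationship between the images and the corresponding wind field prediction" being captured in the weights and biases; the fitted arrays would then be what is uploaded to the UAS before flight.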
In some embodiments, a single neural network is trained to determine a wind field based on multiple wind directions or multiple wind speeds; that is, the neural network is trained to create different wind predictions based on the speed of the wind at a given location. To obtain these data for more accurate wind field predictions, the UAV could be equipped with communication devices to receive broadcasts of wind speed from a set of established local measurement devices. These data could be used continually to refine and improve the wind field calculations.
It will be appreciated that wind field calculations could also be used in other situations besides the guidance of a UAS. Wind field predictions could be used for emergency responders in cases of airborne hazards such as smoke, ash, or toxic plumes. Wind field predictions based on neural networks could be used for calculations onboard ships, and for precision parachute airdrops, as well as wind predictions at urban airports, and improved wind predictions in mountain valleys, canyons, and indoors.
It will also be appreciated that while certain illustrative embodiments of the invention focus on predicting the wind field, as it is an important part of helping a UAS fly safely, the same approach may be used to make predictions of temperature, pressure, or other atmospheric quantities to improve the safety of the UAS and provide additional information to the program deciding the trajectory of the UAS. Temperature and pressure variations can affect the system, especially where altitude measurements are concerned. In making such estimates, a neural network may be trained to establish the proper weights and biases for location and temperature and/or pressure, presumably given certain starting conditions. The neural network may then be loaded on the UAS and used to make predictions of likely pressure and temperature, which may be used by the trajectory program to augment its calculation of a proper trajectory.
Applications
Multiple shopping and delivery companies have expressed interest in the delivery of packages by UAS aircraft, including Amazon.com, Google, DHL, FedEx, USPS, and UPS. Present embodiments would improve the safety of flight for package delivery, surveillance, and observation tasks, especially in complex environments. The system would be valuable to urban planners, architects, and civil engineers who need to make estimates of urban wind fields for pedestrian comfort and safety, understanding winds and pressure loadings on buildings, and airborne pollution dispersal. The system would also be valuable to urban first responders, such as firefighters, police officers, and public safety officials, in predicting local wind fields with respect to dispersal of ash, sparks, smoke, toxic gas plumes, and the like.
Principles described herein can also be easily extended to provide in-flight prediction of safe landing zones for helicopters and V/STOL operations, both on land and on ships at sea. These principles can also be used for predictions of forward and reverse bullet trajectories.
In an illustrative embodiment of the invention, a method for providing real-time trajectory input to an unmanned aerial system comprises:
- training a first neural network with pre-existing landscape data in a selected flight area with an appropriate location associated with the landscape data;
- training a second neural network with pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area;
- uploading the first trained neural network and the second trained neural network to at least one computer onboard the unmanned aerial system;
- flying the unmanned aerial system;
- scanning the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculating actual positions of the unmanned aerial system using the first trained neural network and the scanned imagery;
- calculating the predicted wind vectors using the second trained neural network and the scanned imagery; and
- providing the predicted wind vectors and actual position and kinematics information to a trajectory calculator.
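The in-flight steps above might be sketched as the following loop. Here `scan_environment`, `position_net`, `wind_net`, and `trajectory_calculator` are hypothetical placeholder functions standing in for the onboard sensor, the two trained neural networks, and the trajectory calculator; their bodies are invented for illustration only.

```python
import numpy as np

def scan_environment(t):
    """Placeholder for the onboard sensor scan at control cycle t."""
    return np.full(36, 30.0 + t)

def position_net(scan):
    """Stand-in for the first trained neural network (position)."""
    return np.array([scan.mean(), 0.0, 50.0])

def wind_net(scan):
    """Stand-in for the second trained neural network (wind vector)."""
    return np.array([2.0, -1.0, 0.0])

def trajectory_calculator(position, wind):
    """Stand-in calculator: step toward the next waypoint, trimmed for wind."""
    return position + np.array([1.0, 0.0, 0.0]) - 0.1 * wind

commands = []
for t in range(5):                      # one pass per control cycle in flight
    scan = scan_environment(t)          # scan the peripheral environment
    pos = position_net(scan)            # actual position, no GNSS required
    wind = wind_net(scan)               # predicted wind vectors
    commands.append(trajectory_calculator(pos, wind))
```

The essential point of the loop is the data flow: scanned imagery feeds both networks, and their outputs feed the trajectory calculator without any external network communication.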
Illustratively, the pre-existing landscape data may comprise a synthetic representation of the landscape, scanned images from a previous flight of the unmanned aerial system, or a combination thereof. Similarly, the pre-existing wind prediction data may comprise computer simulation data, calculations from a previous flight of the unmanned aerial system, or a combination thereof. Further, scanning the peripheral environment around the unmanned aerial system may be performed with a camera or LIDAR onboard the unmanned aerial system. Still further, the trajectory calculator may be a computer program or a neural network.
By way of further illustration, the method may further comprise:
- training at least a third neural network to associate pre-existing landscape data in the selected flight area of the unmanned aerial system with pre-existing wind prediction data for the selected flight area of the unmanned aerial system, wherein the second and at least third neural networks are associated with specific weather conditions; and
- checking a weather forecast before takeoff, allowing the unmanned aerial system to select from the second and at least third neural networks to calculate a wind prediction given the weather forecast.
The unmanned aerial system can also communicate with established wind sensors throughout the flight to update which neural network is being used for wind predictions.
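One possible, purely illustrative realization of this selection step is a lookup keyed by weather condition, chosen from the forecast before takeoff and overridden when established wind sensors report a different condition in flight. The condition names and the stand-in networks below are assumptions, not part of the disclosure.

```python
# Condition-specific wind networks keyed by forecast category; each lambda
# is a stand-in for a separately trained neural network (assumed names).
wind_nets = {
    "calm":   lambda scan: (0.5, 0.0),
    "breezy": lambda scan: (4.0, 1.0),
    "storm":  lambda scan: (12.0, 5.0),
}

def select_wind_net(forecast, sensor_report=None):
    """Pick a network from the forecast; an in-flight report overrides it."""
    condition = sensor_report if sensor_report is not None else forecast
    return wind_nets.get(condition, wind_nets["calm"])

chosen = select_wind_net("breezy")                          # pre-takeoff
updated = select_wind_net("breezy", sensor_report="storm")  # in-flight update
```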
The first neural network can be trained in-flight using past sensor data and location data collected by the unmanned aerial system. The second neural network may be trained using multiple wind directions, and the unmanned aerial system may be equipped with an apparatus for determining its orientation with respect to the oncoming winds. These data may also be inferred by the neural network from pre-flight weather forecasts. Further, the second neural network may be trained using multiple wind speeds, and the trained neural network receives data on the current wind speed to create a prediction of the wind field based on wind speed and urban geometry.
Another illustrative embodiment of the invention comprises a system for providing real-time trajectory input to an unmanned aerial system, the system comprising:

- a trajectory calculator;
- a computer disposed on the unmanned aerial system, the computer comprising a processor and a memory;
- at least one sensor disposed on the unmanned aerial system and configured for scanning the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's surroundings;
- a first neural network trained with pre-existing landscape data in a selected flight area with an appropriate location associated with the landscape data, the first neural network stored in the memory;
- a second neural network trained with pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area, the second neural network stored in the memory;
- computer-readable instructions stored in the memory that, when executed by the processor when the unmanned aerial system is in flight, cause the processor to:
- scan the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculate actual positions of the unmanned aerial system using the first trained neural network and the scanned imagery;
- calculate the predicted wind vectors using the second trained neural network and the scanned imagery; and
- provide the predicted wind vectors and actual position to the trajectory calculator.
Illustratively, the pre-existing landscape data may comprise a synthetic representation of the landscape, scanned images from a previous flight of the unmanned aerial system, or a combination thereof. Similarly, the pre-existing wind prediction data may comprise computer simulation data, calculations from a previous flight of the unmanned aerial system, or a combination thereof. Further, scanning the peripheral environment around the unmanned aerial system may be performed with a camera or LIDAR onboard the unmanned aerial system. Still further, the trajectory calculator may be a computer program or a neural network.
By way of further illustration, this illustrative embodiment may further comprise:
- training at least a third neural network to associate pre-existing landscape data in the selected flight area of the unmanned aerial system with pre-existing wind prediction data for the selected flight area of the unmanned aerial system, wherein the second and at least third neural networks are associated with specific weather conditions; and
- checking a weather forecast before takeoff, allowing the unmanned aerial system to select from the second and at least third neural networks to calculate a wind prediction given the weather forecast.
The unmanned aerial system can also communicate with established wind sensors throughout the flight to update which neural network is being used for wind predictions.
The first neural network can be trained in-flight using past sensor data and location data collected by the unmanned aerial system. The second neural network may be trained using multiple wind directions, and the unmanned aerial system may be equipped with an apparatus for determining its orientation with respect to the oncoming winds. Further, the second neural network may be trained using multiple wind speeds, and the trained neural network receives data on the current wind speed to create a prediction of the wind field based on wind speed and urban geometry.
Still another illustrative embodiment of the invention comprises a method for providing location data in real-time to an unmanned aerial system, the method comprising:
- training a first neural network with pre-existing landscape data in a selected flight area with an appropriate location associated with the landscape data;
- uploading the first trained neural network to at least one computer onboard the unmanned aerial system;
- flying the unmanned aerial system;
- scanning the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculating actual positions of the unmanned aerial system using the first trained neural network and the scanned imagery;
- providing the actual positions to the at least one computer.
This illustrative embodiment may further comprise:

- training a second neural network with pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area;
- uploading the second trained neural network to at least one computer onboard the unmanned aerial system; and
- calculating the predicted wind vectors using the second trained neural network and the scanned imagery.
Yet another illustrative embodiment of the invention comprises a method for providing wind vectors to an unmanned aerial system, the method comprising:
- training a second neural network with pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area;
- uploading the second trained neural network to at least one computer onboard the unmanned aerial system;
- flying the unmanned aerial system;
- scanning the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculating the predicted wind vectors using the second trained neural network and the scanned imagery; and
- providing the predicted wind vectors and actual position to a trajectory calculator.
This illustrative embodiment may further comprise:

- training a first neural network with pre-existing landscape data in a selected flight area with an appropriate location associated with the landscape data;
- uploading the first trained neural network to at least one computer onboard the unmanned aerial system; and
- calculating actual positions of the unmanned aerial system using the first trained neural network and the scanned imagery.
A still further illustrative embodiment of the invention comprises a method of using a neural network for performing real-time wind predictions for unmanned aerial systems operating in dynamic environments, the method comprising:
- training a neural network with pre-existing wind prediction data;
- uploading the trained neural network to a computer onboard an unmanned aerial system;
- flying the unmanned aerial system;
- scanning the peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculating wind vectors using the scanned imagery, wherein the calculated wind vectors are determined without the use of GNSS, cellular data, WiFi, or other external network communication; and
- training the neural network with the calculated wind vectors.
The pre-existing wind prediction data may be obtained from a computer simulation, a previous flight of an unmanned aerial system, or a combination thereof. Scanning the peripheral environment around the unmanned aerial system may be performed with a camera, LIDAR, or both onboard the unmanned aerial system.
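The retraining step of this embodiment, in which wind vectors calculated in flight become fresh training pairs for the onboard network, might look like the following minimal sketch. A single linear layer stands in for the full trained network, and the names, shapes, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(36, 2)) * 0.1       # stand-in for pre-trained wind weights

def fine_tune_step(W, scan, measured_wind, lr=0.01):
    """One gradient step of squared error toward an in-flight wind estimate."""
    err = scan @ W - measured_wind
    return W - lr * np.outer(scan, err)

scan = rng.normal(size=36)               # scanned imagery (flattened)
calculated_wind = np.array([3.0, -1.0])  # wind vectors calculated in flight
for _ in range(200):
    W = fine_tune_step(W, scan, calculated_wind)
```

After the updates, the stand-in network reproduces the in-flight estimate for that scan, which is the sense in which the calculated wind vectors further train the network.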
Still another illustrative embodiment of the invention comprises a method of using a neural network for determining the position of an unmanned aerial system operating in a dynamic environment, the method comprising:
- training a neural network with pre-existing landscape data along an intended flight route of the unmanned aerial system;
- uploading the neural network trained with the pre-existing landscape data to a computer onboard the unmanned aerial system;
- flying the unmanned aerial system along the intended flight route;
- scanning a peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- calculating actual positions of the unmanned aerial system using the scanned imagery and the pre-existing landscape data, wherein the calculated actual positions are determined without the use of GNSS, cellular data, WiFi, or other external network communication; and
- correcting the position of the unmanned aerial system when the calculated actual position is not along the intended flight route.
The pre-existing landscape data may include a synthetic representation of the landscape, scanned images from previous flights of an unmanned aerial system, or a combination thereof. Scanning the peripheral environment around the unmanned aerial system may be performed with a camera, LIDAR, or both onboard the unmanned aerial system.
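The correction step of this embodiment might be sketched as follows, where the intended flight route is simplified to a single straight segment and the 2 m tolerance is an assumed parameter, not a value from the disclosure.

```python
import numpy as np

ROUTE = (np.array([0.0, 0.0]), np.array([100.0, 0.0]))  # intended route segment

def correction(actual, tol=2.0):
    """Vector back onto the route if the deviation exceeds tol, else zero."""
    a, b = ROUTE
    t = np.clip(np.dot(actual - a, b - a) / np.dot(b - a, b - a), 0.0, 1.0)
    nearest = a + t * (b - a)            # closest point on the intended route
    offset = nearest - actual            # correction toward the route
    return offset if np.linalg.norm(offset) > tol else np.zeros(2)

cmd = correction(np.array([40.0, 5.0]))  # calculated position 5 m off route
```

A position within tolerance yields a zero correction, so the UAS corrects only when the calculated actual position is not along the intended flight route.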
In the foregoing Detailed Description, various features of the present disclosure are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the following claims are hereby incorporated into this Detailed Description of the Disclosure by this reference, with each claim standing on its own as a separate embodiment of the present disclosure.
It is to be understood that the above-described arrangements are only illustrative of the application of the principles of the present disclosure. Numerous modifications and alternative arrangements may be devised by those skilled in the art without departing from the spirit and scope of the present disclosure and the appended claims are intended to cover such modifications and arrangements. Thus, while the present disclosure has been shown in the drawings and described above with particularity and detail, it will be apparent to those of ordinary skill in the art that numerous modifications, including, but not limited to, variations in size, materials, shape, form, function and manner of operation, assembly and use may be made without departing from the principles and concepts set forth herein.
Claims
1. An improved method for determining real-time trajectory for an unmanned aerial system, the method comprising:
- training a first neural network, said training including providing a machine learning module with pre-existing landscape data in a selected flight area with location data associated with the landscape data to train the first neural network;
- training a second neural network, said training including providing the machine learning module with the pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area to train the second neural network;
- uploading the first neural network and the second neural network to at least one computer on board the unmanned aerial system;
- while being flown along an intended route, scanning, by a sensor on board the unmanned aerial system, a peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- determining, by the at least one on board computer, actual positions of the unmanned aerial system using the first trained neural network and the imagery;
- determining, by the at least one on board computer, predicted wind vectors using the second trained neural network and the imagery; and
- providing the predicted wind vectors and actual position to a trajectory module to create or modify the trajectory of the unmanned aerial system.
2. The method of claim 1, wherein the pre-existing landscape data include a synthetic representation of the landscape.
3. The method of claim 1, wherein the pre-existing landscape data include scanned images from a previous flight of the unmanned aerial system.
4. The method of claim 1, wherein the pre-existing wind prediction data are from a computer simulation.
5. The method of claim 1, wherein the pre-existing wind prediction data are from calculations from a previous flight of the unmanned aerial system.
6. The method of claim 1, wherein scanning the peripheral environment around the unmanned aerial system is performed with a camera onboard the unmanned aerial system.
7. The method of claim 1, wherein scanning the peripheral environment around the unmanned aerial system is performed with LIDAR onboard the unmanned aerial system.
8. The method of claim 1, wherein the trajectory module is a neural network.
9. The method of claim 1, further comprising:
- training at least a third neural network, said training including providing the machine learning module with the pre-existing landscape data in the selected flight area of the unmanned aerial system and the pre-existing wind prediction data for the selected flight area of the unmanned aerial system, wherein the second and at least third neural networks are associated with specific weather conditions; and
- checking a weather forecast before takeoff, allowing the unmanned aerial system to select from the second and at least third neural networks to calculate a wind prediction given the weather forecast.
10. The method of claim 9, wherein the unmanned aerial system is configured to communicate with established wind sensors throughout the flight to update which neural network is being used for wind predictions.
11. The method of claim 1, wherein the first neural network is trained in-flight using past sensor data and location data collected by the unmanned aerial system.
12. The method of claim 1, wherein the second neural network is trained using multiple wind directions, and the unmanned aerial system is configured for determining its orientation with respect to oncoming winds.
13. The method of claim 1, wherein the second neural network is trained using multiple wind speeds, and the second neural network receives data on current wind speed to create a prediction of wind field based on wind speed and urban geometry.
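The flow of method claims 1 and 9 through 13 can be illustrated with a minimal sketch. All names below are hypothetical placeholders, not the patented implementation: the trained networks and the trajectory module are stubbed as simple callables, and the forecast-keyed dictionary stands in for the pre-takeoff selection among weather-specific wind networks.

```python
# Illustrative sketch of the onboard inference cycle (claims 1, 9, 13).
# The trained networks are represented here by stub callables; a real
# system would load serialized models into onboard memory before flight.

def select_wind_network(forecast, networks):
    """Claim 9: before takeoff, pick the wind network whose training
    conditions best match the weather forecast (keyed by condition label)."""
    return networks.get(forecast, networks["default"])

def onboard_step(imagery, position_net, wind_net, trajectory_module):
    """One guidance cycle: localize from imagery, predict local wind,
    then hand both results to the trajectory module (claim 1)."""
    position = position_net(imagery)      # actual position from scanned imagery
    wind_vector = wind_net(imagery)       # predicted wind vector from imagery
    return trajectory_module(position, wind_vector)

# Stub components standing in for trained models (assumed values).
position_net = lambda img: (sum(img) % 100, sum(img) % 50, 30.0)  # (x, y, alt)
wind_nets = {
    "default": lambda img: (2.0, 0.5, 0.0),
    "gusty":   lambda img: (8.0, 3.0, -1.0),
}
trajectory_module = lambda pos, wind: [pos[i] - wind[i] for i in range(3)]

wind_net = select_wind_network("gusty", wind_nets)
waypoint = onboard_step([3, 7, 11], position_net, wind_net, trajectory_module)
```

Because both networks and the selection table reside in onboard memory, the cycle above runs without any in-flight network communication, consistent with the stated field of the invention.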
14. A system for providing real-time trajectory input to an unmanned aerial system, the system comprising:
- a trajectory module;
- a computer disposed on the unmanned aerial system, the computer comprising a processor and a memory;
- at least one sensor disposed on the unmanned aerial system and configured for scanning a peripheral environment around the unmanned aerial system to produce imagery of the unmanned aerial system's current surroundings;
- a first neural network trained with pre-existing landscape data in a selected flight area with location data associated with the landscape data, the first neural network stored in the memory;
- a second neural network trained with the pre-existing landscape data in the selected flight area with pre-existing wind prediction data for the selected flight area, the second neural network stored in the memory;
- computer-readable instructions stored in the memory that, when executed by the processor when the unmanned aerial system is in flight, cause the processor to: scan the peripheral environment around the unmanned aerial system to produce the imagery of the unmanned aerial system's current surroundings; determine actual positions of the unmanned aerial system using the first trained neural network and the imagery; determine the predicted wind vectors using the second trained neural network and the imagery; and provide the predicted wind vectors and actual position to the trajectory module to create or modify the trajectory of the unmanned aerial system.
15. The system of claim 14, wherein the trajectory module is a neural network.
16. The system of claim 14, wherein scanning the peripheral environment around the unmanned aerial system is performed with a camera onboard the unmanned aerial system.
17. The system of claim 14, wherein scanning the peripheral environment around the unmanned aerial system is performed with LIDAR onboard the unmanned aerial system.
18. The system of claim 14, wherein the pre-existing landscape data include a synthetic representation of the landscape, scanned images from a previous flight of the unmanned aerial system, or a combination thereof.
19. The system of claim 14, wherein the pre-existing wind prediction data are from a computer simulation, a previous flight of the unmanned aerial system, or a combination thereof.
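The composition of the system of claim 14 can be sketched as a single onboard container holding the sensor, both trained networks, and the trajectory module. This is a hypothetical structure for illustration only; the class and field names are assumptions, and the stub components carry invented values.

```python
# Hypothetical container mirroring the system of claim 14: the sensor,
# both trained networks, and the trajectory module all reside onboard,
# so each guidance cycle completes without external communication.
from dataclasses import dataclass
from typing import Callable, Sequence

Vector = Sequence[float]

@dataclass
class OnboardGuidanceSystem:
    sensor: Callable[[], Vector]                   # produces imagery of surroundings
    position_net: Callable[[Vector], Vector]       # first trained network
    wind_net: Callable[[Vector], Vector]           # second trained network
    trajectory_module: Callable[[Vector, Vector], Vector]

    def guidance_cycle(self) -> Vector:
        """Scan, localize, predict wind, and update the trajectory."""
        imagery = self.sensor()
        position = self.position_net(imagery)
        wind = self.wind_net(imagery)
        return self.trajectory_module(position, wind)

# Example wiring with stub components (assumed values for illustration):
system = OnboardGuidanceSystem(
    sensor=lambda: [1.0, 2.0],
    position_net=lambda img: [img[0], img[1], 10.0],
    wind_net=lambda img: [0.5, 0.0, 0.0],
    trajectory_module=lambda p, w: [p[i] + w[i] for i in range(3)],
)
waypoint = system.guidance_cycle()
```

In this sketch the sensor slot could equally be backed by a camera (claim 16) or LIDAR (claim 17), since the downstream networks consume only the produced imagery.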
U.S. Patent Documents:
Patent/Publication No. | Date | First Named Inventor |
8649632 | February 11, 2014 | Neophytou |
9618934 | April 11, 2017 | Deroos |
10217207 | February 26, 2019 | Marra |
10618673 | April 14, 2020 | Chan |
20090037091 | February 5, 2009 | Bolt, Jr. |
20110295569 | December 1, 2011 | Hamke |
20140046510 | February 13, 2014 | Randolph |
20150347872 | December 3, 2015 | Taylor |
20150379408 | December 31, 2015 | Kapoor |
20170031369 | February 2, 2017 | Liu |
20170305546 | October 26, 2017 | Ni |
20180158197 | June 7, 2018 | Dasgupta |
20190101934 | April 4, 2019 | Tuukkanen |
20190147753 | May 16, 2019 | Hendrian |
20190204093 | July 4, 2019 | Cantrell |
20190271563 | September 5, 2019 | Pandit |
20190346269 | November 14, 2019 | Mohr |
20200103552 | April 2, 2020 | Phelan |
20200130830 | April 30, 2020 | Dong |
Other Publications:
- Courbon, J. et al. “Vision-based navigation of unmanned aerial vehicles”; Control Engineering Practice 18 (2010) pp. 789-799 (11 pages).
- Smolyanskiy, N. et al. “Toward Low-Flying Autonomous MAV Trail Navigation using Deep Neural Networks for Environmental Awareness”; arXiv:1705.02550v1; May 2017 (7 pages).
- Riisgaard, S. et al. “SLAM for Dummies: A Tutorial Approach to Simultaneous Localization and Mapping” (127 pages).
- Duggal, V. et al. “Hierarchical Structured Learning for Indoor Autonomous Navigation of Quadcopter”; Proceedings of the Tenth Indian Conference on Computer Vision, Graphics and Image Processing; Dec. 18-22, 2016; Guwahati, India (8 pages).
- Maturana, D. et al. “3D Convolutional Neural Networks for Landing Zone Detection from LiDAR”; 2015 IEEE International Conference on Robotics and Automation (ICRA); May 26-30, 2015; Seattle, Washington; pp. 3471-3478 (8 pages).
- Loquercio, A. et al. “DroNet: Learning to Fly by Driving”; IEEE Robotics and Automation Letters; Jan. 2018 (8 pages).
Type: Grant
Filed: Apr 17, 2018
Date of Patent: Jun 29, 2021
Assignee: United States of America as Represented by the Administrator of NASA (Washington, DC)
Inventors: John Earl Melton (Hollister, CA), Ben Edward Nikaido (Gilroy, CA)
Primary Examiner: Joshua E Rodden
Application Number: 15/955,661
International Classification: B64C 39/02 (20060101); G05D 1/00 (20060101); G05D 1/10 (20060101); G08G 5/00 (20060101); G01S 17/89 (20200101); G06N 3/04 (20060101); G06N 3/08 (20060101); H04N 7/18 (20060101); G06T 7/70 (20170101);