APPARATUS, METHOD AND SOFTWARE FOR ASSISTING HUMAN OPERATOR IN FLYING DRONE USING REMOTE CONTROLLER
Apparatus, method, and software for assisting human operator in flying drone using remote controller. The apparatus includes an internal data communication interface configured to receive data from the remote controller, an augmented reality display configured to display the data, one or more memories including computer program code, and one or more processors to cause the apparatus to: superimpose, on the augmented reality display, a target symbol indicating a position of the drone while the human operator is looking towards the drone; superimpose, on the augmented reality display, an orientation symbol indicating an orientation of the drone while the human operator is looking towards the drone; obtain a geographic location related to the drone; and set a world marker on the obtained geographic location.
Various embodiments relate to an apparatus for assisting a human operator in flying a drone using a remote controller, a method for assisting the human operator in flying the drone using the remote controller, and a computer-readable medium comprising computer program code, which, when executed by one or more processors, causes performance of the method.
BACKGROUND
A (ground-based) human operator flies a drone (or an unmanned aerial vehicle, UAV) using a remote controller (sometimes at least partly assisted by an autopilot).
The human operator has to simultaneously look towards the drone in the air, operate the hand-held remote controller, and occasionally look towards a display of the remote controller. This leads to poor situational awareness, causing potentially hazardous situations.
A legal requirement is that the human operator must maintain visual contact (a line of sight) with the drone in the air. This is quite challenging, as the drone may not be visible due to a long distance, low ambient light, or a physical obstacle, for example.
These problems may be mitigated by another person, a so-called spotter, maintaining the visual contact with the drone (even using binoculars), while the human operator concentrates on operating the remote controller (but may still need to check the display of the remote controller occasionally). Naturally, such a setup requires good communication between the human operator and the spotter. Additionally, the manual labour is doubled, leading to higher operating costs for the drone.
US 2018/0196425 A1, US 2019/0077504 A1, and US 2019/0049949 A1 disclose various aspects related to the use of a head-mounted display in flying the drone.
BRIEF DESCRIPTION
According to an aspect, subject matter of the independent claims is provided. Dependent claims define some embodiments.
One or more examples of implementations are set forth in more detail in the accompanying drawings and the description of embodiments.
Some embodiments will now be described with reference to the accompanying drawings.
The following embodiments are only examples. Although the specification may refer to “an” embodiment in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
Furthermore, the words “comprising” and “including” should be understood as not limiting the described embodiments to consist of only those features that have been mentioned; such embodiments may also contain features/structures that have not been specifically mentioned.
Reference numbers, both in the description of the embodiments and in the claims, serve to illustrate the embodiments with reference to the drawings, without limiting the embodiments to these examples only.
The embodiments and features, if any, disclosed in the following description that do not fall under the scope of the independent claims are to be interpreted as examples useful for understanding various embodiments of the invention.
Let us study the apparatus 100 and the method it performs.
The method starts in 3000 and ends in 3010. Note that the method may run as long as required (from the start-up of the apparatus 100 until it is switched off) by looping back to the operation 3002.
The operations are not strictly in chronological order in the flowchart.
The apparatus 100 comprises an internal data communication interface 108 configured to receive 3002 data related to the flying from the remote controller 150. The data related to the flying may include telemetry data of the drone 160, including, but not limited to: sensor readings such as gyroscope and magnetometer data, angular rate, and velocity; fusion data such as altitude and global position; and aircraft information such as battery, gimbal, and flight status. Note that, depending on the drone environment, some data may also be received by the apparatus 100 directly from the drone 160.
The internal data communication interface 108 may be implemented using a wireless radio transceiver configured to communicate with a wireless transceiver of the remote controller 150. The technologies for the internal data communication interface 108 include, but are not limited to, one or more of the following: a wireless local area network (WLAN) implemented using an IEEE 802.11ac standard or a Wi-Fi protocol suite, a short-range radio network such as Bluetooth or Bluetooth LE (Low Energy), a cellular radio network employing a subscriber identity module (SIM) or an eSIM (embedded SIM), or another standard or proprietary wireless connectivity means. Note that in some use cases, the internal data communication interface 108 may additionally or alternatively utilize a standard or proprietary wired connection, such as an applicable bus, for example. An embodiment utilizes a wired connection according to the USB (Universal Serial Bus) standard.
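As a minimal, hypothetical sketch of what receiving 3002 such data might look like: the field names, the JSON payload format, and the UDP port are illustrative assumptions, not an actual drone or remote controller API.

```python
import json
import socket
from dataclasses import dataclass

@dataclass
class Telemetry:
    """Illustrative subset of the data related to the flying."""
    latitude: float    # degrees, world coordinate system
    longitude: float   # degrees
    altitude: float    # metres
    heading: float     # compass heading in degrees, 0 = North
    battery: float     # remaining charge, 0.0 .. 1.0

def receive_telemetry(port: int = 14550) -> Telemetry:
    """Block until one telemetry datagram arrives from the remote controller 150."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        payload, _addr = sock.recvfrom(4096)
        return Telemetry(**json.loads(payload))
```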
The apparatus 100 also comprises an augmented reality (AR) display 112 configured to display 3004 the data related to the flying to the human operator 120.
In the drawings, the augmented reality display 112 is implemented as a head-mounted display, attached with a headband (or helmet-mounted) and worn as a visor in front of the eyes by the human operator 120. In the drawings, the augmented reality display 112 is a see-through display on which holographic images are displayed. In an alternative embodiment, the augmented reality display 112 may employ cameras to intercept the real-world view and display an augmented view of the real world as a projection.
In an embodiment, the apparatus 100 is implemented using Microsoft® HoloLens® 2 (or a later version) mixed reality smartglasses employing see-through holographic lenses as the augmented reality display 112, offering a complete development environment. The head-mounted apparatus 100 then includes the necessary processors (including a system on a chip, a custom-made holographic processing unit, and a coprocessor) 102, memories 104 and software 106, a depth camera, a video camera, projection lenses, an inertial measurement unit (including an accelerometer, a gyroscope, and a magnetometer), a wireless connectivity unit 108, 110, and a rechargeable battery. Note that some of these parts are not illustrated in the drawings.
However, also other applicable implementations of the augmented reality display 112 may be used, including, but not limited to: eyeglasses, a head-up display, contact lenses with an augmented reality imaging, etc. For the purposes of the present embodiments, the augmented reality display 112 is configured to provide an interactive real-time experience of a real-world flying environment 210 and the drone 160 enhanced by computer-generated perceptual information. The data related to the flying is superimposed (or overlaid) in addition to the natural environment 210 and the drone 160.
The apparatus 100 also comprises one or more memories 104 including computer program code 106, and one or more processors 102 configured to execute the computer program code 106 to cause the apparatus 100 to perform required data processing. The data processing performed by the apparatus 100 may be construed as a method or an algorithm 130.
The term ‘processor’ 102 refers to a device that is capable of processing data. In an embodiment, the processor 102 is implemented as a microprocessor implementing the functions of a central processing unit (CPU) on an integrated circuit. The CPU is a logic machine executing the computer program code 106. The CPU may comprise a set of registers, an arithmetic logic unit (ALU), and a control unit (CU). The control unit is controlled by a sequence of the computer program code 106 transferred to the CPU from the (working) memory 104. The control unit may contain a number of microinstructions for basic operations. The implementation of the microinstructions may vary, depending on the CPU design. The one or more processors 102 may be implemented as cores of a single processor and/or as separate processors.
The term ‘memory’ 104 refers to a device that is capable of storing data at run-time (working memory) or permanently (non-volatile memory). The working memory and the non-volatile memory may be implemented by a random-access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), a flash memory, a solid-state disk (SSD), a programmable read-only memory (PROM), a suitable semiconductor, or any other means of implementing an electrical computer memory.
The computer program code 106 is implemented by software. In an embodiment, the software may be written in a suitable programming language, and the resulting executable code may be stored in the memory 104 and executed by the one or more processors 102.
The computer program code 106 implements the method/algorithm 130. The computer program code 106 may be coded as a computer program (or software) using a programming language, which may be a high-level programming language, such as C, C++, or Rust, for example. The computer program code 106 may be in source code form, object code form, as an executable file, or in some intermediate form, but for use in the one or more processors 102 it is in an executable form as an application 140. There are many ways to structure the computer program code 106: the operations may be divided into modules, sub-routines, methods, classes, objects, applets, macros, etc., depending on the software design methodology and the programming language used. In modern programming environments, there are software libraries, i.e., compilations of ready-made functions, which may be utilized by the computer program code 106 for performing a wide variety of standard operations. In addition, an operating system (such as a general-purpose operating system) may provide the computer program code 106 with system services.
An embodiment provides a computer-readable medium 170 storing the computer program code 106, which, when loaded into the one or more processors 102 and executed, causes the one or more processors 102 to perform the method/algorithm 130.
Let us examine the coordinate systems used throughout: the world coordinate system 502, the augmented reality coordinate system 504 of the apparatus 100, and the drone coordinate system 506. This convention is used in all of the drawings.
Note that in all described embodiments, the human operator 120 is standing on the ground 500, and the drone 160 is flying in the air 210. However, the embodiments are also applicable to other kinds of environments, such as flying the drone 160 in an underground cave, inside a man-made structure (such as a building or a tunnel), or even in use cases where the drone 160 is flying below the human operator 120, i.e., the human operator 120, while looking 204 towards the drone 160, is looking down and not up. In such a use case, the human operator 120 may be standing on a high platform (such as a skyscraper or a mountain), and the drone 160 is flying below (such as above the streets or in a valley). The embodiments may also be applied to flying the drone 160 submerged, i.e., the drone 160 is then an unmanned underwater vehicle (UUV), and the human operator 120 may operate the drone 160 from the land or from a vessel, for example, while the drone 160 is underwater in a river, lake, sea, water-filled mine or tunnel, etc.
The use of the augmented reality display 112 enables the human operator 120 to look 204 towards the drone 160 in the sky 210 during the flying. This improves the situational awareness of the human operator 120 regarding the flying, without the need for a spotter. The human operator 120 maintains visual contact (a line of sight) with the drone 160 in the air 210, but is also simultaneously shown aviation data in the actual, correct world positions, as will be explained.
The target symbol 200 indicates the position of the drone 160 in the air 210, which makes it easier for the human operator 120 to track the drone 160 during the flying. In an embodiment, the target symbol 200 is a reticle, as illustrated. A reticle is commonly used in a telescopic sight of a firearm. The reticle 200 may include a combination of a circle 300 and a partial crosshair 302.
The orientation symbol 202 indicates the orientation of the drone 160 in the air 210, which makes it easier for the human operator 120 to understand the effect of the steering commands given with the remote controller 150 to the drone 160 during the flying. In an embodiment, the orientation symbol 202 is an arrow, as illustrated.
In the augmented reality display 112, the target symbol 200 and the orientation symbol 202 from the digital world blend into the human operator's 120 perception of the real world through the integration of immersive sensations, and are perceived as natural parts of the flying environment 210.
In an embodiment, the orientation symbol 202 is configured to point out a predetermined direction fixed in relation to the orientation of the drone 160 in the air 210. As the human operator 120 is aware of the predetermined direction, it is easy for the human operator 120 to understand how the steering commands given with the remote controller 150 influence the flying.
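A minimal sketch of how the tip of such an arrow could be computed from the drone's compass heading; the local frame convention (x = East, y = up, z = North) and the function name are illustrative assumptions.

```python
import math

def orientation_arrow_tip(drone_pos, heading_deg, length=1.0):
    """Tip of the orientation symbol 202: an arrow starting at the drone
    position and pointing along the predetermined direction -- here the
    drone's nose, i.e. its compass heading (0 deg = North, clockwise)."""
    rad = math.radians(heading_deg)
    x, y, z = drone_pos
    return (x + length * math.sin(rad),  # East component
            y,                           # arrow drawn level with the drone
            z + length * math.cos(rad))  # North component
```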
In an embodiment, the apparatus 100 is caused to perform:
- obtaining a position of the drone 160 on the ground 500 in the world coordinate system 502;
- obtaining a position of the drone 160 on the ground 500 in the augmented reality coordinate system 504 of the apparatus 100;
- locking the position of the drone 160 in the augmented reality coordinate system 504 with the position of the drone 160 in the world coordinate system 502;
- obtaining the heading 400 of the drone 160 on the ground 500; and
- setting the heading 400 as an orientation of a calibration heading symbol in the augmented reality coordinate system 504 of the apparatus 100.
In this way, the augmented reality coordinate system 504, which constantly tracks any movement of the head of the human operator 120, is now firmly based in the world coordinates 502 and also follows the actual compass directions 602. The coupling of the world latitude and longitude (x and z of the world coordinate system 502) and the compass heading information 602 into the augmented reality presentation is thus achieved.
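The following sketch illustrates one way the locking could be computed, under stated assumptions: positions are expressed in a local metric world frame (x = East, y = up, z = North), the drone reports its compass heading 400, and the tail-nose direction of the drone has been observed in augmented reality coordinates via the calibration symbol. All function and parameter names are hypothetical, not the claimed implementation.

```python
import math

def yaw_between(dir_world, dir_ar):
    """Signed yaw (radians, about the vertical y axis) that rotates the
    world-frame horizontal direction onto the AR-frame direction."""
    a_world = math.atan2(dir_world[0], dir_world[2])  # clockwise from North
    a_ar = math.atan2(dir_ar[0], dir_ar[2])
    return a_ar - a_world

def make_world_to_ar(p_world, p_ar, heading_deg, nose_dir_ar):
    """Lock the world coordinate system 502 and the augmented reality
    coordinate system 504 together.
    p_world     -- drone position in the local metric world frame
    p_ar        -- the same physical point in the AR coordinate system
    heading_deg -- compass heading 400 reported by the drone (0 = North)
    nose_dir_ar -- tail-nose direction of the drone observed in AR coordinates
    Returns a function that maps any world-frame point into AR coordinates."""
    h = math.radians(heading_deg)
    nose_dir_world = (math.sin(h), 0.0, math.cos(h))
    psi = yaw_between(nose_dir_world, nose_dir_ar)
    c, s = math.cos(psi), math.sin(psi)

    def world_to_ar(q):
        # Translate so the drone is the origin, rotate by the yaw offset
        # between the frames, then translate into the AR frame.
        dx, dy, dz = q[0] - p_world[0], q[1] - p_world[1], q[2] - p_world[2]
        return (p_ar[0] + c * dx + s * dz,
                p_ar[1] + dy,
                p_ar[2] - s * dx + c * dz)

    return world_to_ar
```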
In a more specific embodiment, the apparatus 100 is caused to perform:
- obtaining (from the remote controller 150 or from the drone 160), the position of the drone 160 on the ground 500 in the world coordinate system 502;
- superimposing, on the augmented reality display 112, a calibration position symbol;
- receiving a first user acknowledgment after the calibration position symbol is placed on the drone 160 (such as on a centre of the drone 160 or on another predetermined point on the drone 160);
- obtaining (from the augmented reality engine 144), the position of the drone 160 on the ground 500 in the augmented reality coordinate system 504 of the apparatus 100;
- locking the position of the drone 160 in the augmented reality coordinate system 504 with the position of the drone 160 in the world coordinate system 502;
- obtaining (from the remote controller 150 or from the drone 160), the heading 400 of the drone 160 on the ground 500;
- superimposing, on the augmented reality display 112, a calibration orientation symbol;
- receiving a second user acknowledgment after the calibration orientation symbol is aligned with the drone 160 (such as with a tail-nose line of the drone 160 or with another predetermined orientation of the drone 160); and
- setting the heading 400 as the orientation of the calibration orientation symbol in the augmented reality coordinate system 504 of the apparatus 100.
At first, the augmented reality system is shown the position of the drone 160 in the world coordinate system 502 and the position of the drone 160 in relation to the augmented reality coordinate system 504. By indicating, with augmented reality indicators, that the centre of the drone 160 is situated in this exact spot within the augmented reality field of view 112, that spot is now known both in the real-world coordinate system 502 and in the augmented reality system coordinates 504. With this combination, a fixed common position with the world latitude and longitude information is obtained. This latitude and longitude comes from the drone 160, as it knows its exact coordinates at this moment (provided by GPS or another global navigation satellite system, or by another positioning technology such as cellular radio-based positioning). An augmented reality pointer stick, or another type of calibration position symbol, may indicate a position in the augmented reality display 112 for the human operator 120. When showing the drone 160 location, this stick, which moves at a fixed distance in front of the human operator 120 and points down, is guided to be on top of the centre of the drone 160. It is held steady to confirm the position, which then locks the coordinate systems 502, 504 together. Alternatively, this may also be done using machine vision: just seeing the drone 160 and deciphering its place in the augmented reality coordinate system 504, then locking the drone 160 latitude, longitude, and even heading into that shape. Showing the position of the drone 160 may be done in many ways, but it needs to be done with confidence in order to lock the world and augmented reality coordinate systems 502, 504 reliably together.
Secondly, as the drone 160 knows where its nose is pointing, i.e., the drone 160 tells its compass heading in degrees, this may be used to finalize the coupling of the coordinate systems 502, 504. The augmented reality system is used to align a displayed line, or another type of calibration orientation symbol, with the tail-nose line of the drone 160, and when this is achieved, the compass orientation of the displayed line in the world coordinate system 502 is known. Thus, the world compass heading of any direction, for example North, may be calculated from it.
As an optional step, at the time when the world position (latitude, longitude) is obtained from the drone 160, an exact altitude (y in the world coordinate system 502) may also be queried from a map system based on the exact world coordinates 502, or from the drone 160 itself, possibly via the remote controller 150. So, we may also calibrate an altitude for this point in space (with a drone-specific offset of the top surface of the drone 160 from the ground 500, if exact precision is needed), and from here on use the map data to accurately determine the terrain altitude of any other world point. To summarize, the latitude, the longitude, possibly the altitude, and the compass heading may be needed for the world locking to be achieved.
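For completeness, a sketch of deriving the local metric world frame assumed above from the latitude/longitude (and the calibrated altitude) reported by the drone; the equirectangular approximation is an assumption that holds well at visual-line-of-sight distances.

```python
import math

EARTH_RADIUS = 6_371_000.0  # metres, spherical approximation

def geodetic_to_local(lat_deg, lon_deg, alt_m, origin):
    """Map latitude/longitude/altitude to the local metric world frame
    (x = East, y = up, z = North) centred on the calibration origin
    (lat0, lon0, alt0) -- e.g. the drone's position on the ground 500."""
    lat0, lon0, alt0 = origin
    x = math.radians(lon_deg - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    z = math.radians(lat_deg - lat0) * EARTH_RADIUS
    return (x, alt_m - alt0, z)
```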
After this coupling, everything else in the whole system is built around the knowledge of where the drone 160 actually is in the world coordinates 502 and what is around it exactly there in the world. Note that the described embodiments related to the coupling may operate as stand-alone embodiments, irrespective of all other embodiments, also those described in relation to the independent claims and other dependent claims.
The data related to the flying is mapped to the world coordinates 502, and is consequently displayed 3004, 3006, 3008 so that its visualization takes advantage of knowing its three-dimensional position expressed in the world coordinate system 502, which is locked to the augmented reality coordinate system 504.
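Continuing the hypothetical sketches above, a single display update could then chain the pieces together; note that the orientation arrow is computed in the world frame first and only then mapped, so it stays correct regardless of the yaw offset between the frames.

```python
def update_symbols(world_to_ar, telemetry, calibration_origin):
    """One display update: map the latest telemetry into AR coordinates and
    return where to draw the target symbol 200 and the orientation symbol 202."""
    world_pos = geodetic_to_local(telemetry.latitude, telemetry.longitude,
                                  telemetry.altitude, calibration_origin)
    world_tip = orientation_arrow_tip(world_pos, telemetry.heading)
    return world_to_ar(world_pos), world_to_ar(world_tip)
```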
In an embodiment, the apparatus 100 is caused to superimpose, on the augmented reality display 112, the data related to the flying while the human operator 120 is looking 204 towards the drone 160 in the air 210 with an augmented line of sight to the drone 160 during a long-distance visibility. This is not shown in any drawing, but basically the drone 160 is then high up in the sky, or near the horizon, for example, and the human operator 120 is guided to look in the right direction with the target symbol 200, whereby the human operator 120 may only see the drone 160 as a tiny object in the distance.
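A sketch of the underlying arithmetic for guiding the gaze: the compass azimuth and the elevation angle from the human operator 120 to the drone 160, again in the illustrative local frame (x = East, y = up, z = North).

```python
import math

def look_direction(operator_pos, drone_pos):
    """Compass azimuth and elevation (both in degrees) from the human
    operator 120 to the drone 160, e.g. for steering the target symbol 200
    when the drone is only a tiny object in the distance."""
    dx = drone_pos[0] - operator_pos[0]
    dy = drone_pos[1] - operator_pos[1]
    dz = drone_pos[2] - operator_pos[2]
    azimuth = math.degrees(math.atan2(dx, dz)) % 360.0
    elevation = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return azimuth, elevation
```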
Let us finally study an embodiment in which two apparatuses are used with the same drone 160.
A first apparatus 100 is used for assisting a first human operator 120 in flying the drone 160 in the air 210 using the remote controller 150.
A first geographic location 2814 of the first human operator 120 in relation to the position of the drone 160 in the air 210 is used to adjust a first viewpoint for rendering the data related to the flying including a first target symbol 200 and a first orientation symbol 202 to be superimposed on a first augmented reality display 112 of the first apparatus 100.
As illustrated, a second apparatus 2800 is used for assisting a second human operator 2802 in observing the flying of the drone 160 in the air 210.
A second geographic location 2804 of the second human operator 2802 in relation to the position of the drone 160 in the air 210 is used to adjust a second viewpoint for rendering the data related to the flying including a second target symbol 2806 and a second orientation symbol 2808 to be superimposed on a second augmented reality display 2810 of the second apparatus 2800.
In this way, the second human operator 2802 may at least observe 2812 the flying of the drone 160 in the air 210. This may be useful just for fun, for educational purposes, for passing a test for a flying license, for surveillance, for tracking a missing person, or even for assisting the first human operator 120, for example. One or both operators 120, 2802 may also be provided with the one or more visual elements based on the data captured in real-time using the one or more sensors 1402 onboard the drone 160 as explained earlier.
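A sketch of the per-observer rendering this implies: both apparatuses share the same world-frame drone state, but each applies the transform from its own calibration, so the symbols appear in the correct place from each viewpoint (names hypothetical).

```python
def render_for_observers(observers, telemetry, origin):
    """observers: mapping of apparatus id -> its own calibrated world_to_ar
    transform (e.g. the first apparatus 100 and the second apparatus 2800).
    Returns the drone's position in each observer's AR coordinates."""
    world_pos = geodetic_to_local(telemetry.latitude, telemetry.longitude,
                                  telemetry.altitude, origin)
    return {apparatus: world_to_ar(world_pos)
            for apparatus, world_to_ar in observers.items()}
```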
In an embodiment, the one or more visual elements based on the data captured in real-time using the one or more sensors 1402 onboard the drone 160 are also superimposed on the second augmented reality display 2810.
For example, if the sensor 1402 is an image sensor as described earlier, the second geographic location 2804 of the second human operator 2802 is used to adjust the second viewpoint for rendering the data related to the flying, including also the one or more video feeds captured in real-time from the one or more video cameras 2900 onboard the drone 160, to be superimposed on the second augmented reality display 2810 of the second apparatus 2800.
Finally, let us study the setting of a world marker.
First, the apparatus 100 is caused to obtain a geographic location related to the drone 160. Next, the apparatus 100 is caused to set a world marker on the obtained geographic location. The world marker may be set by a user interface operation performed by the human operator 120, or the world marker may be set automatically by the apparatus 100 (for example, when a predetermined condition is met), or semi-automatically, with the apparatus 100 suggesting the setting of the world marker in the user interface to the human operator 120 (who may then either confirm or cancel the world marker). In this way, the world marker couples the drone 160 and the geographic location to each other. This coupling may be made in several different ways, as was explained earlier in connection with the coupling of the coordinate systems 502, 504.
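A minimal sketch of the three ways of setting the world marker described above (manual, automatic, and semi-automatic with confirmation); the types and the confirmation callback are illustrative assumptions, not the claimed implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class MarkerSource(Enum):
    MANUAL = auto()     # set by a user interface operation of the operator
    AUTOMATIC = auto()  # set by the apparatus when a condition is met
    SUGGESTED = auto()  # proposed by the apparatus, awaiting confirmation

@dataclass
class WorldMarker:
    latitude: float
    longitude: float
    source: MarkerSource
    label: str = ""     # e.g. "A rucksack of the missing person"

def set_world_marker(lat, lon, source, confirm=None, label=""):
    """Couple the drone 160 and the geographic location with a world marker.
    A SUGGESTED marker is kept only if the human operator 120 confirms it."""
    marker = WorldMarker(lat, lon, source, label)
    if source is MarkerSource.SUGGESTED and not (confirm and confirm(marker)):
        return None  # the human operator cancelled the suggestion
    return marker
```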
However, even when GNSS positioning is not available, such as when flying inside a building, a locking relationship between the augmented reality coordinate system 504 and the drone coordinate system 506 may be formed, and the positioning and orienting of the drone 160 is executed in the augmented reality model of the world. If a known model is available for a physical structure in the flying environment of the drone 160, a location of the whole structure expressed in the real world coordinate system 502 may then be mapped to the augmented reality coordinate system 504. Thus, even without GNSS, the drone 160 location in the real world coordinate system 502 is known. Naturally, if indoor positioning information is available, it may be fused in to further improve the accuracy of the position and orientation of the drone 160 and its mapping in the augmented reality coordinate system 504.
In an embodiment, the geographic location of the drone 160 is determined, and the world marker 3102 is set on the geographic location of the drone 160.
Note that the world marker 3102 may take various graphical shapes, such as a pole, or another symbol modelling a real world object (a park bench, etc.). The human operator 120 may add a comment (“A rucksack of the missing person”) or some predefined text label in connection with the world marker 3102. The drone 160 may capture image data with the onboard sensors 1402 from the area around the world marker 3102 and couple the image data with the world marker 3102, so that the image data may be viewed later by referencing the world marker 3102.
In an embodiment, a map showing the geographic location of the world marker 3102 is superimposed on the augmented reality display 112. The map may also show the geographic location of the human operator 120.
In an embodiment, image data is obtained from the one or more sensors 1402 onboard the drone 160, a user interface selection marking a position shown in the image data is received from the human operator 120, a real world geographic location of the marked position is determined, and the world marker 3202 is set on that real world geographic location.
In an embodiment, image data is obtained from the one or more sensors 1402 onboard the drone 160, and a view into the image data is defined based on the geographic location and the altitude of the drone 160, a measuring direction of the one or more sensors 1402, and a view frustum of the one or more sensors 1402. The view itself is set as the world marker 3312, and a map showing the world marker 3312 as the view extending from the geographic location of the drone 160 is superimposed on the augmented reality display 112.
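A sketch of the geometry behind such a view marker: intersecting the sensor's view axis and its horizontal field-of-view edges with the ground plane gives the footprint to draw on the map. The flat-ground assumption and the simplification of giving the edge rays the same downward pitch as the axis are illustrative.

```python
import math

def view_footprint(drone_pos, yaw_deg, pitch_deg, hfov_deg):
    """Ground (y = 0) intersection points of the left edge, the view axis,
    and the right edge of the sensor's view frustum.
    drone_pos: (x, y, z) in the local world frame; pitch_deg < 0 looks down."""
    if pitch_deg >= 0:
        return None  # the view never reaches the ground
    drop_per_metre = math.tan(math.radians(-pitch_deg))
    reach = drone_pos[1] / drop_per_metre  # horizontal distance to the ground
    points = []
    for offset in (-hfov_deg / 2, 0.0, hfov_deg / 2):
        az = math.radians(yaw_deg + offset)
        points.append((drone_pos[0] + reach * math.sin(az),
                       0.0,
                       drone_pos[2] + reach * math.cos(az)))
    return points
```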
In an embodiment, the apparatus 100 uses the earlier explained external data communication interface 110 to communicate data related to the world marker with outside receivers 116. The apparatus 100 is caused to transmit, using the external data communication interface 110, data related to the world marker 3102, 3202, 3312 to the outside receivers 116, such as emergency personnel, surveillance personnel, operation centre, etc.
Even though the invention has been described with reference to one or more embodiments according to the accompanying drawings, it is clear that the invention is not restricted thereto but can be modified in several ways within the scope of the appended claims. All words and expressions should be interpreted broadly, and they are intended to illustrate, not to restrict, the embodiments. It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways.
Claims
1. An apparatus for assisting a human operator in flying a drone using a remote controller, comprising:
- an internal data communication interface configured to receive data related to the flying from the remote controller;
- an augmented reality display configured to display the data related to the flying to the human operator;
- one or more memories including computer program code; and
- one or more processors configured to execute the computer program code to cause the apparatus to perform at least the following:
- superimposing, on the augmented reality display, a target symbol indicating a position of the drone while the human operator is looking towards the drone;
- superimposing, on the augmented reality display, an orientation symbol indicating an orientation of the drone while the human operator is looking towards the drone;
- obtaining a geographic location related to the drone; and
- setting a world marker on the obtained geographic location.
2. The apparatus of claim 1, wherein
- obtaining the geographic location related to the drone comprises determining a geographic location of the drone, and
- setting the world marker on the obtained geographic location comprises setting the world marker on the geographic location of the drone.
3. The apparatus of claim 1, wherein
- obtaining the geographic location related to the drone comprises obtaining image data from one or more sensors onboard the drone, receiving a user interface selection from the human operator marking a position shown in the image data, and determining a real world geographic location of the position shown in the image data, and
- setting the world marker on the obtained geographic location comprises setting the world marker on the real world geographic location.
4. The apparatus of claim 1, wherein
- obtaining the geographic location related to the drone comprises obtaining image data from one or more sensors onboard the drone, and defining a view into the image data based on a geographic location and an altitude of the drone, a measuring direction of the one or more sensors, and a view frustum of the one or more sensors,
- setting the world marker on the obtained geographic location comprises setting the view as the world marker, and
- superimposing, on the augmented reality display, a map showing the world marker as the view extending from the geographic location of the drone.
5. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, a map showing the geographic location of the world marker.
6. The apparatus of claim 5, wherein the map also shows a geographic location of the human operator.
7. The apparatus of claim 1, wherein the apparatus comprises:
- an external data communication interface configured to transmit data related to the flying of the drone;
- wherein the apparatus is caused to perform:
- transmitting, using the external data communication interface, data related to the world marker to outside receivers.
8. The apparatus of claim 1, wherein the apparatus is caused to perform:
- obtaining a position of the drone on the ground in a world coordinate system;
- obtaining a position of the drone on the ground in an augmented reality coordinate system of the apparatus;
- locking the position of the drone in the augmented reality coordinate system with the position of the drone in the world coordinate system;
- obtaining a heading of the drone on the ground; and
- setting the heading as an orientation of a calibration heading symbol in the augmented reality coordinate system of the apparatus.
9. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, an indirect line of sight guideline extending horizontally to a geographic location of the drone on the ground, from which the indirect line of sight guideline continues to extend vertically to the target symbol in a cruising altitude of the drone in the air while the human operator is looking towards the drone in the air.
10. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, a map showing a geographic location of the human operator, a geographic location of the drone, and a waypoint; and
- superimposing, on the augmented reality display, a vertical waypoint symbol starting from a geographic location of the waypoint on the ground and extending towards a predetermined altitude of the waypoint while the human operator is looking towards the drone in the air.
11. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, one or more visual elements based on data captured in real-time using one or more sensors onboard the drone in the vicinity of the target symbol while the human operator is looking towards the drone in the air; and
- positioning, on the augmented reality display, the one or more visual elements so that a line of sight remains unobstructed while the human operator is looking towards the drone in the air.
12. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, a map in a vertical layout showing a geographic location of the human operator and a geographic location of the drone in the vicinity of the target symbol on the augmented reality display while the human operator is looking towards the drone in the air; or
- superimposing, on the augmented reality display, a map in a horizontal layout showing a geographic location of the human operator and a geographic location of the drone while the human operator is looking towards the ground.
13. The apparatus of claim 1, wherein the apparatus is caused to perform:
- superimposing, on the augmented reality display, the data related to the flying while the human operator is looking towards the drone in the air with a visual line of sight to the drone during a good visibility, or during an impaired visibility with an augmented line of sight to the drone, or during an obstructed visibility with an augmented and simulated line of sight to the drone, or during a long-distance visibility with an augmented line of sight to the drone.
14. A method for assisting a human operator in flying a drone using a remote controller, comprising:
- receiving data related to the flying from the remote controller;
- displaying, on an augmented reality display, the data related to the flying to the human operator;
- superimposing, on the augmented reality display, a target symbol indicating a position of the drone while the human operator is looking towards the drone;
- superimposing, on the augmented reality display, an orientation symbol indicating an orientation of the drone while the human operator is looking towards the drone;
- obtaining a geographic location related to the drone; and
- setting a world marker on the obtained geographic location.
15. A non-transitory computer-readable medium comprising computer program code, which, when executed by one or more processors, causes performance of a method for assisting a human operator in flying a drone using a remote controller, comprising:
- receiving data related to the flying from the remote controller;
- displaying, on an augmented reality display, the data related to the flying to the human operator;
- superimposing, on the augmented reality display, a target symbol indicating a position of the drone while the human operator is looking towards the drone;
- superimposing, on the augmented reality display, an orientation symbol indicating an orientation of the drone while the human operator is looking towards the drone;
- obtaining a geographic location related to the drone; and
- setting a world marker on the obtained geographic location.
Type: Application
Filed: Nov 29, 2021
Publication Date: Aug 25, 2022
Inventors: Hannu LESONEN (Helsinki), Lassi IMMONEN (Helsinki)
Application Number: 17/536,500