UNMANNED AERIAL VEHICLE LANDING SYSTEM
An unmanned aerial vehicle landing system is disclosed. According to one aspect of the present invention, provided is an unmanned aerial vehicle landing system comprising: a step in which a server receives, from a wireless terminal, pixel coordinates of image data of an image sensor provided in an unmanned aerial vehicle; a coordinate transformation step in which the server transforms the pixel coordinates into absolute coordinates of the unmanned aerial vehicle on the basis of status information of the unmanned aerial vehicle; and a step in which the server transmits the absolute coordinates to the unmanned aerial vehicle. According to another aspect, provided is a system comprising: a step in which a user terminal, upon receiving image data captured by an image sensor included in an unmanned aerial vehicle, displays the image data on a display included in the user terminal; a step in which the user terminal transforms pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle; and a step in which the user terminal transmits the absolute coordinates to the unmanned aerial vehicle.
The present invention relates to an unmanned aerial vehicle, and more particularly to an unmanned aerial vehicle landing system.
BACKGROUND OF INVENTION
As the market for drones, which are unmanned aerial vehicles, is currently active, related industries such as delivery drones, pickup services, and provision of emergency supplies have expanded. Several related technologies are under development, but a technical problem arises in landing a drone at a service location due to a GPS position error or an obstacle such as a nearby structure.
In more detail, when location information of a recipient (user) is input, the drone flies autonomously toward the corresponding position as a destination. However, current GPS technology causes an error of 10 to 50 m, and especially when there is an obstacle that obstructs a view of the drone (e.g., in a forest) or when unspecified people, including the recipient, are located at the destination (e.g., a park), it is difficult to specify a landing point for the drone.
In addition, image processing and flight control technology have not been developed to the extent that a drone is capable of landing safely while avoiding crowds, including the user. As a related cited reference, "Human interaction with unmanned aerial vehicles" (U.S. Pat. No. 9,456,620, registered on Oct. 4, 2016) discloses that a person guides an unmanned aerial vehicle through a gesture or the like. However, the gesture needs to be recognized through image processing or the like, and it is impossible to inform a drone of an exact landing point through the gesture.
SUMMARY OF INVENTION
Technical Problem to be Solved
The present invention provides an unmanned aerial vehicle landing system. More particularly, the present invention provides a method of landing an unmanned aerial vehicle more safely by allowing a user to check a landing point in the form of image data and to select a landing place.
Technical Solution
Therefore, the present invention provides an unmanned aerial vehicle landing system for performing a method including receiving pixel coordinates of image data of an image sensor included in an unmanned aerial vehicle from a wireless terminal, by a server, transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the server, and transmitting the absolute coordinates to the unmanned aerial vehicle, by the server.
The method may further include, prior to the receiving the pixel coordinates, determining whether an imaging surface of the image data is kept horizontal to the ground based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, by the server, and transmitting a control command for keeping the imaging surface of the image data horizontal, by the server.
The method may further include, upon receiving a request for imaging from the wireless terminal, transmitting a command for capturing the image data by the image sensor included in the unmanned aerial vehicle, by the server, and receiving the image data from the unmanned aerial vehicle and transmitting the image data to the wireless terminal, by the server.
The method may further include, upon receiving a user command from the wireless terminal, transmitting the user command to the unmanned aerial vehicle, by the server.
The method may further include, when the user command is a landing command, transmitting a command for controlling the unmanned aerial vehicle to land on the absolute coordinates to the unmanned aerial vehicle, by the server.
The method may further include, when the user command is a moving command, transmitting a command for controlling the unmanned aerial vehicle to move to moving coordinates to the unmanned aerial vehicle, by the server.
The server may transform the pixel coordinates into the absolute coordinates using the pixel coordinates, altitude information of the unmanned aerial vehicle, and a viewing angle of the image sensor as parameters in the operation of transforming the pixel coordinates.
The server may correct the image data using a radial distortion constant, a tangential distortion constant, a focal distance, and lens-based image coordinates of a lens included in the image sensor.
The method may further include receiving the information on the state of the unmanned aerial vehicle, including at least one of altitude information, location information, azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, or image data captured by the image sensor of the unmanned aerial vehicle, from the unmanned aerial vehicle, by the server.
The operation of transforming the pixel coordinates may include extracting recognition points from structure data that is captured a plurality of times for respective distances, by the server, matching the recognition points in the image data to specify a structure, by the server, and transforming the pixel coordinates into the absolute coordinates with reference to information on a size of the specified structure from a structure database, by the server.
The method may further include, upon receiving a request for imaging from the wireless terminal, requesting user authentication information from the wireless terminal, by the server, and upon receiving the user authentication information from the wireless terminal, determining whether the wireless terminal is appropriate based on the user authentication information, by the server.
The present invention provides an unmanned aerial vehicle landing system for performing a method including receiving pixel coordinates of image data of an image sensor included in an unmanned aerial vehicle from a wireless terminal, by the unmanned aerial vehicle, and transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the unmanned aerial vehicle.
The method may further include determining whether an imaging surface of the image data is kept horizontal to the ground based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, by the unmanned aerial vehicle, and controlling a posture of the unmanned aerial vehicle to keep the imaging surface of the image data horizontal.
The method may further include, prior to the receiving the pixel coordinates, upon receiving a request for imaging from the wireless terminal, capturing the image data through the image sensor included in the unmanned aerial vehicle, by the unmanned aerial vehicle, and transmitting the image data to the wireless terminal.
The present invention provides an unmanned aerial vehicle landing system for performing a method including, when receiving image data captured by an image sensor included in an unmanned aerial vehicle, displaying the image data on a display included in a user terminal, by the user terminal, transforming pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the user terminal, and transmitting the absolute coordinates to the unmanned aerial vehicle, by the user terminal.
Effect of Invention
As described above, the present invention may provide an unmanned aerial vehicle landing system.
As the present invention allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the present invention to particular modes of practice, and it is to be appreciated that all changes, equivalents, and substitutes that do not depart from the spirit and technical scope of the present invention are encompassed in the present invention. In the description of the present invention, certain detailed explanations of related art are omitted when it is deemed that they may unnecessarily obscure the essence of the present invention.
Terms such as “first” and “second” are used herein merely to describe a variety of constituent elements and the constituent elements are not limited by the terms. The terms are used only for the purpose of distinguishing one constituent element from another constituent element.
The terms used in the present specification are used for explaining a specific exemplary embodiment, not limiting the present invention. Thus, singular expressions in the present specification encompass the plural expressions unless clearly specified otherwise in context. Also, terms such as “include” or “comprise” may be construed to denote a certain characteristic, number, step, operation, constituent element, or a combination thereof, but may not be construed to exclude the existence of or a possibility of addition of one or more other characteristics, numbers, steps, operations, constituent elements, or combinations thereof.
DESCRIPTION OF REFERENCE NUMERALS
10: unmanned aerial vehicle
20: user wireless terminal
30: server
MODE FOR INVENTION
Hereinafter, the present invention will be described in detail by explaining exemplary embodiments of the present invention with reference to the attached drawings. The same reference numerals in the drawings denote like elements, and a repeated explanation thereof will not be given.
Terms such as first, second, etc. used hereinafter are merely identification symbols for distinguishing the same or corresponding constituent elements, and the same or corresponding constituent elements are not limited by terms such as first and second.
In addition, the term “bond” does not mean only direct physical contact between components and may be used as a concept that encompasses the case in which the components are in contact with each other by interposing another component therebetween.
In addition, with respect to a computer program, the suffixes "module" and/or "unit" for components refer to a part of a computer program for making a computer function as a device with a specific function or for realizing the specific function in the computer. For example, module A may be interpreted as a computer program for making the computer function as device A or for realizing function A in the computer. Likewise, an "operation" may be implemented and executed as a computer program in a computer.
The “module” and/or “unit” may constitute the “group”.
An application refers to a set of a series of computer programs created to perform a specific task and is also referred to as an application program. A user may add related functions by installing the application according to an embodiment of the present invention in his or her electronic device.
The electronic device of the user in which the application is installed may include a CPU, RAM, ROM, or a storage device like a computer, a tablet computer, or a smartphone, may have an environment in which an entire system is controlled by a graphical operating system such as Windows, iOS, Android, or Linux, and particularly, may be specialized for a smartphone for sending and receiving phone calls and texts to registered contacts, capturing an image using a camera installed therein, and transmitting the captured image.
In addition, the flowcharts in the drawings attached to the present specification are provided only to explain the invention, and need not represent flowcharts that can be implemented on a computer completely and without bugs.
The terminal mentioned in this specification is a general user terminal and may be a smartphone, a personal computer, a tablet computer, or the like.
A user according to the present invention may be any user who is serviced by an unmanned aerial vehicle, such as a recipient of goods delivered by an unmanned aerial vehicle, a pickup service user who requests an unmanned aerial vehicle service to deliver goods to another place, or a user who requests a drone on-demand videography service.
In general, a server may exchange data with an unmanned aerial vehicle and a user wireless terminal through wireless communication, but the present invention is not limited to wireless communication, and thus data may be exchanged through wired communication. In this specification, a description of technology of data exchange is omitted.
The server may receive information from the unmanned aerial vehicle when the unmanned aerial vehicle approaches in order to provide a service using the unmanned aerial vehicle, such as goods delivery. Here, the information received from the unmanned aerial vehicle may include flight-related information, status information, user information, location information, and the like of the unmanned aerial vehicle.
In more detail, the server may transmit information on the unmanned aerial vehicle, received from the unmanned aerial vehicle, to the user terminal. The server may continuously monitor the situation, including whether the unmanned aerial vehicle is abnormal, recognition of the flight path, the state of the user, and the like, and, when the unmanned aerial vehicle and the user are authenticated, may calculate routing data of the unmanned aerial vehicle required for landing and guidance based on relative location information between the user and the unmanned aerial vehicle, altitude information, azimuth information, or the like. The server may calculate the position of a landing point based on location information, azimuth information, and the like of the user when the unmanned aerial vehicle lands and may transmit the position of the landing point to the unmanned aerial vehicle.
The unmanned aerial vehicle may continuously transmit state information and flight-related information of the aircraft, including a position, a speed, an azimuth, or a battery status, to the server. The unmanned aerial vehicle may transmit the altitude, azimuth, posture, and the like of the unmanned aerial vehicle at the time of arrival at the coordinates of a recipient to the server.
According to an embodiment of the present invention, the server may receive information on the state of the unmanned aerial vehicle through an operation of receiving the information on the state of the unmanned aerial vehicle including at least one of altitude information, location information, azimuth information, acceleration information, and angular velocity information of the unmanned aerial vehicle, or image data captured by an image sensor of the unmanned aerial vehicle, from the unmanned aerial vehicle.
The user wireless terminal may transmit data and commands such as location, angle, azimuth, or altitude information of the wireless terminal. The user wireless terminal may display information and a situation of the unmanned aerial vehicle, which are transmitted from the server.
According to an embodiment of the present invention, the unmanned aerial vehicle landing system may perform a method including an operation of receiving pixel coordinates of image data of an image sensor included in the unmanned aerial vehicle from a wireless terminal, by the server, a coordinate transformation operation of transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on the information on the state of the unmanned aerial vehicle, by the server, and an operation of transmitting the absolute coordinates to the unmanned aerial vehicle, by the server.
In more detail, the unmanned aerial vehicle may move to a service provision point in order to provide a service. The server or the unmanned aerial vehicle may continuously monitor altitude information, location information, azimuth information, and posture information of the unmanned aerial vehicle from a sensor of the unmanned aerial vehicle. In addition, the server or the unmanned aerial vehicle may determine whether the unmanned aerial vehicle arrives at the service provision point based on the received location information.
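As an illustration of this arrival check, the Python sketch below compares the great-circle (haversine) distance between the reported GPS position and the service provision point against an arrival radius. The function name and the 5 m threshold are assumptions for illustration, not part of the disclosure.

```python
import math

def has_arrived(lat: float, lon: float, dest_lat: float, dest_lon: float,
                radius_m: float = 5.0) -> bool:
    """Return True when the UAV's reported GPS position is within an
    arrival radius of the service provision point (haversine distance).
    The 5 m threshold is a hypothetical value."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat), math.radians(dest_lat)
    dphi = math.radians(dest_lat - lat)
    dlmb = math.radians(dest_lon - lon)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```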
In more detail, when the unmanned aerial vehicle arrives at the service provision point in order to provide a service using the unmanned aerial vehicle to a recipient who has a wireless terminal, the server or the unmanned aerial vehicle may photograph a landing point using the image sensor included in the unmanned aerial vehicle. In this case, the image data may be captured while a lens of a photographing device is kept horizontal to a ground surface in the state in which the unmanned aerial vehicle hovers to maintain a constant altitude.
The image sensor may convert an optical image formed on a lens into an electrical signal and may include a charge coupled device (CCD), a metal-oxide semiconductor (MOS), or a complementary metal-oxide semiconductor (CMOS). However, the type of the image sensor is not limited thereto. In addition, the image may be interpreted as a broad concept including not only a digital signal converted from an optical signal but also a result obtained by outputting the digital signal as light visualized through a display device.
Imaging or capturing image data may refer to a series of procedures of converting an optical signal into image data using the image sensor. In more detail, the image data may mean still image data and may refer to data stored by digitizing an image, which is imaged through the image sensor, in units of pixels. The image data may be stored compressed, decompressed, or in vector format and may be expressed as a two-dimensional matrix including only the planar position of a pixel or a three-dimensional matrix further including color information. In addition, the image data may be stored in the form of ani, bmp, cal, fax, gif, hdp, img, jpe, jpeg, jpg, mac, pbm, pcd, pct, pcx, pgm, png, ppm, psd, ras, tga, tif, tiff, or wmf, which are image file formats.
According to an embodiment of the present invention, the server may control the unmanned aerial vehicle to keep the image horizontal based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle through an operation of determining whether the imaging surface of the image data is kept horizontal, and an operation of transmitting, by the server, a control command to the unmanned aerial vehicle in order to keep the imaging surface of the image data horizontal.
The unmanned aerial vehicle may directly control its posture based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle through the operation of determining whether the imaging surface of the image data is kept horizontal to the ground, and an operation of controlling the unmanned aerial vehicle to keep the imaging surface of the image data horizontal.
The unmanned aerial vehicle may include a sensor such as a magnetometer, a 3-axis accelerometer, or a 3-axis gyroscope in order to control the posture of the unmanned aerial vehicle. The server or the unmanned aerial vehicle may calculate the posture (Roll and Pitch) of the unmanned aerial vehicle based on the acceleration measured by the accelerometer and the angular velocity measured by the gyroscope.
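A minimal sketch of this posture calculation follows, assuming a quasi-static hover so that the accelerometer reading is dominated by gravity; the axis convention (x forward, y right, z down) and the horizontality tolerance are assumptions, and in practice the gyroscope data would be fused in (e.g., with a complementary filter) rather than ignored.

```python
import math

def roll_pitch_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate roll and pitch (radians) from 3-axis accelerometer readings,
    valid while the vehicle hovers and gravity dominates the measurement."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def imaging_surface_horizontal(ax: float, ay: float, az: float,
                               tol_rad: float = 0.02) -> bool:
    # Hypothetical tolerance: treat the imaging surface as horizontal when
    # roll and pitch are both within about 1.1 degrees of level.
    roll, pitch = roll_pitch_from_accel(ax, ay, az)
    return abs(roll) < tol_rad and abs(pitch) < tol_rad
```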
The imaging surface may refer to the area of a film or a device that receives light from a light collector of a camera, such as a CCD or CMOS sensor of a digital camera, and may mean the part of the image sensor on which an image of a subject is formed. When the imaging surface of the image data is horizontal to the ground surface, the distance between the lens and the imaged part of the ground does not change when any point of the image data is moved symmetrically up, down, left, or right. In a general photographing device, the imaging surface of the image sensor is parallel to the lens, and thus keeping the lens horizontal to the ground and keeping the imaging surface horizontal to the ground have the same meaning. In this case, the ground may refer to an imaginary plane formed when the ground surface is assumed to be a plane.
The server or the unmanned aerial vehicle may determine, using posture information and image data of the unmanned aerial vehicle, whether the image data is captured in the state in which the lens of the image sensor is kept horizontal to the landing point. The server or the unmanned aerial vehicle may be controlled to capture image data while staying horizontal. By keeping the lens of the image sensor horizontal to the landing point, the captured image data is horizontal to the ground surface, and thus the calculation procedure of transforming pixel coordinates of the image data into absolute coordinates may be simplified.
The request for imaging may be transmitted to the unmanned aerial vehicle directly from the user wireless terminal or transmitted to the unmanned aerial vehicle through the server, and the unmanned aerial vehicle may capture image data through a series of procedures of capturing image data. The captured image data may be transmitted to the user wireless terminal directly from the unmanned aerial vehicle or may be transmitted to the user wireless terminal through the server.
When the user wireless terminal transmits the request directly to the unmanned aerial vehicle, or when the server receives the request for imaging from the user wireless terminal, the user may request image data from the unmanned aerial vehicle through an operation of sending a command causing the image sensor included in the unmanned aerial vehicle to capture the image data, and an operation of receiving the image data from the unmanned aerial vehicle and transmitting the image data to the wireless terminal, by the server.
In this case, before the request for imaging of the wireless terminal is processed, a user authentication procedure may be performed. For example, the authentication procedure may include an operation of requesting user authentication information from the wireless terminal when the server receives the request for imaging from the wireless terminal, and an operation of determining whether the wireless terminal is appropriate based on the user authentication information when the server receives the user authentication information from the wireless terminal. In this case, communication among the wireless terminal, the server, and the unmanned aerial vehicle may be encrypted, thereby improving security of communication.
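The disclosure does not fix an authentication scheme; as one hedged illustration, the server could validate an HMAC token presented by the wireless terminal. The token format, secret handling, and function name below are assumptions.

```python
import hashlib
import hmac

def is_terminal_appropriate(user_id: str, token: str, shared_secret: bytes) -> bool:
    """Sketch of the server-side appropriateness check: recompute the HMAC
    the terminal should hold for this user and compare in constant time."""
    expected = hmac.new(shared_secret, user_id.encode("utf-8"), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```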
The captured image data may be transmitted to the user terminal through the server and displayed on a display included in the user terminal. The image data displayed on the display may be configured to allow the user to select an arbitrary point and, at the same time, may include a specific point for allowing the user to select a command to be issued through an operation such as clicking.
In this case, a selection point may refer to a specific point on an actual ground surface corresponding to the point selected by the user in the image data captured when the lens of the image sensor is horizontal to the ground surface. In this case, coordinates of the selection point based on the origin on the ground surface perpendicular to the unmanned aerial vehicle in space may be referred to as absolute coordinates.
In more detail, for example, when the user terminal displays the image data of the landing point on the display and the user touches and selects an arbitrary point of the image data, the wireless terminal may obtain the pixel coordinate value of the corresponding selection point. The wireless terminal may query the user for a command through various methods (e.g., a text or a voice message) before or after the user selects an arbitrary point.
The user may select a desired user command through manipulation such as clicking, and the selected user command and the pixel coordinates of the selection point may be transmitted directly to the unmanned aerial vehicle or transmitted to the unmanned aerial vehicle through the server. The user command may be a command that is transmitted from the user terminal, either directly to the unmanned aerial vehicle or through the server, to control an operation such as movement or landing of the unmanned aerial vehicle. The user command and the movement of the unmanned aerial vehicle based thereon will be described below in detail.
In this case, in the coordinate transformation operation, the server may transform the pixel coordinates into the absolute coordinates using the pixel coordinates, altitude information of the unmanned aerial vehicle, and a viewing angle of the image sensor as parameters.
Image data may be captured in the state in which the lens is kept horizontal to the ground surface, and pixel coordinates may be transformed into absolute coordinates using the viewing angle of the lens and the altitude of the unmanned aerial vehicle as parameters. A detailed transformation formula between the pixel coordinates and the absolute coordinates is as follows:

$X_T = \dfrac{X_{PT}}{D_{XP}} \cdot H \tan\theta$

($X_T$: absolute coordinates, $X_{PT}$: pixel coordinates, $H$: altitude of unmanned aerial vehicle, $\theta$: viewing angle of photographing device, and $D_{XP}$: distance from the origin of the image to the end point of the image on the x axis (in units of pixels))
When image data is captured in the state in which the lens is kept horizontal to the ground surface, several proportional relationships hold between pixel coordinates and absolute coordinates. First, a predetermined ratio between a length on the ground surface and the number of pixels in the image data is maintained. The viewing angle is a value unique to the type of lens and may refer to the angle between the center of the lens and the edge of the range within which it is possible to capture an image. The maximum region in which the lens is capable of capturing an image through the image sensor may be calculated using the viewing angle, and its actual length may be calculated from the altitude and the tangent of the viewing angle. A predetermined ratio between the two ends of the image data and this maximum region is likewise maintained, and the procedure of transforming a pixel coordinate value into absolute coordinates using these proportional relationships is summarized by the above equation. Accordingly, the absolute coordinates may be calculated using the altitude and viewing angle of the unmanned aerial vehicle and the pixel coordinates as parameters. In this case, the calculation procedure may be performed by any of the unmanned aerial vehicle, the server, and the wireless terminal.
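A minimal Python sketch of this transformation follows, assuming the legend above: pixel coordinates are measured from the image center, θ is taken as the half viewing angle from the lens center per the definition above, and the vertical axis is handled symmetrically with its own half angle. All names and the worked numbers are illustrative.

```python
import math

def pixel_to_absolute(x_pt: float, y_pt: float, h: float,
                      theta_x: float, theta_y: float,
                      width_px: int, height_px: int) -> tuple[float, float]:
    """Apply X_T = (X_PT / D_XP) * H * tan(theta) on each axis.

    x_pt, y_pt: pixel coordinates of the selection point, origin at the
    image center; h: UAV altitude in metres; theta_x, theta_y: half viewing
    angles (radians); width_px, height_px: image size in pixels."""
    d_xp = width_px / 2.0   # pixels from the image origin to the x-axis edge
    d_yp = height_px / 2.0  # same on the y axis
    x_t = (x_pt / d_xp) * h * math.tan(theta_x)
    y_t = (y_pt / d_yp) * h * math.tan(theta_y)
    return x_t, y_t

# Example: a touch 300 px right of centre in a 1920x1080 image at 30 m
# altitude with a 35-degree half viewing angle lands about 6.6 m away.
print(pixel_to_absolute(300, 0, 30.0, math.radians(35), math.radians(21), 1920, 1080))
```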
According to an embodiment of the present invention, the altitude may be measured by sensors, such as an acceleration sensor, an altimeter, or an odometer, included in the unmanned aerial vehicle or a combination thereof.
In general, the altitude of the unmanned aerial vehicle may be measured by the acceleration sensor. However, altitude is obtained from the acceleration sensor by integrating acceleration, and accumulated integration errors make it difficult to measure altitude above the surface accurately.
An odometer that measures the distance between the unmanned aerial vehicle and the ground using ultrasonic, laser, or LiDAR-based sensors may be used. In addition, a satellite altimeter that derives an altitude value from GPS satellite signals may be used.
More accurate altitude values may be calculated by combining measured values from a plurality of sensors. For example, the altitude of the unmanned aerial vehicle may be more accurately measured by combining altitude data measured by the acceleration sensor, the odometer, the altimeter, and the image sensor using an extended Kalman filter.
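A full extended Kalman filter is beyond a short example, but the scalar sketch below shows the fusion idea under simplifying assumptions (a single altitude state, hypothetical noise values): dead-reckoned altitude from integrated acceleration is corrected by a direct range measurement from an odometer or altimeter.

```python
class AltitudeFilter1D:
    """Scalar Kalman-style filter: predict altitude by integrating vertical
    acceleration, then correct with an odometer/altimeter reading. A stand-in
    for the extended Kalman filter mentioned above; q and r are hypothetical
    process and measurement noise variances."""

    def __init__(self, h0: float = 0.0, q: float = 0.05, r: float = 0.3):
        self.h = h0      # altitude estimate (m)
        self.v = 0.0     # vertical velocity estimate (m/s)
        self.p = 1.0     # estimate variance
        self.q, self.r = q, r

    def predict(self, accel_z: float, dt: float) -> None:
        # Dead reckoning: integration accumulates error, so variance grows.
        self.v += accel_z * dt
        self.h += self.v * dt
        self.p += self.q

    def update(self, measured_h: float) -> float:
        # A direct range measurement pulls the estimate back; variance shrinks.
        k = self.p / (self.p + self.r)
        self.h += k * (measured_h - self.h)
        self.p *= 1.0 - k
        return self.h
```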
According to an embodiment of the present invention, a procedure of transforming pixel coordinates into absolute coordinates may be performed by analyzing information on various structures included in the image data. In more detail, the procedure may include an operation of extracting recognition points from structure data that is captured a plurality of times for respective distances by a server, an operation of matching the recognition points in the image data to specify a structure by the server, and an operation of calculating altitude information with reference to information on the size of the specified structure in a database of the structure by the server.
In more detail, structure data may be stored in the structure database included in the server or the unmanned aerial vehicle. The structure database may store the entire information on each structure, and the information on a structure may include information on the material characteristics, geometry, size, length, and image data of the structure.
The structure data may correspond to images captured by photographing the structure while the distance is changed, with the lens kept horizontal to the ground surface, and the server may extract recognition points based on the structure data. In this case, a recognition point may refer to a characteristic part used to recognize a specific structure when the specific structure is compared with another structure. Each recognition point may be determined through an algorithm for extracting recognition points.
For example, a SIFT algorithm, a SURF algorithm, or the like may be used as the algorithm for extracting recognition points. The SIFT algorithm and the SURF algorithm include an operation of searching for recognition points and an operation of comparing and matching the recognition points. In this case, a recognition point may be a point whose size and direction are not expected to change, and a Gaussian filter and difference of Gaussians (DoG) may be used. However, the method of extracting recognition points is not limited to the described algorithms.
The server may search for the recognition point to recognize the structure by combining the structure data and may refer to the information on the size of the structure from the structure database. In this case, a ratio between a pixel number and an actual size may be calculated using information on the size of the referenced structure and the number of pixels of the recognized structure, and pixel coordinates may be transformed into absolute coordinates using the calculated ratio.
In this case, the structure data may also be captured while staying horizontal like the image data, and thus even if the structure data is captured while a distance is changed, a predetermined ratio between recognition points may be maintained. Accordingly, pixel distances between the recognition points may be compared, and an actual distance between the recognition points may be calculated with reference to the geometry and the information on the size of the structure, stored in the structure database. The pixel coordinates may be transformed into the absolute coordinates using a ratio between the actual distance and the pixel distance of the recognition points.
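A hedged sketch of this matching and ratio step using OpenCV's SIFT implementation follows (`cv2.SIFT_create` requires OpenCV 4.4+). The function name, the 0.75 ratio-test threshold, and the way the known physical span is supplied from the structure database are assumptions for illustration.

```python
import math
import cv2

def metres_per_pixel(structure_img, scene_img, real_span_m: float) -> float:
    """Match SIFT recognition points between stored structure data and the
    captured image, then estimate a metres-per-pixel ratio from the two most
    widely separated matched points, whose physical distance real_span_m is
    assumed to come from the structure database."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(structure_img, None)
    kp2, des2 = sift.detectAndCompute(scene_img, None)
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    # Lowe's ratio test discards ambiguous matches.
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    pts = [kp2[m.trainIdx].pt for m in good]
    # Pixel span between the two farthest matched recognition points.
    span_px = max(math.dist(p, q) for p in pts for q in pts)
    return real_span_m / span_px
```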
Consider, as an example, a passenger car shown in the image data.
The server may compare and analyze the image data received from the unmanned aerial vehicle with the recognition point stored in the structure database and may determine whether a specific passenger car is contained in the image data. Pixel coordinates may be transformed into absolute coordinates using a ratio between a pixel distance and an actual distance, information on the size of the structure, and a geometric shape, which are stored in the structure database, as a parameter.
The procedure of transforming the pixel coordinates into the absolute coordinates using the structure may be performed by the wireless terminal or the unmanned aerial vehicle itself as well as through the server.
In this case, an error may occur due to distortion of the lens when the pixel coordinates are transformed into the absolute coordinates using the image data. Distortion of the lens may include radial distortion due to the refractive index of a convex lens and tangential distortion that occurs because the camera lens and the image sensor are not parallel to each other or the lens itself is not centered during manufacturing of the imaging device.
When there is no distortion in a lens system, one point on a three-dimensional space may be projected to one point on a normalized image plane to have linearity through pinhole projection. However, distortion may occur due to the nonlinearity of the lens system, and image data may be corrected by applying a distortion model of the lens system and an internal parameter of an imaging device.
The distortion model of the lens system is as follows:

$x_d = x_n\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x_n y_n + p_2\,(r^2 + 2 x_n^2)$

$y_d = y_n\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1\,(r^2 + 2 y_n^2) + 2 p_2 x_n y_n$, where $r^2 = x_n^2 + y_n^2$

($k_1$, $k_2$, $k_3$: radial distortion constants; $p_1$, $p_2$: tangential distortion constants; $(x_n, y_n)$: normalized lens-based image coordinates; $(x_d, y_d)$: distorted image coordinates)
Distortion of the image data may be corrected using a radial distortion constant, a tangential distortion constant, a focal distance, and lens-based image coordinates of the lens included in the image sensor as parameters based on the distortion model of the lens system. The correction may be performed by any device, such as the server, the user terminal, or the unmanned aerial vehicle.
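Since OpenCV implements exactly this radial/tangential model, the correction can be sketched as below; the calibration numbers and file names are hypothetical and would normally come from a camera calibration (e.g., `cv2.calibrateCamera` with a checkerboard).

```python
import numpy as np
import cv2

# Hypothetical intrinsics for the drone's camera.
fx = fy = 1000.0           # focal distance in pixels
cx, cy = 640.0, 360.0      # principal point (image centre)
camera_matrix = np.array([[fx, 0.0, cx],
                          [0.0, fy, cy],
                          [0.0, 0.0, 1.0]])

# OpenCV's coefficient order is (k1, k2, p1, p2, k3): the radial and
# tangential distortion constants from the model above (values invented).
dist_coeffs = np.array([-0.12, 0.05, 0.001, -0.0005, 0.0])

img = cv2.imread("landing_point.jpg")  # hypothetical captured image
undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("landing_point_undistorted.jpg", undistorted)
```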
Upon receiving the user command from the wireless terminal, the server may relay the command of the wireless terminal through the operation of transmitting the user command to the unmanned aerial vehicle, by the server. In this case, when the user command is the landing command, the server may transmit a command for controlling the unmanned aerial vehicle to land on the absolute coordinates to the unmanned aerial vehicle, and when the user command is the moving command, the server may transmit a command for controlling the unmanned aerial vehicle to move to the moving coordinates to the unmanned aerial vehicle.
The server may receive information on the unmanned aerial vehicle from the unmanned aerial vehicle. When the unmanned aerial vehicle arrives at a service provision point, the user terminal may be notified of arrival of the unmanned aerial vehicle using various methods (e.g., a message or voice notification). In this case, the user may select the user command by manipulating the user terminal.
Here, the user command may refer to a series of commands for directly manipulating the unmanned aerial vehicle by the user terminal or manipulating the unmanned aerial vehicle through the server by the user terminal. According to the present invention, the user command may refer to a moving command or a landing command.
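As a minimal sketch of this relay, a user command could be dispatched as below. The command labels and the UAV control methods `land_at`/`move_to` are hypothetical, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class UserCommand:
    kind: str     # "land" or "move" -- labels are assumptions
    x_t: float    # absolute coordinates produced by the transform above
    y_t: float

def relay_user_command(cmd: UserCommand, uav) -> None:
    """Server-side relay: forward a landing or moving command to the UAV."""
    if cmd.kind == "land":
        uav.land_at(cmd.x_t, cmd.y_t)   # land on the selection point
    elif cmd.kind == "move":
        uav.move_to(cmd.x_t, cmd.y_t)   # move while maintaining altitude
    else:
        raise ValueError(f"unknown user command: {cmd.kind}")
```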
Upon receiving notification of arrival from the unmanned aerial vehicle, the user terminal may display a button or the like for selecting the user command through manipulation such as clicking or dragging by the user.
In this case, the unmanned aerial vehicle may transmit image data captured by an image sensor of the unmanned aerial vehicle together with the notification of arrival.
When the user selects the landing command through the user terminal, the server may receive the landing command and the pixel coordinates. The absolute coordinates of the selection point may be calculated through the aforementioned procedure of transforming coordinates. In this case, the landing command may be a command for landing the unmanned aerial vehicle on the selection point on the ground surface, selected through the procedure of transforming coordinates.
The user may select the moving command. When the moving command is selected, the server may transmit a command for moving the unmanned aerial vehicle to the moving coordinates to the unmanned aerial vehicle. Upon receiving the moving command, the unmanned aerial vehicle may move to the moving point at a predetermined altitude. After moving to the moving point, the unmanned aerial vehicle may notify the wireless terminal of completion of the movement.
When there is a moving command for the unmanned aerial vehicle, the server may control the unmanned aerial vehicle to move to the moving point perpendicular to the selection point while maintaining its altitude. After the unmanned aerial vehicle completes the move to the moving point, image data may again be captured using the image sensor while the imaging surface is kept horizontal.
The captured image data may be transmitted to the user wireless terminal through the server, the user may specify a selection point through manipulation, and the point on which the aerial vehicle is supposed to land may be selected by repeatedly performing this series of user command selection procedures of selecting the moving command or the landing command of the aerial vehicle.
In this case, the landing command, the moving command, and the image data may be transmitted directly to the user terminal or the unmanned aerial vehicle without going through the server, and the procedure of transforming pixel coordinates of image data into absolute coordinates of the unmanned aerial vehicle may be performed by any one of the three devices: the server, the unmanned aerial vehicle, and the wireless terminal.
According to an embodiment of the present invention, upon receiving image data captured by an image sensor included in the unmanned aerial vehicle, the user terminal may process the data through an operation of displaying the image data on a display included in the user terminal, a coordinate transformation operation of transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on the state of the unmanned aerial vehicle, by the user terminal, and an operation of transmitting the absolute coordinates to the unmanned aerial vehicle, by the user terminal.
The aforementioned method and processing, for example, commands for execution by a processor, a controller, or other processing devices, may be encoded, or may also be stored in machine-readable or computer-readable media, such as a compact disc read-only memory (CDROM), a magnetic disc, an optical disc, a flash memory, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or other machine-readable media.
Such a medium may be implemented as any device that contains, stores, communicates, propagates, or moves executable commands to be used by a command executable system, apparatus, or device or to be used in connection therewith. Alternatively or additionally, the medium may be implemented in analog or digital logic using hardware such as one or more integrated circuits, or one or more processor-executable commands, in software of available functions in an application programming interface (API), a dynamic link library (DLL), or a memory defined or shared as local or remote process call, or in a combination of hardware and software.
According to other embodiments, the method may be represented as a signal or a radio-signal medium. For example, commands for implementing logic of an arbitrary predetermined program may be configured in the form of electrical, magnetic, optical, electromagnetic, infrared, or other types of signals. The aforementioned system may receive these signals at a communication interface such as a fiber optic interface, an antenna, or other analog or digital signal interfaces, may restore commands from the signal, may store the commands in a machine-readable memory, and/or may execute the commands using a processor.
In addition, the present invention may be implemented in hardware or software. The present invention may also be implemented as computer-readable code on a computer-readable recording medium. The computer-readable recording medium may include all types of recording devices that store data read by a computer system. Examples of computer-readable recording media may include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, optical data storage devices, etc. and may also be implemented in the form of carrier waves (for example, transmission through the Internet). In addition, the computer-readable recording medium may be distributed over a computer system connected through a network and computer-readable codes may be stored and executed in a distributed manner. In addition, functional programs, codes, and code segments for implementing the present invention may be easily inferred by programmers in the art to which the present invention pertains.
Embodiments of the present invention may include a carrier wave having electronically readable control signals, which is operated by a programmable computer system in which one of the methods described herein is executed. Embodiments of the present invention may be implemented as a computer program compiled from program code, and the program code may be operated to execute one of methods used when the computer program is driven on a computer. The program code may be stored on, for example, a machine-readable carrier. An embodiment of the present invention may relate to a computer program having a program code for executing one of the methods described herein when the computer program is driven on a computer. The present invention may include a computer, or programmable logic device, for executing one of the aforementioned methods described above. A programmable logic device (e.g., a field programmable gate array or a complementary metal oxide semiconductor-based logic circuit) may be used to perform some or all functions of the aforementioned methods.
Although an embodiment of the present invention has been described thus far, it would be obvious to one of ordinary skill in the art that the present invention is changed and modified in various ways by adding, changing, deleting, or modifying components within the scope of the present invention without departing from the spirit of the present invention described in the claims, and this may also be contained in the scope of the present invention.
INDUSTRIAL APPLICABILITY
According to the present invention, the unmanned aerial vehicle may safely and rapidly land.
In detail, a user may check a landing point in the form of image data, may select a landing place, and may more safely land the unmanned aerial vehicle.
Claims
1. An unmanned aerial vehicle landing system for performing a method comprising:
- receiving pixel coordinates of image data of an image sensor included in an unmanned aerial vehicle from a wireless terminal, by a server;
- transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the server; and
- transmitting the absolute coordinates to the unmanned aerial vehicle, by the server.
2. The unmanned aerial vehicle landing system of claim 1, wherein the method further includes:
- prior to the receiving the pixel coordinates, determining whether an imaging surface of the image data is kept horizontal to a ground based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, by the server; and
- transmitting a control command for keeping the imaging surface of the image data horizontal, by the server.
3. The unmanned aerial vehicle landing system of claim 1, wherein the method further includes:
- upon receiving a request for imaging from the wireless terminal, transmitting a command for capturing the image data by the image sensor included in the unmanned aerial vehicle, by the server; and
- receiving the image data from the unmanned aerial vehicle and transmitting the image data to the wireless terminal, by the server.
4. The unmanned aerial vehicle landing system of claim 1, wherein the method further includes:
- upon receiving a user command from the wireless terminal, transmitting the user command to the unmanned aerial vehicle, by the server.
5. The unmanned aerial vehicle landing system of claim 4, wherein the method further includes:
- when the user command is a landing command, transmitting a command for controlling the unmanned aerial vehicle to land on the absolute coordinates to the unmanned aerial vehicle, by the server.
6. The unmanned aerial vehicle landing system of claim 4, wherein the method further includes:
- when the user command is a moving command, transmitting a command for controlling the unmanned aerial vehicle to move to moving coordinates to the unmanned aerial vehicle, by the server.
7. The unmanned aerial vehicle landing system of claim 1, wherein the server transforms the pixel coordinates into the absolute coordinates using the pixel coordinates, altitude information of the unmanned aerial vehicle, and a viewing angle of the image sensor as parameters in the operation of transforming the pixel coordinates.
8. The unmanned aerial vehicle landing system of claim 1, wherein the server corrects the image data using a radial distortion constant, a tangential distortion constant, a focal distance, and lens-based image coordinates of a lens included in the image sensor.
9. The unmanned aerial vehicle landing system of claim 1, wherein the method further includes:
- receiving the information on the state of the unmanned aerial vehicle, including at least one of altitude information, location information, azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, or image data captured by the image sensor of the unmanned aerial vehicle, from the unmanned aerial vehicle, by the server.
10. The unmanned aerial vehicle landing system of claim 1, wherein the operation of transforming the pixel coordinates includes:
- extracting recognition points from structure data that is captured a plurality of times for respective distances, by the server;
- matching the recognition points in the image data to specify a structure, by the server; and
- transforming the pixel coordinates into the absolute coordinates with reference to information on a size of the specified structure from a structure database, by the server.
11. The unmanned aerial vehicle landing system of claim 3, wherein the method further includes:
- upon receiving a request for imaging from the wireless terminal, requesting user authentication information from the wireless terminal, by the server; and
- upon receiving the user authentication information from the wireless terminal, determining whether the wireless terminal is appropriate based on the user authentication information, by the server.
12. An unmanned aerial vehicle landing system for performing a method comprising:
- receiving pixel coordinates of image data of an image sensor included in an unmanned aerial vehicle from a wireless terminal, by the unmanned aerial vehicle; and
- transforming the pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the unmanned aerial vehicle.
13. The unmanned aerial vehicle landing system of claim 12, wherein the method further includes:
- determining whether an imaging surface of the image data is kept horizontal to a ground based on at least one of azimuth information, acceleration information, or angular velocity information of the unmanned aerial vehicle, by the unmanned aerial vehicle; and
- controlling a posture of the unmanned aerial vehicle to keep the imaging surface of the image data horizontal.
14. The unmanned aerial vehicle landing system of claim 12, wherein the method further includes:
- prior to the receiving the pixel coordinates, upon receiving a request for imaging from the wireless terminal, capturing the image data through the image sensor included in the unmanned aerial vehicle, by the unmanned aerial vehicle; and
- transmitting the image data to the wireless terminal.
15. An unmanned aerial vehicle landing system for performing a method comprising:
- when receiving image data captured by an image sensor included in an unmanned aerial vehicle, displaying the image data on a display included in a user terminal, by the user terminal;
- transforming pixel coordinates into absolute coordinates of the unmanned aerial vehicle based on information on a state of the unmanned aerial vehicle, by the user terminal; and
- transmitting the absolute coordinates to the unmanned aerial vehicle, by the user terminal.
Type: Application
Filed: Jul 18, 2019
Publication Date: Jan 13, 2022
Patent Grant number: 12176929
Applicant: Argosdyne Co. Ltd. (Gyeonggi-do)
Inventors: Seung Ho Jeong (Gyeonggi-do), Seung Hyun Jung (Gyeonggi-do)
Application Number: 17/413,323