METHOD AND SYSTEM FOR SERVICING AN OBJECT
A method for operating a mobile platform includes determining one or more service points based on a model of an object and directing the mobile platform to perform one or more tasks at the determined service points.
This application is a continuation of application Ser. No. 16/008,706, filed on Jun. 14, 2018, which is a continuation of International Application No. PCT/CN2015/097514, filed on Dec. 15, 2015, the entire contents of both of which are incorporated herein by reference.
COPYRIGHT NOTICE

A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD

The disclosed embodiments relate generally to mobile platform technology and more particularly, but not exclusively, to methods and systems for object servicing.
BACKGROUND

Object servicing such as surface inspection is essential for routine maintenance of a wide variety of objects, such as passenger planes, ships, and buildings. For example, the surface paint layer of a passenger plane can be subject to damage due to environmental factors such as salt fog, sand dust, lightning strikes, and bombardment by foreign objects. The damage can result in defects such as chipping or cracking of the surface paint layer. A defect can be located anywhere on the fuselage of the passenger plane.
Conventionally, inspection of the defects relies mainly on close-distance visual inspection by a human with the aid of a lift. The visual inspection often needs to traverse the entire fuselage of the passenger plane and therefore consumes a significant amount of human labor and time.
In view of the foregoing, there is a need for methods and systems for inspecting an object that overcome the disadvantages of currently-available methods and systems.
SUMMARY OF THE DISCLOSURE

The present disclosure relates to a system for servicing an object and methods for making and using same.
In accordance with a first aspect disclosed herein, there is set forth a method for servicing an object via a mobile platform, including:
maintaining a distance between the mobile platform and the object; and
performing a task for the object during the maintaining.
In some embodiments of the disclosed method, the maintaining includes maintaining the distance between the mobile platform and a surface of the object.
In some embodiments of the disclosed method, the maintaining further includes maintaining the distance within a predetermined range of distances.
In some embodiments of the disclosed method, the maintaining further includes maintaining the distance based on operation conditions of the mobile platform.
In some embodiments of the disclosed method, the method further includes monitoring the distance in one or more directions relative to the mobile platform.
In some embodiments of the disclosed method, the method further includes monitoring the distance via at least one of ultrasound sensing and visual sensing.
In some embodiments of the disclosed method, the monitoring the distance includes:
measuring the distance via the ultrasound sensing to obtain a first measurement;
measuring the distance via the visual sensing to obtain a second measurement; and
determining the distance based on at least one of the first measurement and the second measurement.
In some embodiments of the disclosed method, the determining includes determining the distance based on the first measurement when the second measurement detects no texture of interest on the object.
In some embodiments of the disclosed method, the determining includes determining the distance based on the second measurement when the first measurement receives an ultrasound echo having an intensity below a predetermined intensity limit.
In some embodiments of the disclosed method, the method further includes moving toward a selected service point of the object.
In some embodiments of the disclosed method, the method further includes arriving at the selected service point of the object.
In some embodiments of the disclosed method, the method further includes obtaining a position of the selected service point based on:
a spatial relationship between the selected service point and the object; and
a position and an orientation of the object.
In some embodiments of the disclosed method, the obtaining includes obtaining the position of the selected service point based on the position of the object, the position of the object being determined via one or more control points each having a predetermined spatial relationship with the object.
In some embodiments of the disclosed method, the obtaining includes obtaining the position of the selected service point based on the orientation of the object, the orientation of the object being determined via one or more control points each having a predetermined spatial relationship with the object.
In some embodiments of the disclosed method, the method further includes moving along a travel path to move toward the selected service point.
In some embodiments of the disclosed method, the method further includes flying along the travel path.
In some embodiments of the disclosed method, the method further includes determining a position of the mobile platform via a position tracking device onboard the mobile platform.
In some embodiments of the disclosed method, the method further includes determining the position of the mobile platform based on at least one of a Global Positioning System (GPS) signal and a differential Global Positioning System (DGPS) signal.
In some embodiments of the disclosed method, the performing includes performing one or more different tasks.
In some embodiments of the disclosed method, the performing includes performing each of the different tasks using a respective payload on the mobile platform.
In some embodiments of the disclosed method, the performing includes:
capturing an image of the object by using a camera of the mobile platform; and
transmitting the image to a computer associated with the mobile platform for analysis of the object.
In some embodiments of the disclosed method, the method further includes capturing a zoom-in image of a selected feature of the object based on the analysis of the object.
In some embodiments of the disclosed method, the performing includes performing the task in cooperation with one or more other mobile platforms.
In some embodiments of the disclosed method, the performing includes capturing an image of the object that complements one or more images captured by the other mobile platforms.
In some embodiments of the disclosed method, the performing includes performing the task in coordination with the other mobile platforms including an unmanned aerial vehicle (UAV), a ground vehicle, or a combination thereof.
In accordance with another aspect disclosed herein, there is set forth a system for servicing an object, including:
a trip controller for operating onboard a mobile platform and directing the mobile platform to:
maintain a distance between the mobile platform and the object; and
perform a task for the object at the maintained distance.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to maintain the distance between the mobile platform and a surface of the object.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to maintain the distance within a predetermined range of distances.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to maintain the distance based on operation conditions of the mobile platform.
In some embodiments of the disclosed system, the trip controller monitors the distance in one or more directions relative to the mobile platform via one or more sensors onboard the mobile platform.
In some embodiments of the disclosed system, the trip controller monitors the distance via at least one of an ultrasound sensor and a visual sensor onboard the mobile platform.
In some embodiments of the disclosed system, the trip controller operates to:
measure the distance via the ultrasound sensor to obtain a first measurement,
measure the distance via the visual sensor to obtain a second measurement, and
determine the distance based on at least one of the first measurement and the second measurement.
In some embodiments of the disclosed system, the trip controller operates to determine the distance based on the first measurement when the second measurement detects no texture of interest on the object.
In some embodiments of the disclosed system, the trip controller operates to determine the distance based on the second measurement when the first measurement receives an ultrasound echo having an intensity below a predetermined intensity limit.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to move toward a selected service point of the object.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to arrive at the selected service point of the object.
In some embodiments of the disclosed system, a position of the selected service point is based on:
a spatial relationship between the selected service point and the object; and
a position and an orientation of the object.
In some embodiments of the disclosed system, the position of the object is based on one or more control points each having a predetermined spatial relationship with the object.
In some embodiments of the disclosed system, the orientation of the object is based on one or more control points each having a predetermined spatial relationship with the object.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to move along a travel path to approach the selected service point.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to fly along the travel path.
In some embodiments of the disclosed system, the trip controller operates to obtain a position of the mobile platform via a position tracking device onboard the mobile platform.
In some embodiments of the disclosed system, the trip controller operates to obtain the position of the mobile platform based on at least one of a Global Positioning System (GPS) signal and a differential Global Positioning System (DGPS) signal.
In some embodiments of the disclosed system, the trip controller directs the mobile platform to perform one or more different tasks.
In some embodiments of the disclosed system, the mobile platform is adapted to carry a respective payload for performing each of the different tasks.
In some embodiments of the disclosed system, the mobile platform includes a camera for capturing an image of the object, and the mobile platform operates to transmit the captured image to a computer associated with the mobile platform for analysis of the object.
In some embodiments of the disclosed system, the camera operates to capture a zoom-in image of a selected feature of the object based on the analysis of the object.
In some embodiments of the disclosed system, the mobile platform operates to perform the task in cooperation with one or more other mobile platforms.
In some embodiments of the disclosed system, the mobile platform operates to capture an image of the object that complements one or more images captured by the other mobile platforms.
In some embodiments of the disclosed system, the other mobile platforms include an unmanned aerial vehicle (UAV), a ground vehicle, or a combination thereof.
In accordance with another aspect disclosed herein, there is set forth an unmanned aerial vehicle (UAV) for servicing an object, including:
a trip controller that directs the UAV to:
maintain a distance between the UAV and the object; and
perform a task for the object at the maintained distance; and
one or more propellers for moving the UAV based on instructions from the trip controller.
In accordance with another aspect disclosed herein, there is set forth a non-transitory computer-readable storage medium storing one or more programs, the one or more programs including instructions, which when executed by a trip controller, instruct the trip controller to service an object.
In accordance with another aspect disclosed herein, there is set forth a method for operating a mobile platform, including:
determining one or more service points based on a model of an object; and
directing the mobile platform to perform one or more tasks at the determined service points.
In some embodiments of the disclosed method, the method further includes directing the mobile platform to maintain a distance from the object.
In some embodiments of the disclosed method, the determining includes determining a selected service point of the service points in a relative coordinate system based on the model.
In some embodiments of the disclosed method, the directing includes directing the mobile platform to travel based on an absolute coordinate system.
In some embodiments of the disclosed method, the method further includes determining a position of a selected service point of the service points.
In some embodiments of the disclosed method, the method further includes determining coordinates of the selected service point in an absolute coordinate system, wherein the coordinates in the absolute coordinate system are determined based on:
coordinates of the selected service point in a relative coordinate system according to the model; and
a transformation relationship between the absolute coordinate system and the relative coordinate system.
In some embodiments of the disclosed method, the method further includes determining the transformation relationship according to a position and an orientation of the object.
In some embodiments of the disclosed method, the method further includes obtaining the model of the object.
In some embodiments of the disclosed method, the obtaining includes obtaining a three-dimensional model of the object.
In some embodiments of the disclosed method, the method further includes selecting a region of interest of the object based on the model.
In some embodiments of the disclosed method, the method further includes determining a distribution of the service points over the region of interest.
In some embodiments of the disclosed method, the method further includes determining the position of the object based on one or more control points each having predetermined relative coordinates.
In some embodiments of the disclosed method, the method further includes determining the orientation of the object based on one or more control points each having predetermined relative coordinates.
In some embodiments of the disclosed method, the method further includes:
determining the position of the object based on a first control point of the control points; and
determining the orientation of the object based on the first control point and a second control point of the control points.
In some embodiments of the disclosed method, the method further includes selecting the control points having a spacing therebetween that is greater than a predetermined spacing limit.
In some embodiments of the disclosed method, the method further includes obtaining coordinates of a selected control point of the control points in an absolute coordinate system by using a position tracking device onboard the mobile platform, wherein the mobile platform is located at the selected control point.
In some embodiments of the disclosed method, the method further includes directing the mobile platform to traverse the service points along a travel path.
In some embodiments of the disclosed method, the method further includes determining a position of the mobile platform via a position tracking device onboard the mobile platform.
In some embodiments of the disclosed method, the method further includes determining a position of the mobile platform based on at least one of a Global Positioning System (GPS) signal and a differential Global Positioning System (DGPS) signal.
In some embodiments of the disclosed method, the method further includes directing the mobile platform to capture an image of the object at a selected service point of the service points.
In some embodiments of the disclosed method, the method further includes directing the mobile platform to capture images of the object in one or more perspectives.
In some embodiments of the disclosed method, the method further includes presenting the captured image.
In some embodiments of the disclosed method, the method further includes selecting a feature of the object via feature recognition based on the captured image.
In some embodiments of the disclosed method, the method further includes directing the mobile platform to capture a zoom-in image for the selected feature.
In some embodiments of the disclosed method, the method further includes transmitting the captured image to a computer system of a third party for analysis.
In accordance with another aspect disclosed herein, there is set forth a system for operating a mobile platform, including:
a task manager that operates to:
determine one or more service points based on a model of an object; and
direct the mobile platform to perform one or more tasks at the determined service points.
In some embodiments of the disclosed system, the task manager directs the mobile platform to maintain a distance between the mobile platform and the object.
In some embodiments of the disclosed system, the task manager determines a selected service point of the service points in a relative coordinate system based on the model.
In some embodiments of the disclosed system, the task manager directs the mobile platform to travel based on an absolute coordinate system.
In some embodiments of the disclosed system, the task manager operates to determine a position of a selected service point of the service points.
In some embodiments of the disclosed system, the task manager operates to determine coordinates of the selected service point in an absolute coordinate system, wherein the coordinates in the absolute coordinate system are determined based on:
coordinates of the selected service point in a relative coordinate system according to the model; and
a transformation relationship between the absolute coordinate system and the relative coordinate system.
In some embodiments of the disclosed system, the task manager determines the transformation relationship according to a position and an orientation of the object.
In some embodiments of the disclosed system, the task manager operates to obtain the model of the object.
In some embodiments of the disclosed system, the task manager operates to obtain a three-dimensional model of the object.
In some embodiments of the disclosed system, the task manager operates to select a region of interest of the object based on the model.
In some embodiments of the disclosed system, the task manager operates to determine a distribution of the service points over the region of interest.
In some embodiments of the disclosed system, the task manager operates to determine the position of the object based on one or more control points each having predetermined relative coordinates.
In some embodiments of the disclosed system, the task manager operates to determine the orientation of the object based on one or more control points each having predetermined relative coordinates.
In some embodiments of the disclosed system, the task manager operates to:
determine the position of the object based on a first control point of the control points; and
determine the orientation of the object based on the first control point and a second control point of the control points.
In some embodiments of the disclosed system, the task manager operates to select the control points having a spacing therebetween that is greater than a predetermined spacing limit.
In some embodiments of the disclosed system, the task manager operates to obtain coordinates of a selected control point of the control points in an absolute coordinate system by using a position tracking device onboard the mobile platform, wherein the mobile platform is located at the selected control point.
In some embodiments of the disclosed system, the task manager operates to direct the mobile platform to traverse the service points along a travel path.
In some embodiments of the disclosed system, the task manager operates to determine a position of the mobile platform via a position tracking device onboard the mobile platform.
In some embodiments of the disclosed system, the task manager operates to determine a position of the mobile platform based on at least one of a Global Positioning System (GPS) signal and a differential Global Positioning System (DGPS) signal.
In some embodiments of the disclosed system, the task manager operates to direct the mobile platform to capture an image of the object at a selected service point of the service points.
In some embodiments of the disclosed system, the task manager operates to direct the mobile platform to capture images of the object in one or more perspectives.
In some embodiments of the disclosed system, the task manager operates to present the captured image.
In some embodiments of the disclosed system, the task manager operates to select a feature of the object via feature recognition based on the captured image.
In some embodiments of the disclosed system, the task manager operates to direct the mobile platform to capture a zoom-in image for the selected feature.
In some embodiments of the disclosed system, the task manager operates to transmit the captured image to a computer system of a third party for analysis.
In accordance with another aspect disclosed herein, there is set forth a non-transitory computer-readable storage medium storing one or more programs, the one or more programs including instructions, which when executed by a task manager, instruct the task manager to operate a mobile platform.
In accordance with another aspect disclosed herein, there is set forth a system for servicing an object, including:
a mobile platform; and
a task manager for:
determining one or more service points based on a model of an object; and
directing the mobile platform to perform one or more tasks at the determined service points.
It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are generally represented by like reference numerals for illustrative purposes throughout the figures. It also should be noted that the figures are only intended to facilitate the description of the embodiments. The figures do not illustrate every aspect of the described embodiments and do not limit the scope of the present disclosure.
DETAILED DESCRIPTION OF THE EMBODIMENTS

Since currently-available methods and systems are incapable of efficiently and automatically inspecting an object, a method and system that improves efficiency in inspecting the object and automates the inspection process can prove desirable and provide a basis for a wide range of applications. The applications can include inspection of planes, ships, and buildings. Not necessarily limited to inspection, the method and system can be useful for any industrial work that involves implementing a function at a selected location of an object. This result can be achieved, according to embodiments disclosed herein, by a system 100 as described below.
The system 100 can be used for servicing an object 300. The object 300 can include any structure with a shape. Exemplary structures can include planes, ships, space shuttles, buildings, bridges, mountains, caves, tunnels, transportation pipelines, water body, and/or the like. The shape can be fixed and/or can change over time. The object 300 can change shape on its own and/or can change shape due to external forces, including with and/or without human involvement. The object 300 can be stationary and/or moving. For example, the object 300 can make movements that are rotational and/or translational.
The system 100 can service the object 300, for example, by performing a task (and/or taking an action) associated with (and/or involving) the object 300, including any task for benefiting the object 300. Exemplary actions can include inspecting, repairing, supplying maintenance for, and/or obtaining data of the object 300.
The system 100 can include a mobile platform 200 for servicing the object 300. The mobile platform 200 can be equipped with one or more sensors 230.
The sensors 230 can interface with one or more processors 240. The processors 240 can function as a trip controller for directing some or all operations of the mobile platform 200. When the mobile platform 200 includes a UAV, the processors 240 can function as a flight controller. Without limitation, each processor 240 can include one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits, application-specific instruction-set processors, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like. The processors 240 can be configured to perform any of the methods described herein, including but not limited to, a variety of operations relating to object distance detection and collision avoidance. In some embodiments, the processors 240 can include specialized hardware for processing specific operations relating to object distance detection and collision avoidance, for example, processing ultrasound data, processing visual sensor data, and determining the distance D.
In some embodiments, the processor 240 can be located in physical proximity to the sensors 230. In such cases, the processor 240 and the sensors 230 can be configured to communicate locally, for example, using hardware connectors and buses. An advantage of local communication is that transmission delay can be reduced to facilitate real-time object distance detection and collision avoidance.
In other embodiments, the processors 240 can be at least partially located remotely from the sensors 230. Measurement data obtained by the sensors 230 can be transmitted to the remote processors 240 for processing. The remote processors 240 can generate a control signal based on the measurement data and send the control signal back to the mobile platform 200. Various wireless communication methods can be used for remote communication between the sensors 230 and the processors 240. Suitable communication methods include, for example, radio, Wireless Fidelity (Wi-Fi), cellular, satellite, and broadcasting.
The mobile platform 200 can include a movement mechanism 220 controlled by a mobile platform control system 210.
In some embodiments, when being controlled by the mobile platform control system 210 and/or the processors 240, the movement mechanism 220 can permit the mobile platform 200 to make selected movements. Exemplary movements can include taking off from a surface or the ground, landing on the ground, travelling at a selected velocity, travelling to a selected destination, hovering in air at a selected position and/or orientation, or a combination thereof.
For example, the movement mechanism 220 can include one or more motors each driving one or more propellers. The propellers can thus provide lift and/or thrust to the mobile platform 200. Rotation rate of each of the propellers can be varied independently and/or collectively in order to control the lift and/or thrust produced by each propeller, and thereby adjust spatial disposition, velocity, and/or acceleration of the mobile platform 200. For example, spatial disposition, velocity, and/or acceleration of the mobile platform 200 can be adjusted with respect to up to three degrees of translation and up to three degrees of rotation.
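For purposes of illustration, the following sketch shows one conventional way that independently varied propeller thrusts can map to collective thrust and body torques; the X-configuration mixing matrix, arm length, and drag-to-thrust ratio are assumptions for this example, not details from the disclosure.

```python
import numpy as np

# Hypothetical quadrotor "mixer": relates four per-rotor thrusts to the
# collective thrust and the roll/pitch/yaw torques. Values are illustrative.
ARM = 0.18   # arm length in meters (assumed)
K = 0.016    # rotor drag-to-thrust ratio (assumed)

MIX = np.array([
    [ 1.0,   1.0,   1.0,   1.0],   # collective thrust
    [-ARM,   ARM,   ARM,  -ARM],   # roll torque
    [ ARM,   ARM,  -ARM,  -ARM],   # pitch torque
    [-K,     K,    -K,     K  ],   # yaw torque (alternating spin directions)
])

def rotor_thrusts(thrust, roll, pitch, yaw):
    """Per-rotor thrusts realizing a commanded thrust and torque triple."""
    return np.linalg.solve(MIX, np.array([thrust, roll, pitch, yaw]))

# Example: a pure climb command distributes thrust equally across rotors.
print(rotor_thrusts(20.0, 0.0, 0.0, 0.0))  # -> [5. 5. 5. 5.]
```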
To maintain the distance D, the mobile platform control system 210 can monitor the distance D via one or more of the sensors 230.
In order to maintain the distance D, at 1100, the mobile platform control system 210 can determine the distance D based on measurements by the sensors 230, and compare the distance D with the predetermined distance. The predetermined distance can be determined based on one or more factors, selected from a wide variety of factors. For example, the predetermined distance can be determined and/or dynamically adjusted based on the task that the mobile platform 200 performs, at 1200, and/or based on one or more operation conditions of the mobile platform 200.
Exemplary operating conditions can include environmental conditions surrounding the mobile platform 200. An exemplary environmental condition can include a wind condition. Wind surrounding the mobile platform 200 can make the mobile platform 200 shift position. When the distance D is too short, the mobile platform 200 can easily collide with the object 300 under force of the wind because the mobile platform 200 may not have sufficient time and/or distance to adjust movement to compensate for unintended position shifting.
Additionally and/or alternatively, exemplary operating conditions can include accuracy of the control system and/or movement of the mobile platform 200. For example, the sensors 230 can have certain measurement errors and/or tolerances. The mobile platform control system 210 and/or the movement mechanism 220 can also have certain errors. Additionally and/or alternatively, any position tracking device for measuring position of the mobile platform 200 can have certain measurement errors. Thus, the actual value of the distance D can deviate from the predetermined value at which the distance D needs to be maintained. When the distance D is too short, the mobile platform 200 may collide with the object 300. Thus, for safe operation, the mobile platform 200 can maintain the distance D to be greater than a predetermined lower distance limit. For example, when the mobile platform 200 is a UAV, an exemplary lower distance limit can be at least 60 cm.
Additionally and/or alternatively, the mobile platform 200 can maintain the distance D in order to perform a certain task optimally. For example, the mobile platform 200 can maintain the distance D to be greater than a lower limit and/or less than an upper limit in order to capture an image of a selected region of the object at a selected size of field of view and/or resolution.
Thus, to avoid collision and/or to perform the task with success, the mobile platform 200 can maintain the distance D to be within a predetermined range of distances. In certain embodiments, the predetermined range of distances can be based on factors such as the operation conditions of the mobile platform 200. Stated somewhat differently, the distance D being maintained between the mobile platform 200 and the object 300 can be based on the operation conditions of the mobile platform 200.
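As an illustration of the range-keeping behavior described above, the following minimal sketch maps the monitored distance D to an approach or retreat speed; the bounds and gain are assumed placeholders (the 0.6 m default echoes the exemplary 60 cm lower distance limit), not values prescribed by the disclosure.

```python
def distance_keeping_speed(measured_d, d_min=0.6, d_max=3.0, gain=0.8):
    """Speed command along the direction toward the object.

    Negative values back the mobile platform away; positive values close in;
    zero holds position when D is inside the predetermined range."""
    if measured_d < d_min:
        return -gain * (d_min - measured_d)  # too close: retreat
    if measured_d > d_max:
        return gain * (measured_d - d_max)   # too far: approach
    return 0.0                               # within range: hold position
```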
The mobile platform 200 can monitor the distance D, in order to adjust movements for maintaining the distance D. The distance D can include one or more distances D1, D2, . . . , Dn. The distances can be measured in uniform and/or different directions relative to the mobile platform 200. The distances can be measured between the mobile platform 200 and uniform and/or different locations on the object 300. The distances that the mobile platform 200 maintains at different locations on the object 300 can be uniform and/or different. In one example, the mobile platform 200 can be at least partially surrounded by the object 300. Thus, the mobile platform 200 can monitor the distance D in one or more directions relative to the mobile platform 200. Advantageously, the mobile platform 200 can avoid collision in one or more directions.
Although two directions are shown for illustrative purposes only, the mobile platform 200 can monitor the distance D in any number of uniform and/or different directions.
The mobile platform 200 can monitor the distance D in any manner.
Although shown as distinct devices for illustrative purposes only, two or more of the sensors 230 of the mobile platform control system 210 can be integrated partially or fully into a single device, and/or share one or more overlapping physical components such as a housing, microchips, photosensors, detectors, communications ports, and the like. For example, in some embodiments, the ultrasound sensor 231 can share the processor 240 with the visual sensor 232. In other embodiments, the sensors 230 can be physically discrete devices for ease of replacement and modularity.
The visual sensor 232 can ascertain the distance D via visual sensing. Visual sensing can refer to sensing by analyzing one or more images of an object of interest. An exemplary visual sensing method can include stereopsis (and/or stereoscopy). The visual sensor 232 can include two or more imaging devices, each capturing an image of a point of interest on the object 300. A difference between locations of the point of interest in the images can be used for providing the distance D.
For example, the visual sensor 232 can include a first imaging device 232a and a second imaging device 232b that capture respective images 2321a and 2321b of the object 300.
xi = (xia − cx)·b/di
yi = (yia − cy)·b/di
zi = f·b/di    Equation (3)

where cx and cy represent respective center coordinates of the imaging devices 232a and 232b, xi and yi represent the coordinates of the point of interest in one or both of the images 2321a and 2321b, b is the baseline (in other words, the distance between the center coordinates of the imaging devices 232a and 232b), f is the focal length of each of the imaging devices 232a and 232b (assuming here that the imaging devices have the same focal length), i is an index over multiple feature points 301 of interest of the object 300, and di is the binocular disparity between the images 2321a and 2321b, represented here as:

di = xia − xib    Equation (4)

Based on the images 2321a and 2321b and using Equations (3) and (4), the visual sensor 232 can determine the coordinate zi. The distance D can be equal to, and/or based on, the magnitude of the coordinate zi.
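A short sketch of Equations (3) and (4) follows; the equations above are reconstructed from the standard stereopsis relations, and the helper below simply evaluates them for one feature point (all names are illustrative).

```python
def stereo_point(x_a, y_a, x_b, cx, cy, b, f):
    """Coordinates (xi, yi, zi) of a feature point from binocular disparity.

    x_a, y_a: image coordinates of the point in image 2321a;
    x_b: horizontal image coordinate of the same point in image 2321b;
    cx, cy: center coordinates; b: baseline; f: focal length."""
    d = x_a - x_b                     # Equation (4): binocular disparity
    if d == 0:
        raise ValueError("zero disparity: point too distant or mismatched")
    return ((x_a - cx) * b / d,       # xi
            (y_a - cy) * b / d,       # yi
            f * b / d)                # zi, whose magnitude yields distance D
```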
Measurement by the visual sensor 232 can include identifying and/or selecting feature points 301 of the object 300 in the images 2321a and 2321b. The feature points 301 can be identified based on machine vision and/or artificial intelligence methods, and the like. Suitable methods include feature detection, extraction and/or matching techniques such as RANSAC (RANdom SAmple Consensus), Shi & Tomasi corner detection, SURF blob (Speeded Up Robust Features) detection, MSER blob (Maximally Stable Extremal Regions) detection, SURF (Speeded Up Robust Features) descriptors, SIFT (Scale-Invariant Feature Transform) descriptors, FREAK (Fast REtinA Keypoint) descriptors, BRISK (Binary Robust Invariant Scalable Keypoints) descriptors, HOG (Histogram of Oriented Gradients) descriptors, and the like.
When the distance D is less than and/or equal to 5 meters, the error of distance measurement by an exemplary visual sensor 232 can range from 1 cm to 50 cm. The visual sensor 232 can identify the feature points 301 when the object 300 has visible texture. If the object 300 does not have sufficient visible texture, the visual sensor 232 cannot easily identify the feature points 301 and/or cannot identify a large number of feature points 301, so the error of measuring the distance D can increase. Exemplary objects 300 that do not have sufficient visible texture can include a single-colored wall, smooth glass, and/or the like.
The ultrasound sensor 231 can ascertain the distance D via ultrasound sensing. For example, the ultrasound sensor 231 can include an ultrasound transmitter 231a for emitting an ultrasound wave toward the object 300 and an ultrasound receiver 231b for receiving an ultrasound echo reflected by the object 300. The distance D can be determined based on a travel time of the ultrasound wave.
In certain embodiments, when the distance D is less than and/or equal to 5 meters, error of distance measurement by an exemplary ultrasound sensor 231 can range from 20 cm to 50 cm. For the object 300 that does not have sufficient visible texture, such as a single-colored wall and/or smooth glass, the ultrasound sensor 231 can measure the distance D with high accuracy. However, for objects 300 that absorb and/or mitigate ultrasound waves, the ultrasound sensor 231 can measure the distance D with low accuracy. Exemplary objects 300 that absorb and/or mitigate ultrasound waves include certain type(s) of carpets.
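Although the disclosure does not spell out the echo timing arithmetic, ultrasound ranging conventionally uses the time of flight of the echo; the sketch below illustrates that relation together with the intensity check discussed later, with the speed of sound and the limits assumed for the example.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def ultrasound_distance(echo_delay_s, echo_intensity, min_intensity=0.2):
    """Time-of-flight distance, or None when the echo is too weak.

    A weak echo (for example, from an ultrasound-absorbing carpet) falls
    below the predetermined intensity limit, so no distance is reported."""
    if echo_intensity < min_intensity:
        return None                               # defer to visual sensing
    return SPEED_OF_SOUND * echo_delay_s / 2.0    # halve the round trip
```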
In some embodiments, the mobile platform 200 can be provided with one or more sensor sets 230A.
When installed on the mobile platform 200, the sensor sets 230A can include the sensors 230.
The distance D can be measured via the ultrasound sensor 231 to obtain a first measurement, and measured via the visual sensor 232 to obtain a second measurement.
The distance D can be determined by combining results of the first measurement and the second measurement in any predetermined manner. For example, the distance D can be determined to be a simple average and/or a weighted average of the first measurement and the second measurement.
For certain objects, the first measurement or the second measurement may have particularly limited accuracy. Thus, in some examples, based on characteristics of the object 300 and/or the measurement process, the distance D can be based on one of the first measurement and the second measurement.
For example, during the second measurement by the visual sensor 232, the visual sensor 232 can detect no texture of interest on the object 300. When the object 300 has the texture of interest, the visual sensor 232 can identify one or more feature points 301 in a captured image of the object 300. The number of the feature points 301 can be equal to and/or greater than a predetermined number limit. Reaching the predetermined number limit can help ensure that the second measurement has a desired accuracy. Therefore, when the second measurement by the visual sensor 232 detects no texture of interest on the object 300, the distance D can be determined based on the first measurement by the ultrasound sensor 231.
In another example, during the first measurement by the ultrasound sensor 231, the ultrasound receiver 231b can receive an ultrasound echo having an intensity below a predetermined intensity limit, for example, when the object 300 absorbs and/or mitigates ultrasound waves. In that case, the distance D can be determined based on the second measurement by the visual sensor 232.
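The selection logic described in the preceding examples can be summarized in a small sketch; the number limit, intensity limit, and weighting below are assumed placeholders rather than values from the disclosure.

```python
def fuse_distance(first_d, second_d, num_feature_points, echo_intensity,
                  min_points=10, min_intensity=0.2, w_first=0.5):
    """Combine the first (ultrasound) and second (visual) measurements.

    first_d / second_d may be None when the respective sensor has no reading."""
    texture_ok = second_d is not None and num_feature_points >= min_points
    echo_ok = first_d is not None and echo_intensity >= min_intensity
    if texture_ok and echo_ok:
        # Both measurements valid: weighted average (simple average at 0.5).
        return w_first * first_d + (1.0 - w_first) * second_d
    if echo_ok:
        return first_d    # no texture of interest: use the ultrasound result
    if texture_ok:
        return second_d   # echo below the intensity limit: use vision
    return None           # neither measurement is usable
```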
In some examples, performing a task for a selected position on the object 300 can be desirable. The selected position can include a service point 310. The mobile platform 200 can arrive at the service point 310 for performing the task.
Additionally and/or alternatively, the mobile platform 200 can move toward the service point 310 for performing the task. However, the mobile platform 200 need not necessarily arrive at the service point 310. For example, when the service point 310 is on the surface of the object 300, the mobile platform 200 can approach (or move to be near) the service point 310 while maintaining the distance D.
The mobile platform 200 can locate the service point 310 according to a position of the service point 310. The position of the service point 310 can include coordinates of the service point 310 in an absolute coordinate system (or a global coordinate system). The absolute coordinate system can be a system based on which travel of the mobile platform 200 can be directed. Stated somewhat differently, the mobile platform control system 210 can direct the mobile platform 200 to travel based on coordinates in the absolute coordinate system.
The object 300 can define a relative coordinate system (or a local coordinate system). The relative coordinate system can be used for defining a spatial relationship between the selected service point 310 and the object 300.
A given service point 310 can be defined and/or selected based on coordinates of the service point 310 in the relative coordinate system, regardless of what absolute coordinate system is used. For example, a selected service point 310 can be a tip of a right wing of the object 300 when the object 300 includes an aircraft. Such a selection can be made in the relative coordinate system based on the object 300.
For a given service point 310, the coordinates (x2, y2, z2) in the relative coordinate system can be fixed regardless of how the object 300 is placed in the absolute coordinate system. The coordinates (x1, y1, z1) in the absolute coordinate system can vary depending on how the object 300 is placed in the absolute coordinate system. Coordinates in the relative coordinate system and the absolute coordinate system can be referred to as relative coordinates and absolute coordinates, respectively.
For example, the relative coordinate system can include three coordinate axes X2, Y2, Z2 defined with respect to the object 300. When the object 300 is moved and/or rotated in the absolute coordinate system, the coordinate axes X2, Y2, Z2 move and/or rotate together with the object 300, and the absolute coordinates (x1, y1, z1) of the selected service point 310 change correspondingly.
For the selected service point 310 having coordinates (x2, y2, z2) in the relative coordinate system, the coordinates (x1, y1, z1) in the absolute coordinate system can be obtained based on a transformation relationship between the absolute and the relative coordinate systems. Stated somewhat differently, for the given service point 310, the transformation relationship is a relationship between the absolute coordinates and the relative coordinates. The absolute coordinates (x1, y1, z1) can be obtained based on the relative coordinates (x2, y2, z2) when the transformation relationship is known.
An exemplary transformation relationship can include the following expressions:

x1 = a1·x2 + b1·y2 + c1·z2
y1 = a2·x2 + b2·y2 + c2·z2
z1 = a3·x2 + b3·y2 + c3·z2    Equation (5)

where a1, b1, c1, a2, b2, c2, a3, b3, and c3 are transformation coefficients for the transformation relationship between the absolute and the relative coordinate systems. The transformation relationship can be fixed when the object 300 is stationary, and/or can change when the object 300 moves.
In some embodiments, when the position and orientation of the object 300 in the absolute coordinate system are determined, the transformation relationship can be determined. Therefore, the coordinates (x1, y1, z1) can be based on the position and orientation of the object 300 in the absolute coordinate system.
The transformation relationship can be determined in any manner. For example, the transformation relationship can be determined using a method that can provide values of the transformation coefficients. In certain examples, the transformation relationship can be determined based on one or more control points. Stated somewhat differently, the position and orientation of the object 300 can be based on one or more control points. A control point can refer to a point having one or more coordinates that are at least partially known so as to allow calculation of the transformation relationship.
For example, a control point 330a can be selected on the object 300, the control point 330a having relative coordinates (x2a, y2a, z2a) and absolute coordinates (x1a, y1a, z1a).
The position of the object 300 can be expressed via the absolute coordinates (x1a, y1a, z1a). When the position of the object 300 is determined, the mobile platform 200 can locate the object 300. However, for the mobile platform 200 to locate the selected service point 310, the orientation of the object 300 may need to be determined.
An orientation of the object 300 can be determined via one or more control points. For example, additional control points 330b and 330c can be selected on the object 300, the control point 330b having relative coordinates (x2b, y2b, z2b) and absolute coordinates (x1b, y1b, z1b), and the control point 330c having relative coordinates (x2c, y2c, z2c) and absolute coordinates (x1c, y1c, z1c).
An orientation 302 of the object 300 can include orientations 302B, 302C. The orientations 302B, 302C can be vectors. The orientation 302B can be expressed by relative coordinates (x2b−x2a, y2b−y2a, z2b−z2a) and/or absolute coordinates (x1b−x1a, y1b−y1a, z1b−z1a). The orientation 302C can be expressed by relative coordinates (x2c−x2a, y2c−y2a, z2c−z2a) and/or absolute coordinates (x1c−x1a, y1c−y1a, z1c−z1a). Therefore, via the control points 330a-330c, how the object 300 is oriented in the absolute coordinate system can be determined.
Based on the position and/or the orientation of the object in the absolute coordinate system, the mobile platform 200 can locate the selected service point 310.
Stated somewhat differently, the transformation relationship between the absolute and relative coordinate systems can be obtained. For example, via Equation (5), each of the control points 330a at (x1a, y1a, z1a) and (x2a, y2a, z2a), 330b at (x1b, y1b, z1b) and (x2b, y2b, z2b), and 330c at (x1c, y1c, z1c) and (x2c, y2c, z2c) can provide three equations. The nine equations can be used for solving the transformation coefficients a1, b1, c1, a2, b2, c2, a3, b3, and c3. The transformation relationship can thus be obtained.
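For illustration, the nine-coefficient system can be solved numerically as sketched below, assuming Equation (5) as reconstructed above (a pure linear mapping with no translation term, consistent with nine coefficients being determined by three control points); names are illustrative.

```python
import numpy as np

def solve_transformation(rel_pts, abs_pts):
    """Transformation matrix M with abs = M @ rel, per Equation (5).

    rel_pts, abs_pts: 3x3 arrays whose rows are the relative coordinates
    (x2, y2, z2) and absolute coordinates (x1, y1, z1) of control points
    330a-330c. Each control point contributes three linear equations."""
    P2 = np.asarray(rel_pts, dtype=float)
    P1 = np.asarray(abs_pts, dtype=float)
    return np.linalg.solve(P2, P1).T    # solves P2 @ M.T = P1 for M

def to_absolute(M, service_point_rel):
    """Absolute coordinates of a service point from its relative coordinates."""
    return M @ np.asarray(service_point_rel, dtype=float)
```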
In one example, the object 300 can include an aircraft 300A, and the control points 330a, 330b can be selected on the aircraft 300A. When the position and the orientation of the aircraft 300A can be described in two dimensions, two control points can be sufficient for determining the transformation relationship.
For example, the transformation relationship can have the following expressions:

x1 = a1·x2 + b1·y2
y1 = a2·x2 + b2·y2    Equation (6)

where a1, b1, a2, and b2 are transformation coefficients. Therefore, via the control points 330a, 330b, the position and orientation of the aircraft 300A in the absolute coordinate system can be determined.
Based on the position and/or the orientation of the object in the absolute coordinate system, the mobile platform 200 can locate the selected service point 310.
Stated somewhat differently, the transformation relationship between the absolute and the relative coordinate systems can be obtained. For example, via Equation (6), the control points 330a at (x1a, y1a) and (x2a, y2a), and 330b at (x1b, y1b) and (x2b, y2b) can be used for solving the transformation coefficients a1, b1, a2, and b2. The transformation relationship can thus be obtained.
An exemplary process for determining the position of the selected service point 310 is described below. The service point 310 can be selected based on a model of the object 300.
The model can refer to a replica of the object 300 that partially and/or completely duplicates the shape, size and/or dimension of the object 300. An exemplary model can include a computerized two-dimensional and/or three-dimensional model.
The position and the orientation of the object 300 can be obtained, at 1122. For example, the position can be obtained via the control point 330a.
The control points 330a-330c can have respective predetermined spatial relationships with the object 300. The predetermined spatial relationships can be expressed using relative coordinates, which can be obtained based on the model of the object 300.
Additionally and/or alternatively, the control points 330a-330c can have respective absolute coordinates, which can be obtained via any navigation methods. An exemplary navigation method can include the Global Positioning System (GPS). For example, a position tracking device (such as a GPS device) can be positioned at a selected control point of the control points 330a-330c and measure the absolute coordinates. The position tracking device can be placed at the selected control point by any methods and/or tools, such as a human operator, a mechanical arm, and/or a lift.
To improve accuracy of measuring the absolute coordinates with the position tracking device, spacing between control points and/or service points can be greater than a predetermined spacing limit. For example, the spacings L1 and/or L2 between selected control points can be greater than the predetermined spacing limit.
The position of the selected service point 310 can be determined, at 1123, based on the spatial relationship between the selected service point 310 and the object 300, and/or the transformation relationship between the relative and absolute coordinate systems.
Although a Cartesian coordinate system is used for illustrating the absolute and relative coordinate systems, the absolute and/or relative coordinate systems can include any uniform and/or different types of coordinate systems. Exemplary coordinate systems can include a cylindrical coordinate system, a spherical coordinate system, and/or a geographic coordinate system.
The mobile platform 200 can include a position tracking device 280 for determining the position of the mobile platform 200. An exemplary position tracking device 280 can include a GPS module and/or a differential GPS (DGPS) module.
When the position tracking device 280 includes the DGPS module, the position tracking device 280 can operate to communicate with a base station 600. The base station 600 can include a GPS receiver set up on a precisely known geographic location. For example, in the U.S. and Canada, the base station 600 can include a DGPS base station system operated by the United States Coast Guard (USCG) and Canadian Coast Guard (CCG). The DGPS base station system can operate to communicate with the position tracking device 280 on longwave radio frequencies between 285 kHz and 325 kHz near major waterways and harbors.
The base station 600 can calculate its own position based on satellite signals and compare the calculated location to the known location. A difference between the calculated location and the known location can provide a differential correction. The base station 600 can thus broadcast the differential correction to the position tracking device 280 automatically and/or in response to the position tracking device 280.
The base station 600 can communicate with the position tracking device 280 via any wired and/or wireless communication methods. An exemplary method can include radio frequency (RF) communication.
When the position tracking device 280 measures the position of the position tracking device 280 and/or the mobile platform 200, the position tracking device 280 can apply the differential correction to correct GPS data recorded by the position tracking device 280. The GPS data after correction are thus DGPS signals. After being corrected, the GPS data of the position of the mobile platform 200 can have high accuracy. For example, without correction, a GPS can measure the position of the mobile platform 200 with an error of about 2 meters. With the correction, the position tracking device 280 can measure the position of the mobile platform 200 with an error as small as 10 cm. Therefore, the mobile platform 200 can move to selected positions with high accuracy.
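The differential correction arithmetic described above reduces to subtracting the base station's calculated fix from its known location and adding that offset to the mobile platform's raw fix, as in this illustrative sketch (all coordinates are hypothetical):

```python
def differential_correction(base_known, base_calculated):
    """Correction broadcast by the base station 600."""
    return tuple(k - c for k, c in zip(base_known, base_calculated))

def dgps_fix(raw_fix, correction):
    """Corrected (DGPS) position of the position tracking device 280."""
    return tuple(r + c for r, c in zip(raw_fix, correction))

# Illustrative numbers only:
corr = differential_correction((100.0, 200.0, 15.0), (101.8, 198.9, 15.4))
print(dgps_fix((340.2, 512.7, 30.1), corr))   # corrected platform position
```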
The mobile platform 200 can carry a payload 290 for performing one or more tasks for the object 300.
Optionally, the payload 290 can be coupled with the mobile platform 200 via a carrier 292.
Optionally, the carrier 292 can stabilize the payload 290 against motion of the mobile platform 200. For example, the carrier 292 can provide a damping function to reduce vibration of the payload 290 due to motion of the mobile platform 200. Additionally and/or alternatively, the carrier 292 can include one or more sensors for measuring unwanted movement of the payload 290 due to position drifting of the mobile platform 200. The carrier 292 can thus compensate for the movement by adjusting orientation and/or position of the payload 290. Thus, even with vibration, shaking and/or drifting of the mobile platform 200, the payload 290 can remain stable and perform the task for the selected service point 310 with higher positioning accuracy.
Although one payload 290 is described for illustrative purposes only, the mobile platform 200 can include any suitable number of uniform and/or different payloads 290.
The mobile platform 200 can perform one or more uniform and/or different tasks by using the payloads 290. In certain embodiments, the mobile platform 200 can perform each of the uniform and/or different tasks by using a respective payload 290. Additionally and/or alternatively, a payload 290 can perform one or more uniform and/or different tasks. Additionally and/or alternatively, multiple payloads 290 can collaboratively perform a task.
For example, the payload 290 can include a camera 290A for capturing images of the object 300.
The carrier 292 can include a gimbal 292A. In certain embodiments, the gimbal 292A can include a three-axis gimbal. The mobile platform 200 can control the gimbal 292A to adjust orientation of the camera 290A. Thus, when the mobile platform 200 suspends in air with orientation fixed, the camera 290A can be oriented in any of the plurality of directions to capture images of the object 300 in a plurality of perspectives.
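As one hypothetical way to orient the camera 290A toward a point of interest, the gimbal could be commanded with yaw and pitch angles computed from the relative position of the target, as sketched below; the frame convention (x east, y north, z up, angles in radians) is an assumption for the example.

```python
import math

def gimbal_angles(platform_pos, target_pos):
    """Yaw and pitch pointing the camera from the platform toward a target,
    for example a selected service point 310 on the object 300."""
    dx = target_pos[0] - platform_pos[0]
    dy = target_pos[1] - platform_pos[1]
    dz = target_pos[2] - platform_pos[2]
    yaw = math.atan2(dy, dx)                    # heading toward the target
    pitch = math.atan2(dz, math.hypot(dx, dy))  # elevation toward the target
    return yaw, pitch
```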
The mobile platform 200 can operate in coordination with a task manager 400 that manages operation of the mobile platform 200.
When performing the task using the payload 290, the mobile platform 200 can obtain data 204. For example, when the payload 290 includes the camera 290A, the data 204 can include one or more images captured by the camera 290A.
Additionally and/or alternatively, when the payload 290 includes an instrument for measuring a property of the object 300, the data 204 can thus include the measurement result obtained by the instrument. Exemplary measurement can include directing light to the surface of the object 300 to measure a property, such as optical reflectance, of the surface.
The mobile platform 200 can provide the data 204 to the task manager 400. The task manager 400 can process and/or analyze the data 204. Based on the processing and/or analysis, the task manager 400 can determine a subsequent task for the mobile platform 200 to perform, and generate a command 206 accordingly. An exemplary command 206 can instruct the mobile platform 200 to update the task and perform the updated task. Additionally and/or alternatively, the exemplary command 206 can instruct the mobile platform 200 to adjust orientation of the payload 290.
For example, the mobile platform 200 can capture an image of the object 300 at a selected service point 310 by using the camera 290A.
The image can be transmitted, at 1220, to the task manager 400.
For example, the task manager 400 can present and/or display the image for an operator to visually inspect the object 300. Additionally and/or alternatively, the task manager 400 can transmit the image to a computer system of a third party for analysis, inspection, and/or storage.
Additionally and/or alternatively, the task manager 400 can perform one or more image feature recognition (and/or feature detection) methods. A feature can refer to a part of interest in the image. An exemplary feature can include an edge, a corner, a point, and/or a ridge. The feature can have a selected characteristic, such as shape, size, dimension, color, texture, and/or the like. Selection of the feature can depend on the type of task that the mobile platform 200 is configured to perform.
The feature recognition method can determine whether a feature of a selected type exists at a point in the image. An exemplary feature recognition method can include edge detection, and/or image binarization.
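For illustration, the two exemplary methods just named could be applied with a common computer vision library such as OpenCV, as in the sketch below; the thresholds are assumed placeholders.

```python
import cv2

def detect_features(image_path, canny_lo=100, canny_hi=200, bin_thresh=127):
    """Edge detection and image binarization on a captured image."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, canny_lo, canny_hi)              # edge map
    _, binary = cv2.threshold(gray, bin_thresh, 255,
                              cv2.THRESH_BINARY)             # binarized image
    return edges, binary
```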
The task manager 400 can generate the command 206 based on a result of the feature recognition. For example, the command 206 can direct the mobile platform 200 to capture a zoom-in image of a selected feature recognized in the image.
By using the disclosed methods, automation and efficiency of servicing the object 300 can advantageously be improved. For example, to inspect the surface of the object 300, the mobile platform 200 can repeatedly return to a selected position of the object 300 to conduct repeated inspection, even when the position and/or orientation of the object 300 is different at each inspection.
Additionally and/or alternatively, the mobile platform 200 can advantageously travel close to the object 300 while avoiding collision with the object 300. The mobile platform 200 can thus capture images of the object 300 at a close distance. The images can present features and/or topography of the surface of the object 300 at sufficiently high clarity and/or resolution, so the surface can be inspected by analyzing the images.
Additionally and/or alternatively, the mobile platform 200 can transmit the images to the task manager 400 for analysis. The task manager 400 can be located remotely from the object 300. The images can be analyzed in real time as the mobile platform 200 captures them. Additionally and/or alternatively, feedback based on the images can be generated in real time, and the task manager 400 can generate the command 206 according to the feedback. The mobile platform 200 can thus operate based on the command 206 to timely obtain new images with adjusted content and/or quality.
Additionally and/or alternatively, the real-time analysis and/or feedback can be achieved with and/or without human intervention or involvement. Even if a human analyzes the image, the human does not need to be present at a position of the object 300. Human labor cost and time can thus be reduced.
The system 100 can include a plurality of mobile platforms, such as mobile platforms 200A-200C, for servicing the object 300.
The mobile platforms 200A-200C can perform the task(s) in cooperation with each other. That is, each of the mobile platforms 200A-200C can perform at least part of a task, in order to collectively perform the task. Additionally and/or alternatively, each of the mobile platforms 200A-200C can perform one or more of a plurality of tasks in order to collectively perform the plurality of tasks.
For example, the mobile platform 200A can capture an image of a skyward surface of the object 300 when the object 300 is on the ground. The mobile platform 200C can travel between the object 300 and the ground, to capture an image of a groundward surface of the object 300. Thus, by working in cooperation with each other, the mobile platforms 200A and 200C can respectively capture the images that present different portions of the object 300 and complement each other.
The plurality of mobile platforms 200A-200C can perform the task for the object 300 simultaneously and/or sequentially. The plurality of mobile platforms 200A-200C can be uniform and/or different. Exemplary mobile platforms can include a UAV and/or a ground vehicle.
In some embodiments, the task manager 400 can be implemented via a terminal device 500.
The terminal device 500 can be located distally from the mobile platform 200 during operation of the mobile platform 200 to service the object 300. The terminal device 500 can provide instructions to the mobile platform 200 for implementing functions as described throughout the present disclosure.
The terminal device 500 can include one or more processors 510.
The terminal device 500 can include a memory 520. The memory 520 can include high-speed random access memory (RAM). Additionally and/or alternatively, the memory 520 can include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, and/or other non-volatile solid-state memory devices. Although one memory 520 is described for illustrative purposes only, the terminal device 500 can include any suitable number of memories 520.
The processor 510 can run (or execute) various software programs and/or sets of instructions stored in the memory 520 to perform various functions for the terminal device 500.
The terminal device 500 can include a communication module 530. The communication module 530 can operate to communicate with another computer system, such as the mobile platform control system 210.
Additionally and/or alternatively, the terminal device 500 can include a display system 540. In certain embodiments, the display system 540 can include a touch-sensitive display, which can also be called a “touch screen.” The display system 540 can display information stored on the terminal device 500 for presentation.
The terminal device 500 can communicate with the mobile platform 200 via a remote control 502. That is, the mobile platform 200 can transmit data to the remote control 502, and the remote control 502 can transfer the data to the terminal device 500. Conversely, the terminal device 500 can transfer instructions to the remote control 502, and the remote control 502 can transmit the instructions to the mobile platform 200. Stated somewhat differently, the remote control 502 can operate as a data exchange channel to route communication between the terminal device 500 and the mobile platform 200.
One or more service points 310 are determined, at 2100, based on a model of the object 300. The terminal device 500 can determine the service points 310 based on the model.
For example, the object 300 can include an aircraft, such as a passenger plane. Thus, the model can include three-dimensional information of the plane. Exemplary three-dimensional information can include a length, width, and/or height of the plane body, a slope angle of a front surface of the plane, tilt angles of a root and/or a tip of a wing, a tilt angle between the root and the tip of the wing, a width of a horizontal tail wing and/or a vertical tail wing, or a combination thereof.
The mobile platform 200 is directed, at 2200, to perform one or more tasks at the determined service points 310. The terminal device 500 can direct the mobile platform 200 to perform the tasks. For example, the terminal device 500 can direct the mobile platform 200 to capture an image of the object 300.
Additionally and/or alternatively, the terminal device 500 can direct the mobile platform 200 to maintain the distance D from the object 300 while performing the tasks.
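A minimal sketch of one way such distance maintenance could work is given below, assuming the distance is to be held within a predetermined range and corrected with a proportional adjustment. The range, gain, and sign convention are illustrative assumptions, not the disclosed controller.

```python
# Illustrative sketch: keep a measured distance to the object's surface
# within a predetermined range [d_min, d_max] via proportional correction.
# The range, gain, and sign convention are assumptions.

def distance_correction(measured, d_min=2.0, d_max=4.0, gain=0.5):
    """Return a velocity command along the surface normal (m/s).

    Positive moves the mobile platform 200 away from the object 300;
    negative moves it closer.
    """
    if d_min <= measured <= d_max:
        return 0.0                      # already within the range
    target = 0.5 * (d_min + d_max)      # aim for the middle of the range
    return gain * (target - measured)   # too close -> move away, and vice versa

# Example: 1.2 m is too close, so the command is positive (move away).
print(distance_correction(1.2))  # 0.9
```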
Turning to the next figure, determining the service points 310 at 2100 can include the following operations.
The model of the object 300 is obtained, at 2110. The terminal device 500 can obtain the model from any suitable source. For example, a manufacturer and/or builder of the object 300 can provide the model, and/or the object 300 can be measured by a human and/or a machine to generate the model. When the shape, size, and/or dimensions of the object 300 change, the model can change correspondingly.
The terminal device 500 can obtain the model from another computer system and/or a model library. Additionally and/or alternatively, the terminal device 500 can previously store the model in the memory 520 and retrieve the model from the memory 520.
A region 320 of interest is selected, at 2120, based on the model. The terminal device 500 can select the region 320 of the object 300 to be serviced.
The terminal device 500 can select the region 320 autonomously based on predetermined selection criteria, and/or select the region 320 based on a selection instruction input into the terminal device 500.
A distribution of the service points 310 over the region 320 is determined, at 2130. The distribution of the service points 310 can refer to a density of the service points 310 and/or a spatial relationship between the service points 310 and the object 300. The terminal device 500 can determine the distribution of the service points 310 based on any suitable criteria.
For example, when the mobile platform 200 is positioned at the distance D from a service point 310, the payload 290 can perform the task for a sub-region in the region 320. A size and/or shape of the sub-region can be based on the distance D and/or a property of the payload 290.
In order to perform the task for the region 320 in its entirety, the terminal device 500 can determine the density based on the sizes and/or shapes of the sub-region and the region 320. The terminal device 500 can determine and/or select the service points 310 according to the determined density. Additionally and/or alternatively, the terminal device 500 can determine a travel path for the mobile platform 200 to traverse the service points 310, as illustrated in the sketch below.
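For illustration, the density computation can be sketched as follows, assuming the payload 290 is a camera whose sub-region footprint follows from the distance D and its field of view, and assuming a rectangular region 320 covered by a serpentine travel path. The field-of-view angles, overlap fraction, and region dimensions are assumptions for this sketch.

```python
import math

# Hypothetical sketch: derive service-point spacing from the sub-region
# that the payload covers at distance D, then lay out service points over
# a rectangular region 320 and order them into a serpentine travel path.
# Field-of-view angles, overlap, and region size are illustrative assumptions.

def footprint(distance, fov_h_deg=60.0, fov_v_deg=45.0):
    """Approximate width/height of the imaged sub-region at `distance`."""
    w = 2.0 * distance * math.tan(math.radians(fov_h_deg) / 2.0)
    h = 2.0 * distance * math.tan(math.radians(fov_v_deg) / 2.0)
    return w, h

def service_points(region_w, region_h, distance, overlap=0.2):
    """Grid of (x, y) service points covering the region with overlap."""
    w, h = footprint(distance)
    step_x, step_y = w * (1.0 - overlap), h * (1.0 - overlap)
    cols = max(1, math.ceil(region_w / step_x))
    rows = max(1, math.ceil(region_h / step_y))
    points = []
    for r in range(rows):
        row = [(c * step_x, r * step_y) for c in range(cols)]
        points.extend(row if r % 2 == 0 else reversed(row))  # serpentine path
    return points

path = service_points(region_w=30.0, region_h=6.0, distance=3.0)
```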
Although the method 2100 can include 2110-2130, one or more of 2110-2130 can be omitted.
Additionally and/or alternatively, the terminal device 500 can determine a position of a selected service point 310 in a relative coordinate system based on the model.
Additionally and/or alternatively, based on the model, the terminal device 500 can determine a spatial relationship between selected control points (such as the control points 330a-330c) and the object 300. Based on the control points, the terminal device 500 can determine a transformation relationship between the relative coordinate system and an absolute coordinate system, and can thereby determine coordinates of the selected service point 310 in the absolute coordinate system.
To determine the absolute coordinates of the control points 330, a position tracking device can be positioned at each control point 330 to measure the absolute coordinates of that control point 330. The position tracking device can send the measurement result to the terminal device 500.
In certain embodiments, the position tracking device 280 onboard the mobile platform 200 can be used. That is, the mobile platform 200 can be located at a selected control point 330, and the position tracking device 280 can obtain the absolute coordinates of the selected control point 330.
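As one concrete (assumed) way to realize the transformation relationship, the sketch below estimates a planar rigid transform from two control points, consistent with determining the position of the object from a first control point and the orientation from the first and second control points, and then maps a service point's relative coordinates into absolute coordinates. The planar (2D) simplification and all numeric values are assumptions for this sketch.

```python
import math

# Illustrative sketch (an assumption, not the disclosed method): estimate a
# planar rigid transform between the model's relative coordinate system and
# an absolute coordinate system from two control points, then map a service
# point's relative coordinates into absolute coordinates.

def transform_from_control_points(rel_a, rel_b, abs_a, abs_b):
    """Return (rotation theta, translation tx, ty).

    rel_a/rel_b: predetermined relative coordinates of control points 330.
    abs_a/abs_b: their measured absolute coordinates (e.g., from a
    position tracking device located at each control point).
    """
    theta = (math.atan2(abs_b[1] - abs_a[1], abs_b[0] - abs_a[0])
             - math.atan2(rel_b[1] - rel_a[1], rel_b[0] - rel_a[0]))
    c, s = math.cos(theta), math.sin(theta)
    tx = abs_a[0] - (c * rel_a[0] - s * rel_a[1])
    ty = abs_a[1] - (s * rel_a[0] + c * rel_a[1])
    return theta, tx, ty

def to_absolute(point, theta, tx, ty):
    """Map a service point from relative to absolute coordinates."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * point[0] - s * point[1] + tx,
            s * point[0] + c * point[1] + ty)

theta, tx, ty = transform_from_control_points(
    rel_a=(0.0, 0.0), rel_b=(10.0, 0.0),      # control points in the model
    abs_a=(100.0, 50.0), abs_b=(107.1, 57.1)  # measured absolute positions
)
service_point_abs = to_absolute((5.0, 2.0), theta, tx, ty)
```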
Additionally and/or alternatively, when the mobile platform 200 includes the camera 290A, the terminal device 500 can direct the mobile platform 200 to capture an image of the object 300 at a selected service point 310, to capture the image in more than one perspective, and/or to capture a zoom-in image of a feature of the object 300 selected via feature recognition based on a captured image.
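The feature recognition itself is not specified here; as one hedged illustration, candidate features (e.g., possible paint chips or cracks) could be flagged via simple edge and contour analysis, with the resulting bounding boxes used to direct zoom-in captures. The use of OpenCV, the thresholds, and the heuristic are assumptions for this sketch.

```python
import cv2
import numpy as np

# Hypothetical illustration of feature recognition: flag high-contrast
# regions in a captured image and return their bounding boxes, which could
# then drive zoom-in captures. Thresholds and the defect heuristic are
# assumptions, not the disclosure. (Uses the OpenCV 4 findContours API.)

def candidate_features(image_bgr, min_area=100):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                     # edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]           # (x, y, w, h) boxes

# Example with a synthetic image containing a mock "defect":
img = np.full((240, 320, 3), 255, dtype=np.uint8)
cv2.circle(img, (160, 120), 20, (0, 0, 0), -1)
print(candidate_features(img))
```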
Additionally and/or alternatively, the mobile platform 200 can send images to the terminal device 500 for analysis. The terminal device 500 can analyze the images in real time and/or send the images to a third party for analysis and/or storage. Optionally, the display system 540 can display the images and/or analysis results.
Various embodiments further disclose a computer program product comprising instructions for operating the mobile platform 200. The program/software can be stored in a (non-transitory) computer-readable storage medium including, e.g., Read-Only Memory (ROM), Random Access Memory (RAM), internal memory, register, computer hard disk, removable disk, CD-ROM, optical disk, floppy disk, magnetic disk, or the like. The program/software can include coded instructions to instruct one or more processors on a computer device to execute the methods in accordance with various disclosed embodiments. For example, the program/software can be executed on a computer and/or mobile device in the form of an application program, application, and/or app.
The disclosed embodiments are susceptible to various modifications and alternative forms, and specific examples thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the disclosed embodiments are not to be limited to the particular forms or methods disclosed, but to the contrary, the disclosed embodiments are to cover all modifications, equivalents, and alternatives.
Claims
1. A method for operating a mobile platform, comprising:
- determining one or more service points based on a model of an object; and
- directing the mobile platform to perform one or more tasks at the determined service points.
2. The method of claim 1, further comprising:
- directing the mobile platform to maintain a distance from the object.
3. The method of claim 1, wherein determining the one or more service points includes:
- determining a selected service point of the determined service points in a relative coordinate system based on the model.
4. The method of claim 1, wherein directing the mobile platform to perform the one or more tasks includes:
- directing the mobile platform to travel based on an absolute coordinate system.
5. The method of claim 1, further comprising:
- determining a position of a selected service point of the determined service points.
6. The method of claim 5, further comprising:
- selecting a region of interest of the object based on the model.
7. The method of claim 6, further comprising:
- determining a distribution of the determined service points over the region of interest.
8. The method of claim 5, further comprising:
- determining coordinates of the selected service point in an absolute coordinate system based on: coordinates of the selected service point in a relative coordinate system according to the model; and a transformation relationship between the absolute coordinate system and the relative coordinate system.
9. The method of claim 8, further comprising:
- determining the transformation relationship according to a position and an orientation of the object.
10. The method of claim 9, further comprising:
- determining the position of the object based on one or more control points each having predetermined relative coordinates; or
- determining the orientation of the object based on the one or more control points each having predetermined relative coordinates.
11. The method of claim 10, further comprising:
- determining the position of the object based on a first control point of the control points; or
- determining the orientation of the object based on the first control point and a second control point of the control points.
12. The method of claim 10, further comprising:
- selecting the control points having a spacing therebetween that is greater than a predetermined spacing limit.
13. The method of claim 10, further comprising:
- obtaining coordinates of a selected control point of the control points in an absolute coordinate system by using a position tracking device onboard the mobile platform, wherein the mobile platform is located at the selected control point.
14. The method of claim 1, further comprising:
- directing the mobile platform to traverse the determined service points along a travel path.
15. The method of claim 1, further comprising:
- directing the mobile platform to capture an image of the object at a selected service point of the determined service points.
16. The method of claim 15, further comprising:
- directing the mobile platform to capture the image of the object at the selected service point in more than one perspective.
17. The method of claim 15, further comprising:
- selecting a feature of the object via feature recognition based on the captured image.
18. The method of claim 17, further comprising:
- directing the mobile platform to capture a zoom-in image for the selected feature.
19. The method of claim 18, further comprising:
- transmitting the captured image to a computer system of a third party for analysis.
20. A system for operating a mobile platform, comprising:
- a task manager that operates to: determine one or more service points based on a model of an object; and direct the mobile platform to perform one or more tasks at the determined service points.
Type: Application
Filed: Aug 19, 2020
Publication Date: Dec 3, 2020
Inventors: Yuwei WU (Shenzhen), Mingxi WANG (Shenzhen), Qi ZHOU (Shenzhen), Di WU (Shenzhen)
Application Number: 16/997,341