Optical Sensing a Distance from a Range Sensing Apparatus and Method

A method and a system for determining a horizontal velocity of a construction vehicle and a distance from a range sensing apparatus to a surface are provided. In an embodiment, a plurality of video images of the surface generated by a video camera is received, an angular velocity is calculated by video processing, a distance from each of a plurality of laser rangefinders to the surface is measured, and a linear horizontal velocity is calculated from the angular velocity and the measured distances.

Description
BACKGROUND OF THE INVENTION

The present invention relates generally to range sensing and more particularly to optical range sensing in road finishing applications. In construction using asphalt and concrete materials (e.g., road finishing, paving, etc.) various systems and methods for sensing the distance to a surface (e.g., a road) have been used.

Contacting and non-contacting systems have been used. Contacting systems suffer in that they are prone to damage and breakage. Prior non-contacting systems are not accurate enough. These systems generally employ an ultrasonic sensing unit to measure the distance from the construction vehicle or sensing unit to the road surface. In some sensing units more than one heterogeneous sensor is used to measure distances to the surface from the sensing unit. These measured distances are averaged to determine an approximate distance between the sensing unit and the surface.

In some cases, these sensing units or construction vehicles include a temperature sensor. An example of a commonly used temperature sensor is a U-shaped metal attachment to the sensing apparatus that extends toward the road surface. The attachment is used to measure a known distance and thus determine the speed of sound at the current temperature.

The prior range sensing units often provide inaccurate measurements and/or inconsistent sensing because the construction vehicle and/or the sensors and sensing unit may be too close to or too far away from the road surface. That is, the sensors may not be in their optimal performance range. Also, ultrasonic distance measurement is prone to giving a false reading if there is an obstacle in the ultrasonic beam. It may be unclear which object's reflected echo is being measured (the target's or an obstacle's). Accordingly, improved systems and methods for range sensing are needed.

BRIEF SUMMARY OF THE INVENTION

The present invention generally provides methods and apparatus for determining a distance from a sensing unit of a construction vehicle to a surface. In one embodiment of the present disclosure, a method for determining the distance includes measuring a distance from each of a plurality of laser rangefinders to the surface, weighting the measured distance from each of the plurality of laser rangefinders to the surface using a weighting factor, and determining a weighted average distance from the plurality of laser rangefinders to the surface based on the weighted measured distances.

In another embodiment of the present disclosure, a method for determining the distance includes measuring a distance from each of a plurality of laser rangefinders to the surface, weighting the measured distance from each of the plurality of laser rangefinders to the surface using a weighting factor, and providing the weighted measured distances to a user without averaging the distances.

In an embodiment, the method for determining the distance from a sensing unit of a construction vehicle to a surface also includes transmitting measured distance information to a processor, and determining a two-dimensional velocity and offset of a directional trajectory of the construction vehicle using at least one video camera. In an embodiment, a sensing unit for determining the distance to a surface includes a plurality of laser rangefinders and at least one video camera. The sensing unit includes a housing in which the plurality of laser rangefinders and at least one video camera are installed. The apparatus also includes a memory storing computer program instructions to determine the distance from the sensing unit to the surface. The apparatus also has a processor communicatively coupled with the memory and configured to execute the computer program instructions to calculate a distance to the surface based at least in part on distances measured by the plurality of laser rangefinders.

These and other advantages of the invention will be apparent to those of ordinary skill in the art by reference to the following detailed description and the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 depicts a side schematic view of a sensing unit according to embodiments of the present invention;

FIG. 2 depicts a distance measuring system according to an embodiment of the present invention;

FIG. 3 depicts a method for optical sensing according to an embodiment of the present invention.

DETAILED DESCRIPTION

The present invention generally provides for a system and method for improved range sensing in a construction environment. More specifically, the present invention provides more accurate distance determination. In one embodiment, distance determination is achieved using a plurality of laser rangefinders and a video camera in a single sensing unit physically attached to the construction equipment. In another embodiment, distance determination is achieved using a plurality of laser rangefinders and a video camera in multiple sensing units that are part of a single measuring system.

In an embodiment, a plurality of laser rangefinders in a sensing unit is used to determine a distance from the sensing unit to a surface. The laser rangefinders are configured in a single housing or structural module, so as to enable more accurate determination of the distance to be measured. The sensing unit sends the current distance to a control unit in essentially real time. If the sensing unit is equipped with a video camera, the sensing unit also sends the current velocity to the control unit and transmits real-time visual images to an operator display via the control unit. The operator can steer the construction vehicle based on the real-time video to keep tracking a curb or other line target.

In one embodiment, a plurality of laser rangefinders (e.g., one sensing unit installed on the left side of the paver and a second one installed on the right side of the construction vehicle) is used in order to more accurately determine the distance between the sensing unit and the surface. Velocity measurements and the distances between the surface and the plurality of rangefinders assist in controlling the construction vehicle. Displaying real-time visual images captured by one or more video cameras on a display assists the operator in controlling and directing the construction vehicle; the operator can see targets on both sides simultaneously. In such an embodiment, the plurality of laser rangefinders is used to accurately determine this distance by means of optical emission and reception, whereby each laser rangefinder has an influence on the determined distance. That is, a mathematical calculation may be performed based on data obtained by each of the plurality of laser rangefinders to weight the distance between each of the plurality of laser rangefinders in the sensing unit and the surface. This results in determining the distance between the sensing unit and the surface with higher accuracy than a calculation based on data obtained by a single laser rangefinder.

In an embodiment, in addition to the plurality of laser rangefinders, at least one video camera is included in (e.g., integrated into and/or coupled with) the sensing unit to generate a plurality of images of the surface for the purpose of determining a two-dimensional velocity and offset in determining a distance to the road surface. Specifically, the video camera is used for two purposes. The first is to capture real-time visual images to be propagated to the operator's display, so the operator can track a target and steer the machine accordingly. The target could be a curb, an edge of a previous layer of asphalt, or another feature.

The second purpose is to determine the velocity of horizontal movement. Visual images, or frames, captured at a constant rate are processed by a processor. The processor compares two consecutive visual images or frames and determines two-dimensional offsets in pixels. Given the frame rate, a relative (angular) velocity in pixels per second can be calculated. Using the distance from a laser rangefinder, the angular velocity can be converted to a linear velocity in the horizontal plane. For illustration purposes, the present disclosure provides an example of the sensing unit containing a single video camera. However, it is to be understood that the sensing unit may be equipped with a plurality of video cameras.
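The pixel-offset-to-velocity conversion described above can be sketched as follows. This is a hypothetical illustration: the function name, parameters, and the pinhole-camera scale factor (meters per pixel = range divided by focal length in pixels) are assumptions, not details taken from the disclosure.

```python
# Hypothetical sketch of converting a frame-to-frame pixel offset into a
# linear horizontal velocity; the names and the pinhole-camera model are
# illustrative assumptions, not specifics from the disclosure.

def linear_velocity(offset_px, frame_rate_hz, distance_m, focal_length_px):
    """Return (vx, vy) in meters per second.

    offset_px       -- (dx, dy) pixel offset between two consecutive frames
    frame_rate_hz   -- constant frame rate of the video camera
    distance_m      -- distance to the surface from a laser rangefinder
    focal_length_px -- camera focal length expressed in pixels
    """
    dx, dy = offset_px
    # Angular velocity in pixels per second along each image axis.
    wx, wy = dx * frame_rate_hz, dy * frame_rate_hz
    # Under a pinhole model, one pixel spans distance_m / focal_length_px
    # meters on the surface; this converts pixel rate to linear velocity.
    meters_per_pixel = distance_m / focal_length_px
    return wx * meters_per_pixel, wy * meters_per_pixel
```

For example, at 30 frames per second, a 10-pixel offset at a 2 m range with an assumed 1000-pixel focal length corresponds to 0.6 m/s along that axis.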

FIG. 1 depicts an exemplary sensing unit 100 according to an embodiment of the present invention. Specifically, FIG. 1 shows a side schematic view of the sensing unit 100, which includes a housing 102 enclosing a plurality of laser rangefinders 104a and 104b to measure a distance between the sensing unit 100 and a surface 110. In one embodiment, each of the plurality of laser rangefinders 104a and 104b is a time of flight laser rangefinder. In another embodiment, each of the plurality of laser rangefinders 104a and 104b is a phase difference laser rangefinder. In yet another embodiment, the plurality of laser rangefinders 104a and 104b is a combination of time of flight laser rangefinders and phase difference laser rangefinders.

Housing 102 also includes a video camera 106. It is to be understood that positional configuration of the plurality of laser rangefinders 104a and 104b within the housing 102 or the sensing unit 100 may vary. It is also to be understood that the number of laser rangefinders 104a and 104b within the housing 102 of the sensing unit 100 may vary.

The video camera 106 is mounted on or at least partially enclosed within the housing 102 of the sensing unit 100 and is directed downward to determine a two-dimensional velocity of the construction vehicle and an offset of a directional trajectory of the construction vehicle. Each image generated by the video camera 106 may contain points of laser reflection. The distance to the surface can also be determined by a triangulation method based on the known relative position of, and distance between, the laser optical axis and the video camera, together with the pixel offset of the laser reflection. It is to be understood that the positioning of the video camera 106 relative to the sensing unit 100 may vary so as to accommodate various types of construction vehicles. It is to be understood that although FIG. 1 depicts a single video camera 106, sensing unit 100 can include more than one video camera 106.
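The triangulation method mentioned above can be sketched as follows. The geometry is an assumption for illustration: a laser axis parallel to the camera's optical axis at a known baseline, so that the laser spot's pixel offset from the principal point is inversely proportional to range. The function and parameter names are hypothetical.

```python
# Hypothetical sketch of laser-spot triangulation, assuming the laser
# axis is parallel to the camera's optical axis at a known baseline.
# Names and geometry are illustrative, not specifics from the disclosure.

def triangulated_distance(baseline_m, focal_length_px, spot_offset_px):
    """Return the distance (meters) to the surface.

    baseline_m      -- known lateral offset between laser and camera axes
    focal_length_px -- camera focal length expressed in pixels
    spot_offset_px  -- pixel offset of the laser reflection from the
                       principal point in the image
    """
    if spot_offset_px <= 0:
        raise ValueError("laser spot not detected or effectively at infinity")
    # Similar triangles: baseline / range = spot_offset / focal_length.
    return baseline_m * focal_length_px / spot_offset_px
```

With an assumed 0.1 m baseline and 1000-pixel focal length, a 50-pixel spot offset corresponds to a 2 m range.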

FIG. 2 depicts a distance measuring system 200 according to an embodiment. The measuring system 200 contains a processor 202, the plurality of laser rangefinders 104a and 104b, the video camera 106, an input-output module 210, a memory 204, a storage device 206, and a network interface 208. In one embodiment, processor 202 is physically attached to or installed within a housing of the sensing unit 100. In another embodiment, processor 202 is located remotely from the sensing unit 100 and/or from the construction vehicle while being configured to communicate with the sensing unit 100 and with the construction vehicle.

Memory 204 (e.g., random-access memory (RAM), read-only memory (ROM) with firmware, and the like) contains computer program instructions to be executed by processor 202 to cause the processor 202 to perform a method of measuring the distance between the sensing unit 100 and the surface 110 as described herein. Storage device 206 stores data gathered by the sensing unit 100 and distance information calculated by processor 202.

The processor 202 controls the overall operation of the sensing unit 100 by executing computer program instructions which define such operation. For example, processor 202 executes computer program instructions to measure the distance between each of the plurality of laser rangefinders 104a and 104b and the surface 110, to weight the measured distances, and to determine a calculated weighted average distance to the surface. Processor 202 also controls operation of the video camera 106 to assist an operator of the construction vehicle in determining a position of the construction vehicle or a part of the construction vehicle (e.g., paving equipment) relative to the surface 110.

The computer program instructions are stored in the storage device 206 (e.g., computer-readable medium storage device, magnetic disk, database, etc.) and loaded into memory 204 (from a ROM device to a RAM device or from a LAN adapter to a RAM device) when execution of the computer program instructions by the processor 202 is desired. It is to be understood that the computer program instructions may be stored in a compressed, uncompiled and/or encrypted format. The computer program instructions furthermore may include program elements that may be generally useful, such as an operating system, a database management system and device drivers for allowing the processor 202 to interface with other components and devices of the measuring system 200.

Execution of sequences of the computer program instructions causes the measuring system 200 to perform one or more of the method steps described herein, such as those described below with respect to method 300. In alternative embodiments, hard-wired circuitry or integrated circuits may be used in place of, or in combination with, software instructions for implementation of the processes described in the present disclosure. Thus, embodiments of the present invention are not limited to any specific combination of hardware, firmware, and/or software. However, it would be understood by one of ordinary skill in the art that the invention as described herein could be implemented in many different ways using a wide range of programming techniques as well as general purpose hardware sub-systems or dedicated controllers.

In an embodiment, measuring system 200 is a stand-alone sensing unit 100 physically attached to the construction vehicle such that it is capable of measuring the distance between each of the plurality of laser rangefinders 104a and 104b and the surface 110, calculating weighted distances between each of the plurality of laser rangefinders 104a and 104b and the surface 110, calculating the weighted average distance to the surface 110, and transmitting calculated weighted average distance information to a user (e.g., operator of the construction vehicle or automated vehicle control system). Such information may be recorded by the processor 202, stored in storage unit 206, and displayed to users via input-output module 210 in real time. That is, the measuring system 200 may collect and/or send the calculated weighted average distance information to the construction vehicle operator for use during construction operations. In an embodiment, sensing unit 100 of the measuring system 200 may be removable, angleable, and/or otherwise positionable to provide accurate distance information.

In an embodiment, the measuring system 200 includes network interface 208 for communicating with other devices and/or systems via a network (e.g., a Controller Area Network (CAN)). For example, network interface 208 supports data exchange and data transmission between components (two or more sensing units 100) of the measuring system 200. Network interface 208 also supports data exchange and data transmission between multiple measuring systems installed on separate construction vehicles.

One skilled in the art will recognize that an implementation of an actual measuring system could contain other components as well, and that the system of FIG. 2 is a high-level representation of some of the components of such a measuring system for illustrative purposes.

FIG. 3 illustrates the method steps of a method 300 of distance determination using the measuring system 200. The method 300 begins at step 302. In an embodiment, the method 300 may begin upon the sensing unit 100 being set in a certain position respective to the construction vehicle or upon activation of the measuring system by an automated measurement system controlling the construction vehicle or by a human operator of the construction vehicle.

At step 304, a distance between each of the plurality of laser rangefinders 104a and 104b of the sensing unit 100 and the surface 110 is measured. Specifically, as illustrated in FIG. 1, laser rangefinder 104a measures a distance D1 between laser rangefinder 104a and surface 110. Laser rangefinder 104b measures a distance D2 between laser rangefinder 104b and surface 110. Because the surface 110 is practically never strictly horizontal, distances D1 and D2 may differ from each other.

In step 306, each of distances D1 and D2 is weighted. It is to be understood that laser rangefinders 104a and 104b may be more or less accurate under certain conditions. In the context of the present disclosure, external factors (e.g., temperature, humidity, fog, precipitation, lighting, vibration of the construction vehicle, vibration of the sensing unit 100, etc.) may affect the accuracy of measurement devices. Accordingly, it is preferable to weight the distance between each of the plurality of laser rangefinders 104a and 104b and the surface 110 by applying a pre-determined weighting factor to each distance D1 and D2, depending on the nature and the number of factors affecting the accuracy of each of the plurality of laser rangefinders 104a and 104b.

For example, the positioning of each of the plurality of laser rangefinders 104a and 104b (e.g., laser rangefinder 104a may be located closer to an outer edge of the sensing unit 100 while laser rangefinder 104b may be located farther away from the outer edge of the sensing unit 100) may affect the accuracy of distance measurement for each laser rangefinder due to differences in natural lighting/shading (angle of light reflection off the surface 110), differences in air temperature around each of the plurality of laser rangefinders 104a and 104b, differences in vibration rate for different parts of the sensing unit 100, etc. Therefore, the distance measurements conducted by each of the plurality of laser rangefinders 104a and 104b are weighted with the weighting factor that corresponds to each of the plurality of laser rangefinders 104a and 104b. It is to be understood that the weighting factor may be predetermined for each of the plurality of laser rangefinders 104a and 104b based on pre-manufacturing calculations and modeling, post-manufacturing field testing, and calculations performed based on the number of conditions identified during prior distance measurements. It is also to be understood that weighting factors can be dynamically and continuously re-assessed in real time, under the number of conditions potentially affecting the accuracy of each of the plurality of laser rangefinders 104a and 104b, during the operation of the construction vehicle.
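The disclosure leaves the re-assessment rule open. One hypothetical approach, sketched below, derives each rangefinder's weight from the variance of its recent readings, so a sensor disturbed by vibration or lighting is down-weighted; the function name and the inverse-variance rule are assumptions for illustration only.

```python
# Illustrative only: the disclosure does not specify how weighting
# factors are re-assessed. This sketch down-weights a rangefinder whose
# recent readings show high variance (e.g., due to vibration or glare).

def dynamic_weight(recent_readings, eps=1e-6):
    """Return a weighting factor from a window of recent distance readings."""
    n = len(recent_readings)
    mean = sum(recent_readings) / n
    variance = sum((r - mean) ** 2 for r in recent_readings) / n
    # Inverse-variance rule: a noisier sensor receives a smaller weight;
    # eps avoids division by zero for a perfectly steady sensor.
    return 1.0 / (variance + eps)
```

Under this rule, a rangefinder returning a steady 2.0 m receives a larger weight than one oscillating between 1.0 m and 3.0 m.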

At step 308, a weighted average distance between the sensing unit 100 and the surface 110 is determined. In an embodiment, the weighted average distance between the sensing unit 100 and the surface 110 is determined using the formula:

DWA = (w1D1 + w2D2 + … + wnDn) / n;

where w1, w2, and wn are weighting factors for laser rangefinders 104a, 104b, and 104n, respectively; D1, D2, and Dn are distances between laser rangefinders 104a, 104b, and 104n (not shown) and the surface 110, respectively; n is a number of laser rangefinders used to measure the distance to the surface 110. It is to be understood that the weighted average distance to the surface 110 can be calculated in various other ways.
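The formula above can be sketched directly in code; the function and parameter names are illustrative, not from the disclosure.

```python
def weighted_average_distance(distances, weights):
    """Compute DWA = (w1*D1 + w2*D2 + ... + wn*Dn) / n per the formula
    above, where n is the number of laser rangefinders."""
    if not distances or len(distances) != len(weights):
        raise ValueError("need exactly one weight per measured distance")
    n = len(distances)
    # Sum of weighted distances, divided by the number of rangefinders.
    return sum(w * d for w, d in zip(weights, distances)) / n
```

For example, with two rangefinders reading 2.0 m and 2.2 m and unit weights, the weighted average distance is 2.1 m.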

Upon determining the weighted average distance to surface 110 at step 308, method 300 may return control to step 304. That is, as the construction vehicle continues to travel upon its path, a new distance is measured by each of the plurality of laser rangefinders 104a and 104b and the steps of method 300 are repeated. It is to be noted that method 300 is repeated continually in real-time to provide continuous updates of the distance to the surface for use in construction operations. In step 310, the method 300 ends.

In another embodiment of the present disclosure, method 300 of FIG. 3 may be implemented without a step of averaging the distances between the sensing unit 100 and surface 110. For example, for the purposes of ensuring that surface 110 is paved in accordance with provided specifications, at step 308, each of the weighted distances D1 and D2 is transmitted to the construction vehicle controls directing the construction vehicle to apply a paving material to surface 110 in accordance with provided specifications.

The foregoing description discloses only particular embodiments of the invention; modifications of the above disclosed methods and apparatus which fall within the scope of the invention will be readily apparent to those of ordinary skill in the art. For instance, it will be understood that, though discussed primarily as a stand-alone unit with one set of inside sensors and one set of outside sensors, any number and/or type of sensors in any suitable arrangement may be used with a corresponding weighting and/or calculating algorithm. Similarly, other components may perform the functions of method 300 even when not explicitly discussed.

The foregoing Detailed Description is to be understood as being in every respect illustrative and exemplary, but not restrictive, and the scope of the invention disclosed herein is not to be determined from the Detailed Description, but rather from the claims as interpreted according to the full breadth permitted by the patent laws. It is to be understood that the embodiments shown and described herein are only illustrative of the principles of the present invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention. Those skilled in the art could implement various other feature combinations without departing from the scope and spirit of the invention.

Claims

1. A sensing unit to determine a horizontal velocity of a construction vehicle and a distance from the construction vehicle to a surface, the sensing unit comprising:

a plurality of laser rangefinders; and
at least one video camera.

2. The sensing unit of claim 1 further comprising:

a memory communicatively coupled with a processor and storing computer program instructions to determine the distance from the construction vehicle to the surface.

3. The sensing unit of claim 1, the sensing unit further comprising:

a housing in which the plurality of laser rangefinders and the at least one video camera are mounted.

4. The sensing unit of claim 1, wherein in operation each of the plurality of laser rangefinders and the at least one video camera are directed to the surface.

5. The sensing unit of claim 4, wherein in operation each of the plurality of laser rangefinders is directed in a direction perpendicular to a direction of travel of the construction vehicle.

6. The sensing unit of claim 4, wherein in operation each of the plurality of laser rangefinders is directed in a direction parallel to a direction of travel of the construction vehicle.

7. The sensing unit of claim 1, wherein in operation the at least one video camera is directed downward to acquire a plurality of images of the surface to determine a two-dimensional velocity of the construction vehicle.

8. The sensing unit of claim 1, wherein the at least one video camera is directed downward to acquire a plurality of images of the surface to determine an offset of a directional trajectory of the construction vehicle.

9. The sensing unit of claim 1, wherein each of the plurality of laser rangefinders is a time of flight laser rangefinder.

10. The sensing unit of claim 1, wherein each of the plurality of laser rangefinders is a phase difference laser rangefinder.

11. The sensing unit of claim 1, wherein at least one of the plurality of laser rangefinders is a time of flight laser rangefinder and at least one other of the plurality of laser rangefinders is a phase difference laser rangefinder.

12. The sensing unit of claim 1, wherein the processor is coupled to the sensing unit.

13. The sensing unit of claim 1, wherein the processor is coupled to the construction vehicle and is remote from the sensing unit.

14. A method for determining a distance from a sensing unit of a construction vehicle to a surface comprising:

receiving a plurality of video images of the surface, the plurality of video images generated by a video camera;
measuring a distance from each of a plurality of laser rangefinders to the surface;
weighting the measured distance from each of the plurality of laser rangefinders to the surface using at least one weighting factor corresponding to each of the plurality of laser rangefinders; and
determining a weighted average distance from the plurality of laser rangefinders to the surface based on weighted measured distance from each of the plurality of laser rangefinders to the surface.

15. The method of claim 14, further comprising:

transmitting measured distance information to a processor; and
storing the measured distance information in a memory.

16. The method of claim 14, further comprising:

determining a two-dimensional velocity of the construction vehicle using the video camera.

17. The method of claim 14, further comprising:

determining an offset of a directional trajectory of the construction vehicle using the video camera.

18. A storage device storing computer program instructions for controlling a sensing unit of a construction vehicle to measure a distance from the sensing unit to a surface, which, when executed on a processor, cause the processor to perform operations comprising:

receiving a plurality of video images of the surface, the plurality of video images generated by a video camera;
measuring a distance from each of a plurality of laser rangefinders to the surface;
weighting measured distance from each of the plurality of laser rangefinders to the surface using at least one weighting factor corresponding to each of the plurality of laser rangefinders; and
determining a weighted average distance from the plurality of laser rangefinders to the surface based on weighted measured distance from each of the plurality of laser rangefinders to the surface.

19. The storage device of claim 18, the operations further comprising:

transmitting measured distance information to a processor; and
storing the measured distance information in a memory.

20. The storage device of claim 18, the operations further comprising:

determining a two-dimensional velocity of the construction vehicle using the video camera.

21. The storage device of claim 18, the operations further comprising:

determining an offset of a directional trajectory of the construction vehicle using the video camera.

22. An apparatus for determining a distance from a sensing unit of a construction vehicle to a surface comprising:

a processor; and
a memory communicatively coupled with the processor and storing computer program instructions which when executed by the processor, cause the processor to perform operations comprising:
receiving a plurality of video images of the surface, the plurality of video images generated by a video camera;
measuring a distance from each of a plurality of laser rangefinders to the surface; weighting measured distance from each of the plurality of laser rangefinders to the surface using at least one weighting factor corresponding to each of the plurality of laser rangefinders; and
determining a weighted average distance from the plurality of laser rangefinders to the surface based on weighted measured distance from each of the plurality of laser rangefinders to the surface.

23. The apparatus of claim 22, the operations further comprising:

transmitting measured distance information to a processor; and
storing the measured distance information in a memory.

24. The apparatus of claim 22, the operations further comprising:

determining a two-dimensional velocity of the construction vehicle using the video camera.

25. The apparatus of claim 22, the operations further comprising:

determining an offset of a directional trajectory of the construction vehicle using the video camera.
Patent History
Publication number: 20150330054
Type: Application
Filed: May 16, 2014
Publication Date: Nov 19, 2015
Applicant: Topcon Positioning Systems, Inc. (Livermore, CA)
Inventor: Nikolay V. Khatuntsev (Pleasanton, CA)
Application Number: 14/279,858
Classifications
International Classification: E02F 3/84 (20060101);