DYNAMIC RANGE OF COLOR CAMERA IMAGES SUPERIMPOSED ON SCANNED THREE-DIMENSIONAL GRAY-SCALE IMAGES

A laser scanner scans an object by measuring first and second angles with angle measuring devices, sending light onto the object and capturing the reflected light to determine distances and gray-scale values for points on the object, capturing a sequence of color images with a color camera at different exposure times, determining 3D coordinates and gray-scale values for points on the object, determining from the sequence of color images an enhanced color image having a higher dynamic range than is available from any single color image, and superimposing the enhanced color image on the 3D gray-scale image to obtain an enhanced 3D color image.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims the benefit of German Patent Application No. DE102013110580.7, filed on Sep. 24, 2013, and of U.S. Provisional Patent Application No. 61/926,461, filed on Jan. 13, 2014, the contents of both of which are hereby incorporated by reference in their entirety.

BACKGROUND OF THE INVENTION

U.S. Pat. No. 8,705,016 ('016) describes a laser scanner that colors collected scan data by assigning colors obtained from colored images, pixel by pixel, to the scan image. The hardware of the color camera determines the quality, brightness levels, and contrast of the colored three-dimensional (3D) images.

Further described herein is a laser scanner that superimposes colors obtained from a color camera onto 3D gray-scale images obtained from a time-of-flight (TOF) laser scanner.

A TOF scanner is any type of scanner in which the distance to a target point is determined based on the speed of light in air between the scanner and the target point. Laser scanners are typically used for scanning closed or open spaces such as interior areas of buildings, industrial installations, and tunnels. They are used for many purposes, including industrial applications and accident reconstruction applications. A laser scanner can be used to optically scan and measure objects in a volume around the scanner through the acquisition of data points representing objects within the volume. Such data points are obtained by transmitting a beam of light onto the objects and collecting the reflected or scattered light to determine the distance, two angles (i.e., an azimuth angle and a zenith angle), and optionally a gray-scale value. This raw scan data is collected, stored, and sent to a processor or processors to generate a 3D image representing the scanned area or object. To generate the image, at least three values are collected for each data point. These three values may include the distance and two angles, or may be transformed values, such as x, y, z coordinates. In an embodiment, a fourth value collected by the 3D laser scanner is a gray-scale value for each point measured. Such a gray-scale value is related to the irradiance of scattered light returning to the scanner.
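By way of illustration, the transformation mentioned above, from the raw triple of distance, azimuth angle, and zenith angle to Cartesian x, y, z coordinates, may be sketched as follows. The function name and the exact angle conventions are illustrative assumptions, not details given in this description:

```python
import math

def to_cartesian(distance, azimuth, zenith):
    """Convert a raw scan point (distance, azimuth angle, zenith angle,
    angles in radians) to Cartesian x, y, z coordinates.

    The zenith angle is assumed measured from the vertical axis, so z
    is the projection onto that axis and the horizontal radius is
    distance * sin(zenith).
    """
    r_horizontal = distance * math.sin(zenith)
    x = r_horizontal * math.cos(azimuth)
    y = r_horizontal * math.sin(azimuth)
    z = distance * math.cos(zenith)
    return x, y, z
```

A point measured at a distance of 10 m, with zero azimuth and a zenith angle of 90°, thus lies on the horizontal x axis at (10, 0, 0).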

Angle measuring devices such as angular encoders are used to measure the two angles of rotation about the two axes of rotation. One type of angular encoder includes a disk and one or more read heads. In an embodiment, the disk is affixed to a rotating shaft, and the one or more read heads are affixed to a portion that is stationary with respect to the rotating shaft.

Many contemporary laser scanners also include a camera mounted on the laser scanner for gathering camera digital images of the environment and for presenting the camera digital images to an operator of the laser scanner. By viewing the camera images, the operator of the scanner can determine the field of view (FOV) of the measured volume and adjust settings on the laser scanner to measure over a larger or smaller region of space if the FOV needs adjusting. In addition, the camera digital images may be transmitted to a processor to add color to the scanner image. To generate a color scanner image, at least six values (three positional coordinates such as x, y, z; and red value, green value, blue value or “RGB”) are collected for each data point.

The data collected by a laser scanner is often referred to as point cloud data because the data, which is typically relatively dense, may resemble a cloud. The term point cloud is taken herein to mean a collection of 3D values associated with scanned objects. The point cloud data may be used to produce 3D representations of the scene being scanned.

A single color camera image provides red, green, and blue pixel values, each displayed on the final image with a varying degree of color, from zero percent to 100 percent red, green, or blue. However, the level of color displayed for a given pixel is generally limited by the need to avoid saturation of pixels anywhere in the camera photosensitive array. In other words, the maximum light received by any pixel in the array determines the maximum exposure time for the entire array. As a result, a colorized scanned image may have bright colors in one portion of the color image but dim colors in other parts of the image. Such an image is said to have relatively low dynamic range because those parts of the color image receiving relatively little light may not show details that would be desirable to see in a final color 3D image.
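The limitation described above, that the brightest pixel sets the exposure limit for the entire array, can be sketched as a small calculation. The function name and the use of photoelectron arrival rates as the brightness measure are illustrative assumptions:

```python
def max_safe_exposure(arrival_rates_e_per_s, full_well_e):
    """Return the longest exposure time (seconds) for which no pixel of
    the array saturates, given per-pixel photoelectron arrival rates.

    The brightest pixel alone sets the limit for the whole array,
    which is why a single exposure yields only a limited dynamic
    range: dim pixels receive far fewer electrons than they could.
    """
    return full_well_e / max(arrival_rates_e_per_s)
```

For example, if one pixel fills at 20,000 electrons per second while others fill at 1,000, a 20,000-electron well limits the exposure to one second, leaving the dim pixels at only 5 percent of full scale.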

Accordingly, while existing 3D scanners are suitable for their intended purposes, what is needed is a 3D scanner having certain features of embodiments of the present invention.

BRIEF DESCRIPTION OF THE INVENTION

According to one aspect of the invention, a method is provided for optically scanning and measuring an object with a laser scanner, the method including providing the laser scanner having integral components that include a light emitter, a light receiver, a first angle measuring device, a second angle measuring device, a control and evaluation unit, and a color camera; providing a color display; measuring a first angle with the first angle measuring device; measuring a second angle with the second angle measuring device; emitting with the light emitter an emission light beam; reflecting the emission light beam from the object to produce a reception light beam; receiving with the light receiver the reception light beam and obtaining a first electrical signal in response; determining with the control and evaluation unit distances to a plurality of measuring points on the object based at least in part on the first electrical signals for each of the plurality of measuring points and on a speed of light in air; determining with the control and evaluation unit gray-scale values for the plurality of measuring points; capturing with the color camera a sequence of color images while the color camera is fixed in space, each image of the sequence captured with a different exposure time and having an associated first dynamic range, the color images providing second electrical signals in response; determining with the control and evaluation unit a 3D gray-scale image based at least in part on the first angle, the second angle, and the distances to and gray-scale values for the plurality of measuring points on the object; determining with the control and evaluation unit an enhanced color image having an enhanced dynamic range, the enhanced dynamic range being higher than any of the associated first dynamic ranges, the enhanced color image based at least in part on the second electrical signals; determining with the control and evaluation unit an enhanced 3D color image by superimposing the enhanced color image on the 3D gray-scale image; and displaying the enhanced 3D color image on the color display.

BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:

FIG. 1 is a schematic illustration of the optical, mechanical, and electrical components of the laser scanner;

FIG. 2 is a schematic illustration of the laser scanner in operation;

FIG. 3 is a perspective drawing of the laser scanner; and

FIG. 4 is a flowchart of a method according to an embodiment.

The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.

DETAILED DESCRIPTION OF THE INVENTION

A laser scanner 10 is described in reference to FIGS. 1-3. The laser scanner 10 is provided as a device for optically scanning and measuring an environment of the laser scanner 10. The laser scanner 10 has a measuring head 12 and a base 14. The measuring head 12 is mounted on the base 14 as a unit that can be rotated about a vertical axis. The measuring head 12 has a mirror 16, which can be rotated about a horizontal axis. The intersection point of the two axes of rotation is designated center C10 of the laser scanner 10.

The measuring head 12 is further provided with a light emitter 17 for emitting an emission light beam 18. The emission light beam 18 is preferably a laser beam having a wavelength in the range of approximately 300 to 1600 nm, for example 1550 nm, 905 nm, 790 nm, or less than 400 nm; in principle, however, other electromagnetic waves having, for example, a greater wavelength can also be used. The emission light beam 18 is amplitude-modulated with a modulation signal. The emission light beam 18 is emitted by the light emitter 17 onto the rotary mirror 16, where it is deflected and emitted to the environment. A reception light beam 20, which is reflected in the environment by an object O or otherwise scattered, is captured again by the rotary mirror 16, deflected, and directed onto a light receiver 21. The direction of the emission light beam 18 and of the reception light beam 20 results from the angular positions of the rotary mirror 16 and the measuring head 12, which depend on the positions of their corresponding rotary drives, which, in turn, are each registered by an encoder.

A control and evaluation unit 22 has a data connection to the light emitter 17 and to the light receiver 21 in the measuring head 12, although parts of it can also be arranged outside the measuring head 12, for example as a computer connected to the base 14. The control and evaluation unit 22 is configured to determine, for a multitude of measuring points X, the distance d between the laser scanner 10 and the illuminated point on the object O from the propagation time of the emission light beam 18 and the reception light beam 20. For this purpose, the phase shift between the two light beams 18, 20 can be determined and evaluated, for example.
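The phase-shift evaluation mentioned above can be illustrated with a short calculation. Because the modulated beam travels the distance d twice (out and back), a modulation frequency f produces a phase delay of 2·pi·f·(2d/c), giving d = c·phase/(4·pi·f). This is a minimal sketch of the standard relation, with illustrative names and an approximate constant; the actual evaluation in the control and evaluation unit 22 is not specified here:

```python
import math

# Approximate speed of light in air, m/s (an assumption for illustration).
C_AIR = 299_702_547.0

def distance_from_phase(phase_shift_rad, modulation_freq_hz):
    """Estimate the one-way distance d from the measured phase shift of
    an amplitude-modulated beam.

    The round trip 2d delays the modulation by 2*pi*f*(2d/c) radians,
    so d = c * phase / (4*pi*f).  The result is unambiguous only
    within half a modulation wavelength; practical scanners resolve
    this ambiguity, e.g., by combining several modulation frequencies.
    """
    return C_AIR * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)
```

At a 10 MHz modulation frequency, a phase shift of pi radians corresponds to roughly 7.5 m.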

A display unit 24 is connected to the control and evaluation unit 22. The display unit 24 in the present case is a display at the laser scanner 10; alternatively it can, however, also be the display of a computer which is connected to the base 14.

Scanning takes place along a circle by means of the relatively quick rotation of the mirror 16. By virtue of the relatively slow rotation of the measuring head 12 relative to the base 14, the whole space is scanned step by step, by way of the circles. The entirety of measuring points X of such a measurement defines a scan. For such a scan, the center C10 of the laser scanner 10 defines the origin of the local stationary reference system. The base 14 rests in this local stationary reference system.

In addition to the distance d to the center C10 of the laser scanner 10, each measuring point X comprises brightness information, which is also determined by the control and evaluation unit 22. The brightness value is a gray-scale value which is determined, for example, by integration of the bandpass-filtered and amplified signal of the light receiver 21 over a measuring period assigned to the measuring point X. Through use of a color camera, images can optionally be generated, by which colors (R, G, B) can be assigned to the measuring points as values.

The laser scanner 10 is provided with a color camera 25, which is likewise connected to the control and evaluation unit 22. The color camera 25 is configured, for example, as a CCD camera or a CMOS camera and provides a signal which is three-dimensional in color space, preferably an RGB signal, for an image which is two-dimensional in position space. The control and evaluation unit 22 concatenates the scan (which is three-dimensional in position space) of the laser scanner with the images (which are two-dimensional in position space) of the color camera 25, such concatenating being denoted "mapping." Concatenating takes place image by image for each of the captured color images so as to assign, as a final result, a color (as RGB values) to each measuring point X of the scan; that is, to color the scan.
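The mapping step can be sketched for a single measuring point with a simple pinhole projection. This is a deliberate simplification offered for illustration only: the function name and parameters are hypothetical, and a real mapping would also account for lens distortion and the camera pose relative to the scanner:

```python
def map_color_to_point(point_xyz, image, focal_px, cx, cy):
    """Assign an RGB color to one scan point by projecting it into a
    color image with a pinhole model (illustrative simplification).

    point_xyz is given in the camera frame with +z along the optical
    axis; image is a row-major grid, image[row][col] -> (r, g, b);
    focal_px is the focal length in pixels and (cx, cy) the principal
    point.  Returns None if the point projects outside the image.
    """
    x, y, z = point_xyz
    if z <= 0:
        return None  # point lies behind the camera
    col = int(round(focal_px * x / z + cx))
    row = int(round(focal_px * y / z + cy))
    if 0 <= row < len(image) and 0 <= col < len(image[0]):
        return image[row][col]
    return None
```

Running this for every measuring point X of the scan, image by image, is what the description calls coloring the scan.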

The light receiver 21 usually is configured such that it does not receive the reception light beam 20 coming from the mirror 16 directly; rather, the mirror 16 deflects the reception light beam 20 to receiver optics 30. Through use of optical components, particularly lenses and/or mirrors, the receiver optics 30 forms on the light receiver 21 an image of the reception light beam 20 coming from the mirror 16. As a 45° sectional area of a cylinder, the mirror 16 has a small semiaxis which defines the diameter of the reception light beam 20. The receiver optics 30 is provided with a reception lens 32, the diameter of which is at least as large as the small semiaxis of the mirror 16, so that it can completely receive the reception light beam 20 and project it onto the next optical element. The optical axis of the reception lens 32 is aligned to the mirror 16. The receiver optics 30 reduces the diameter of the reception light beam 20 to the dimension of the light receiver 21.

The direction of the reception light beam 20 passed to the light receiver 21 is also the direction into the color camera 25, which may be arranged behind the receiver optics 30 or within the receiver optics 30. A preferred arrangement of the color camera 25 is, however, disclosed in the aforementioned U.S. Pat. No. 8,705,016: with regard to the direction of the reception light beam 20, the color camera 25 is arranged in front of the receiver optics 30. In other words, the light receiver 21 and the color camera 25 jointly use the mirror 16, but the receiver optics 30 is used only by the light receiver 21.

An arrangement of the color camera 25 on the optical axis of the receiver lens 32 has the advantage of keeping aberrations at a low level; i.e., the receiver optics 30 and the color camera 25 view the same section of the environment. The color camera 25 can—with regard to the direction of the reception light beam 20—be directly on the receiver lens 32. The emission light beam 18 of the light emitter 17 can then be deflected, for example by a semitransparent mirror, to the optical axis of the receiver lens 32, to further hit the mirror 16. Alternatively, the color camera 25 can receive the reception light beam 20 at least partially, by a semitransparent mirror. The space directly on the receiver lens 32 can then be taken by the light emitter 17.

Preferably, the light emitter 17 and the color camera 25 are operated consecutively. In an embodiment, the laser scanner 10, with its color camera 25 switched off, first scans the environment with the emission light beam 18 and receives the reception light beam 20, from which a gray-scale scan is generated. It then captures the color images of the environment, with the light emitter 17 switched off, using the color camera 25. The control and evaluation unit 22 assigns colors to the measuring points X to color the gray-scale scan.

The color camera 25 can increase the contrast of its images. For this purpose, the color camera 25 captures a sequence of images with a low dynamic range (LDR). In this context, the term dynamic range refers to the ratio of the maximum voltage level produced by any pixel to the minimum voltage level produced by any pixel. The dynamic range may be described by other equivalent measures, such as the ratio of the maximum number of electrons within any one pixel well to the minimum number of electrons in any pixel well. Somewhat less precisely, the dynamic range may be considered a level of lightness or brightness of light at a point captured by a pixel as seen by the human eye for each of the colors R, G, B.

In an embodiment, multiple images are obtained with the scanner color camera 25 receiving light from a fixed part of the environment. Each of these images is said to be an LDR image because the maximum dynamic range cannot exceed a certain value for any particular camera array. At one extreme, the maximum level of light is limited by the saturation level of the array; at the other, the minimum level of light is limited by camera noise, especially camera electrical noise. As the camera exposure time is increased, some areas of the array begin to saturate, but the levels of other areas of the array that previously had very low levels are now somewhat higher. By collecting multiple LDR images, each having a different exposure time, each small area of the environment may be captured with an appropriate level of illumination and exposure. From the sequence of differently exposed LDR images, an image with a high dynamic range (HDR) is generated, preferably in the control and evaluation unit 22 or in a suitable processing unit of the color camera 25. The HDR image is then processed further. In this way, regions on the array that are completely dark or completely bright are avoided.
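One common way to fuse differently exposed LDR images, offered here as an illustrative sketch rather than the scanner's actual algorithm, is a weighted average of per-exposure radiance estimates: a pixel value v captured with exposure time t estimates a scene radiance of roughly v/t, and a triangular weight discounts values near the noise floor (0) and near saturation (255):

```python
def merge_hdr(ldr_images, exposure_times):
    """Fuse a sequence of differently exposed LDR images into one HDR
    image (illustrative sketch of exposure fusion).

    Each LDR image is a flat list of per-pixel values in 0..255 for
    one color channel; exposure_times gives the exposure (seconds)
    used for each image.  The result is a list of floating-point
    radiance estimates, one per pixel.
    """
    def weight(v):
        # Smallest near 0 and 255 (unreliable), largest mid-range.
        return min(v, 255 - v)

    n_pixels = len(ldr_images[0])
    hdr = []
    for i in range(n_pixels):
        num = den = 0.0
        for img, t in zip(ldr_images, exposure_times):
            w = weight(img[i])
            num += w * (img[i] / t)
            den += w
        hdr.append(num / den if den > 0 else 0.0)
    return hdr
```

Because saturated and nearly black readings receive little or no weight, every region of the scene is effectively drawn from whichever exposure captured it best.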

Compared to LDR images that may be encoded with only 8 bits per color channel, the HDR image has many more bits, for example, 32 bits per color channel. Often, the values of the color channels for each pixel of the HDR image are represented by floating point numbers instead of integer numbers. Then, the values may be in similar ranges as the values of LDR images, but having finer gradations. If enough storage capacity is available, the complete sequence of LDR images, in addition to the resulting HDR image, may be retained to preserve full information.

The HDR image can be visualized even if the hardware can display only LDR images, for example on the display unit 24. The process of mapping one set of colors to another to approximate the appearance of high dynamic range images in a medium that has a more limited dynamic range (for example, the display unit 24) is referred to as tone mapping. A slider or dynamic compression (tone mapping) can be used for this purpose. With dynamic compression, the dynamic range of the HDR image is reduced to an LDR image by use of operators, particularly global operators, local operators, frequency-based operators, or gradient-based operators. Bright surfaces appear darker, and dark surfaces appear brighter. With local operators, a maximum visibility of details is obtained, independently of the illumination situation. In an embodiment, a fluent linking of local operators is constructed to generate continuous transitions without edges. The required storage capacity may be reduced by applying tone mapping so as to keep, not the entire sequence of LDR images, but instead a single, already processed image obtained using the methods described hereinabove.
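As one example of the global operators mentioned above, the simple compression L/(1+L) is a common textbook choice; it is used here purely as an assumption, since the description does not name a specific operator:

```python
def tone_map_global(hdr_values, max_out=255):
    """Compress HDR luminance values to a displayable LDR range using
    the global operator L / (1 + L) (illustrative choice).

    Values are first normalized by the average luminance so that
    bright and dark scenes map to similar output ranges; bright
    surfaces are compressed more strongly than dark ones.
    """
    avg = sum(hdr_values) / len(hdr_values)
    out = []
    for v in hdr_values:
        l = v / avg if avg > 0 else 0.0
        out.append(int(round(max_out * l / (1.0 + l))))
    return out
```

A global operator like this applies one curve to every pixel; the local operators mentioned above instead adapt the curve to each pixel's neighborhood, which preserves more local detail.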

The resulting LDR image can then be used to color the gray-scale scan. Alternatively, the HDR image may be used to color the gray-scale scan, taking advantage of the finely graduated brightness levels to provide more object detail and to enable more precise localization of objects. Furthermore, an HDR image may be used to enable display of an image in a preferred manner.

When capturing images, a dynamically determined average brightness level can be taken into account so that the required number of images to be shot can be limited. When capturing the images, it can be determined whether there are bright areas or dark areas for which image details are not being extracted (because of overexposure or underexposure). Such a determination may be made, for example, based on brightness statistics. When a threshold value (which may be based at least in part on brightness statistics) is exceeded, capturing of further images can be stopped without a loss of quality, minimizing the required time. Depending on the user settings, quality can be traded off against speed. To take full advantage of the collected information, both the HDR image and the LDR images may be saved.
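The early-stopping idea above can be sketched as a capture loop that checks a brightness statistic after each exposure. The camera interface `capture_fn` and the 2 percent clipping threshold are hypothetical stand-ins for the brightness-statistics test described in the text:

```python
def capture_bracket(capture_fn, exposure_times, clipped_fraction_limit=0.02):
    """Capture LDR images at increasing exposure times, stopping early
    once the latest image has few under- or over-exposed pixels.

    capture_fn(t) is a stand-in for the camera (hypothetical
    interface) returning a flat list of 0..255 pixel values; the
    clipped-fraction limit is an illustrative threshold, not a value
    given in the description.
    """
    images = []
    for t in sorted(exposure_times):
        img = capture_fn(t)
        images.append(img)
        clipped = sum(1 for v in img if v <= 0 or v >= 255)
        if clipped / len(img) <= clipped_fraction_limit:
            break  # further exposures would add little detail
    return images
```

Raising the threshold trades quality for speed, which corresponds to the user setting mentioned above.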

As explained hereinabove, the number of images can be reduced without sacrificing quality by observing averaged brightness values at pixels of the photosensitive array. The use of the term brightness in this context is understood to be related to the number of electrons created in R, G, B pixel wells in relation to the maximum number of electrons that the well will hold. This number of electrons is proportional to an optical power level passing from an object point through the camera lens before reaching an R, G, or B pixel in the camera photosensitive array. The pixel has a certain responsivity by which the integrated optical power is converted into a number of electrons in the pixel well. The electrons are extracted as an electrical current and converted into a voltage that is sampled with an analog-to-digital converter to provide a digital value. The voltage level for any particular pixel depends at least in part on (1) the optical power at the R, G, or B wavelength reflected from the object point into a corresponding pixel in the array, (2) the camera exposure time, and (3) the responsivity of the pixel for the particular wavelength of light (R, G, or B).
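The three dependencies just listed can be combined into a small model of the electron count in one pixel well. The units (electrons per joule of integrated optical energy) and the clipping at full-well capacity are illustrative simplifications of a real sensor:

```python
def pixel_electrons(optical_power_w, exposure_time_s,
                    responsivity_e_per_j, full_well_e):
    """Model the electron count in a pixel well: optical power times
    exposure time (the integrated energy), scaled by the pixel's
    responsivity, and clipped at the full-well capacity (saturation).

    All parameter names and units are illustrative assumptions.
    """
    electrons = optical_power_w * exposure_time_s * responsivity_e_per_j
    return min(electrons, full_well_e)
```

Doubling either the optical power or the exposure time doubles the electron count until the well saturates, which is exactly why longer exposures brighten dim regions while clipping bright ones.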

Averaging of the brightness values can take place over a rotation of the mirror 16. The averaged brightness values are then used to determine the different exposure times for the sequence of LDR images. The number of LDR images in the sequence may depend on the camera FOV and camera aperture.

The rotation of the mirror 16 for averaging of the brightness values may be a rotation about the horizontal axis of the mirror 16. However, this rotation may also be a rotation of the entire measuring head 12 about its vertical axis, thus also resulting in a rotation of the mirror 16. During the rotation of the measuring head 12, the mirror 16 may be still with respect to the measuring head 12 (for example, by having no rotation of the mirror 16 around the horizontal axis). In this case, the mirror 16 may for example be aimed to the horizon (which defines the horizontal position of the mirror 16). Alternatively, the mirror 16 may be rotated about its horizontal axis in any of a number of different patterns, which might be full or partial rotations of the mirror. In an embodiment, the mirror 16 performs an oscillatory rotation around the horizontal position, for example, between the angles of −30° and +30° (a “waggling mirror”). Vertical and horizontal rotations may be used separately or combined to obtain an average brightness value to use in determining the number of LDR images to collect and the exposure time for each.
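The step of turning an averaged brightness value into a set of exposure times can be sketched as follows. The centering rule (scale a base exposure so the average lands near mid-scale) and the factor-of-two spacing between exposures are illustrative assumptions; the description states only that the averaged brightness values are used to determine the exposure times:

```python
def plan_exposures(average_brightness, base_time_s=0.01, target=128,
                   n_images=3, stop_ratio=2.0):
    """Derive a bracket of exposure times from an averaged brightness
    value (0..255 scale).

    The center exposure is scaled so that the averaged brightness
    would land near mid-scale (target); the remaining exposures are
    spaced by stop_ratio around it.  All parameters are illustrative.
    """
    center = base_time_s * target / max(average_brightness, 1)
    half = n_images // 2
    return [center * stop_ratio ** (i - half) for i in range(n_images)]
```

A dark scene (low average brightness) thus receives a longer bracket of exposures, and a bright scene a shorter one.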

As an alternative to an extra rotation (or other movement) of the mirror 16, the averaged brightness values may also be determined from any previous image taken by the color camera 25. The averaging of the brightness values may result in a single averaged brightness value used as the median for defining the exposure times.

FIG. 4 shows a flowchart of the method of an embodiment of the present invention. In step 101, the laser scanner 10 generates a gray-scale scan over points X, each of the scan points obtained from several (e.g., 2000) samples of the propagation time of emission light beam 18 and reception light beam 20. In step 102, the color camera 25 captures a sequence of LDR color images with different exposure times. The order of step 101 and step 102 may be changed. In step 103, first, an HDR image is generated from the sequence of LDR images with different exposure times, and afterwards, a single LDR image is generated from the HDR image. Tone mapping is used, either to convert the HDR image still comprising the whole sequence of LDR images into the single LDR image, or to convert the sequence of LDR images into a processed HDR image. In step 104, the single LDR image (or the processed HDR image) is used to color the gray-scale scan into a color scan.

The connection between the laser scanner and, where appropriate, parts of the control and evaluation unit arranged outside the measuring head, a display unit on a computer connected to the laser scanner, and further computers incorporated in the system can be wired or wireless, for example by means of WLAN.

While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

Claims

1. A method for optically scanning and measuring an object with a laser scanner, the method comprising:

providing the laser scanner having integral components that include a light emitter, a light receiver, a first angle measuring device, a second angle measuring device, a control and evaluation unit, and a color camera;
providing a color display;
measuring a first angle with the first angle measuring device;
measuring a second angle with the second angle measuring device;
emitting with the light emitter an emission light beam;
reflecting the emission light beam from the object to produce a reception light beam;
receiving with the light receiver the reception light beam and obtaining a first electrical signal in response;
determining with the control and evaluation unit distances to a plurality of measuring points on the object based at least in part on the first electrical signals for each of the plurality of measuring points and on a speed of light in air;
determining with the control and evaluation unit gray-scale values for the plurality of measuring points;
capturing with the color camera a sequence of color images while the color camera is fixed in space, each image of the sequence captured with a different exposure time and having an associated first dynamic range, the color images providing second electrical signals in response;
determining with the control and evaluation unit a three-dimensional (3D) gray-scale image based at least in part on the first angle, the second angle, the distances to and gray-scale values for the plurality of measuring points on the object;
determining with the control and evaluation unit an enhanced color image having an enhanced dynamic range, the enhanced dynamic range being higher than any of the associated first dynamic ranges, the enhanced color image based at least in part on the second electrical signals;
determining with the control and evaluation unit an enhanced 3D color image by superimposing the enhanced color image on the 3D gray-scale image; and
displaying the enhanced 3D color image on the color display.

2. The method of claim 1, further including a step of applying dynamic compression to the enhanced color image to obtain a reduced dynamic range selected to match properties of the color display.

3. The method of claim 1, wherein in the step of providing the laser scanner, the laser scanner further includes a rotatable mirror, the rotatable mirror configured to rotate about the first angle.

4. The method of claim 3, wherein in the step of providing the laser scanner, the rotatable mirror reflects light from the object into the color camera.

5. The method of claim 4, further including a step of determining an average value of second electrical signals over a plurality of the first angles obtained in response to rotation of the rotatable mirror.

6. The method of claim 5, wherein, in the step of capturing with the color camera a sequence of color images, the sequence of color images has a number of color images in the sequence, the number of color images based at least in part on the average value of the second electrical signals.

7. The method of claim 6, wherein, in the step of capturing with the color camera a sequence of color images, the different exposure time of each image of the sequence is based at least in part on the average value of the second electrical signals.

Patent History
Publication number: 20150085079
Type: Application
Filed: Sep 24, 2014
Publication Date: Mar 26, 2015
Inventors: Jürgen Gittinger (Ludwigsburg), Martin Ossig (Tamm)
Application Number: 14/494,639
Classifications
Current U.S. Class: Picture Signal Generator (348/46)
International Classification: H04N 13/02 (20060101); H04N 13/04 (20060101);