Methods and Systems for LIDAR Optics Alignment
A method is provided that involves mounting a transmit block and a receive block in a LIDAR device to provide a relative position between the transmit block and the receive block. The method also involves locating a camera at a given position at which the camera can image light beams emitted by the transmit block and can image the receive block. The method also involves obtaining, using the camera, a first image indicative of light source positions of one or more light sources in the transmit block and a second image indicative of detector positions of one or more detectors in the receive block. The method also involves determining at least one offset based on the first image and the second image. The method also involves adjusting the relative position between the transmit block and the receive block based at least in part on the at least one offset.
This application is a continuation of U.S. patent application Ser. No. 17/061,172 filed on Oct. 1, 2020, which is a continuation of U.S. patent application Ser. No. 15/876,129 filed on Jan. 20, 2018 and issued as U.S. Pat. No. 10,816,648 on Oct. 27, 2020, which is a continuation of U.S. patent application Ser. No. 15/482,722 filed on Apr. 8, 2017 and issued as U.S. Pat. No. 9,910,139 on Mar. 6, 2018, which is a continuation of U.S. patent application Ser. No. 14/671,085 filed on Mar. 27, 2015 and issued as U.S. Pat. No. 9,651,658 on May 16, 2017, the entire contents of each of which are incorporated herein by reference.
BACKGROUND
Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
A LIDAR can estimate distance to environmental features while scanning through a scene to assemble a “point cloud” indicative of reflective surfaces in the environment. Individual points in the point cloud can be determined by transmitting a laser pulse and detecting a returning pulse, if any, reflected from an object in the environment, and determining the distance to the object according to the time delay between the transmission of the transmitted pulse and the reception of the reflected pulse. A laser, or set of lasers, can be rapidly and repeatedly scanned across a scene to provide continuous real-time information on distances to reflective objects in the scene. Combining the measured distances and the orientation of the laser(s) while measuring each distance allows for associating a three-dimensional position with each returning pulse. In this way, a three-dimensional map of points indicative of locations of reflective features in the environment can be generated for the entire scanning zone.
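The distance computation and three-dimensional mapping described above can be sketched as follows. This is an illustrative example only, not the disclosed implementation; the function names and angle conventions are assumptions for illustration.

```python
# Illustrative sketch: converting a pulse's round-trip time delay and the
# laser orientation into a 3-D point, as described above.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(delay_s):
    """Range to the reflecting object from the round-trip time delay."""
    return C * delay_s / 2.0

def to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Associate a 3-D position with a returning pulse from range and beam angles."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A 200 ns round-trip delay corresponds to an object roughly 30 m away.
r = tof_distance(200e-9)
point = to_cartesian(r, azimuth_deg=45.0, elevation_deg=0.0)
```

Repeating this conversion for each returning pulse while the laser scans yields the point cloud of reflective features.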
SUMMARY
In one example, a method is provided that involves mounting a transmit block and a receive block in a light detection and ranging (LIDAR) device to provide a relative position between the transmit block and the receive block. The transmit block may include one or more light sources configured to emit light at a source wavelength. The receive block may include one or more detectors configured to detect light at the source wavelength. The method further involves locating a camera at a given position at which the camera, when focused at infinity, can image light beams emitted by the one or more light sources and can image the one or more detectors. The method further involves obtaining a first image using the camera located at the given position and focused at infinity. The first image may be indicative of light source positions of the one or more light sources. The method further involves obtaining a second image using the camera located at the given position and focused at infinity. The second image may be indicative of detector positions of the one or more detectors in the receive block. The method further involves determining at least one offset based on the light source positions indicated by the first image and the detector positions indicated by the second image. The method further involves adjusting the relative position between the transmit block and the receive block based at least in part on the at least one offset.
In another example, a system is provided that includes a mounting platform to mount a light detection and ranging (LIDAR) device that provides a relative position between a transmit block in the LIDAR device and a receive block in the LIDAR device. The transmit block may include one or more light sources configured to emit light at a source wavelength. The receive block may include one or more detectors configured to detect light at the source wavelength. The system also includes a camera located at a given position at which the camera, when focused at infinity, can image light beams emitted by the one or more light sources and can image the one or more detectors. The system also includes an alignment apparatus configured to adjust the relative position between the transmit block and the receive block. The system also includes a controller configured to obtain a first image from the camera located at the given position and focused at infinity. The first image may be indicative of light source positions of the one or more light sources. The controller is also configured to obtain a second image from the camera located at the given position and focused at infinity. The second image may be indicative of detector positions of the one or more detectors in the receive block. The controller is also configured to determine at least one offset based on the light source positions indicated by the first image and the detector positions indicated by the second image. The controller is also configured to cause the alignment apparatus to adjust the relative position between the transmit block and the receive block based at least in part on the at least one offset.
In yet another example, a system is provided that includes a means for mounting a transmit block and a receive block in a light detection and ranging (LIDAR) device to provide a relative position between the transmit block and the receive block. The transmit block may include one or more light sources configured to emit light at a source wavelength. The receive block may include one or more detectors configured to detect light at the source wavelength. The system also comprises means for locating a camera at a given position at which the camera, when focused at infinity, can image light beams emitted by the one or more light sources and can image the one or more detectors. The system also comprises means for obtaining a first image using the camera located at the given position and focused at infinity. The first image may be indicative of light source positions of the one or more light sources. The system also comprises means for obtaining a second image using the camera located at the given position and focused at infinity. The second image may be indicative of detector positions of the one or more detectors in the receive block. The system also comprises means for determining at least one offset based on the light source positions indicated by the first image and the detector positions indicated by the second image. The system also comprises means for adjusting the relative position between the transmit block and the receive block based at least in part on the at least one offset.
These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying figures.
The following detailed description describes various features and functions of the disclosed systems, devices and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise. The illustrative system, device and method embodiments described herein are not meant to be limiting. It may be readily understood by those skilled in the art that certain aspects of the disclosed systems, devices and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
Within examples, a LIDAR device may include a transmit block and a receive block. The transmit block may include one or more light sources that transmit light for propagation away from the LIDAR device toward an environment of the LIDAR device. In turn, the transmitted light may reflect off one or more objects in the environment, and the reflected light may propagate back toward the LIDAR device. Further, the receive block may include one or more detectors to detect the reflected light. Through this process, a computing system may process data from the LIDAR device pertaining to the reflected light to determine positions and/or characteristics of various objects in the environment of the LIDAR device.
To facilitate operation of the LIDAR device, a light beam emitted by a given light source and reflected back toward the LIDAR device should be received by a corresponding detector. Within examples, systems and methods are provided for alignment of light source(s) and detector(s) in a LIDAR device.
The mounting platform 102 may provide a platform for mounting some or all of the components of the system 100. As shown, the mounting platform 102 mounts a LIDAR device 104 and a camera 106. In some examples, the mounting platform 102 may also mount a light filter 108. Further, in some examples, the mounting platform 102 may also mount an actuator 112. Accordingly, the mounting platform 102 may be formed from one or more solid materials suitable for supporting the various components, such as plastics or metals among other possibilities. In some examples, some of the components shown to be mounted on the mounting platform 102 may alternatively be mounted to a separate structure (not shown) or otherwise coupled to the system 100. For instance, the camera 106 and/or the light filter 108 may be alternatively positioned and/or mounted outside the mounting platform 102.
The LIDAR device 104 includes a transmit block 120 and a receive block 130. As shown, in some examples, the LIDAR device 104 may optionally include a lens 150.
The transmit block 120 includes one or more light sources 122 that may be configured to emit one or more light beams 124. Although not shown in
The receive block 130 includes one or more detectors 132 that may be configured to receive light from an environment of the LIDAR device 104. In one example, a given detector of the detectors 132 is configured and arranged to receive a given light beam of the light beams 124 that is reflected off an object in the environment of the LIDAR device 104 toward the given detector. Through this process, for example, the LIDAR device 104 may detect various objects in the environment by emitting light beams 124 using the light sources 122 and detecting reflections of the light beams 124 using the detectors 132. Although not shown, in some examples, the receive block 130 may include additional components such as an inert gas, an entrance aperture (e.g., half-mirror), and/or any other component to filter and/or condition light propagating toward the detectors 132. The detector(s) 132 may comprise photodiodes, avalanche photodiodes, phototransistors, cameras, active pixel sensors (APS), charge coupled devices (CCD), cryogenic detectors, or any other sensor of light. In one embodiment, the detector(s) 132 may be configured to detect light at the source wavelength (e.g., 905 nm, etc.) of the light beams 124 emitted by the light sources 122 and/or reflections thereof.
The lens 150 may be optionally included in the LIDAR device 104 and may be configured to collimate the emitted light beams 124 and/or focus light propagating toward the detectors 132. In one embodiment, the lens 150 may be a single lens having an optical power to both collimate the light beams 124 and focus light onto the detectors 132. In another embodiment, the lens 150 may include two separate lenses. For example, a first lens may collimate the light beam(s) 124 emitted by the light source(s) 122, and a second lens may focus light propagating toward the LIDAR device 104 onto the detector(s) 132. Other lens configurations are possible as well (e.g., multiple lenses for collimation and/or multiple lenses for focus, etc.).
In some examples, the LIDAR device 104 may include additional, fewer, or different components than those shown in
In some examples, the various components of the LIDAR device 104 such as the transmit block 120, receive block 130, and the lens 150 can be removably mounted in predetermined positions within the LIDAR device 104 to reduce burden of calibrating the arrangement of each component and/or subcomponents included in each component. In these examples, the system 100 may adjust the relative position between the transmit block 120 and the receive block 130 to align the one or more light sources 122 with the one or more detectors 132. Alternatively, in other examples, the system 100 may be configured to adjust the relative position between each of the subcomponents (e.g., each light source of light sources 122, each detector of detectors 132, etc.).
The camera 106 may be any camera (e.g., a still camera, a video camera, etc.) configured to capture images of the LIDAR device 104. In some examples, the camera 106 may be located at a given position at which the camera 106 can image the light beams 124 emitted by the one or more light sources 122, and can image the one or more detectors 132. In one embodiment, the camera may be focused at infinity when capturing such images. By way of example, the camera may be mounted to have a field-of-view along the path of the light beams 124 (e.g., facing the lens 150, etc.).
As an example scenario for operation of the camera 106, the LIDAR device 104 may be configured to cause the light sources 122 to emit the light beams 124 toward the camera 106. In turn, the camera 106 may provide a first image of the light beams 124. The first image, for instance, may indicate light source position(s) (e.g., bright pixels in the first image, etc.) of the light source(s) 122. In the scenario, the camera may also obtain a second image indicative of detector position(s) of the detector(s) 132. Other scenarios are possible as well and are described in greater detail within exemplary embodiments herein.
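Estimating a light source position from bright pixels in such an image could proceed as in the following sketch. This is a hypothetical illustration using NumPy, not the disclosed implementation; the threshold value is an assumption chosen for the example.

```python
# Hypothetical sketch: locating a light source in the first image as the
# intensity-weighted centroid of bright pixels (threshold is an assumption).
import numpy as np

def bright_centroid(image, threshold=200):
    """Return the (row, col) centroid of pixels at or above threshold, or None."""
    mask = image >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    weights = image[rows, cols].astype(float)
    r = float((rows * weights).sum() / weights.sum())
    c = float((cols * weights).sum() / weights.sum())
    return (r, c)

# Synthetic 8x8 frame with one bright spot at row 3, column 5.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[3, 5] = 255
centroid = bright_centroid(frame)  # (3.0, 5.0)
```

The same routine could be applied to the second image to estimate detector positions, for instance when the detectors are illuminated by an auxiliary light source.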
The light filter 108 may be optionally included in the system 100 to facilitate capture and/or processing of the images described above. For instance, the light filter 108 may be positioned along a path of the light beams 124 between the LIDAR device 104 and the camera 106. In one example, the light filter 108 may be configured to attenuate light within a wavelength range that includes the source wavelength of the light source(s) 122. In this example, the attenuation of the light may facilitate contrasting pixels in the first image that are associated with the light beams 124 against surrounding pixels. Further, in this example, the attenuation of the light may protect the camera 106 from the intensity of the light beams 124. In another example, the light filter 108 may be configured to attenuate light within another wavelength range that does not include the source wavelength of the light source(s) 122. In this example, the images obtained by the camera 106 may represent features of interest (e.g., light source(s) 122, detector(s) 132, etc.) in the LIDAR device 104 more clearly due to the light filter 108 attenuating background light having other wavelengths. Other configurations of the light filter 108 are possible as well and are described in greater detail within exemplary embodiments herein.
The actuator 112 may be optionally included in the system 100. The actuator 112 may be configured to adjust the position of the light filter 108. For instance, the actuator 112 may be configured to actuate the light filter 108 to a first position where the light filter 108 is interposed between the LIDAR device 104 and the camera 106, or to a second position where the light filter 108 is not interposed between the LIDAR device 104 and the camera 106. Example actuators may include motors, stepper motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and piezoelectric actuators among other possibilities.
The alignment apparatus 160 may include any device that couples to one or more of the components in the LIDAR device 104 to adjust the relative position between the transmit block 120 and the receive block 130. By way of example, the alignment apparatus 160 may be a robotic arm that physically couples to the receive block 130 to rotate and/or translate the position of the receive block 130 in the LIDAR device 104. Alternatively or additionally, for example, the robotic arm may adjust the position of the transmit block 120. In some examples, the alignment apparatus 160 may adjust the relative position between the transmit block 120 and the receive block 130 based on image(s) obtained by the camera 106. For instance, the alignment apparatus 160 may adjust the relative position to align one or more of the light beams 124 emitted by the light sources 122 with one or more of the detectors 132.
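One way to derive an offset from the two images is to compare matched source and detector positions, as in the following sketch. The pairing of sources to detectors and the function names are assumptions for illustration, not the disclosed method.

```python
# Illustrative sketch: the mean displacement between matched light-source and
# detector positions in image coordinates, which an alignment apparatus could
# drive toward zero (pairing by index is an assumption).
def mean_offset(source_positions, detector_positions):
    """Average (dy, dx) displacement between matched (row, col) positions."""
    pairs = list(zip(source_positions, detector_positions))
    dy = sum(d[0] - s[0] for s, d in pairs) / len(pairs)
    dx = sum(d[1] - s[1] for s, d in pairs) / len(pairs)
    return (dy, dx)

sources = [(10.0, 20.0), (30.0, 20.0)]
detectors = [(12.0, 21.0), (32.0, 21.0)]
offset = mean_offset(sources, detectors)  # (2.0, 1.0)
```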
The system 100 may optionally include an auxiliary light source 170 that emits light 174 at the source wavelength to illuminate the receive block 130. The structure and form of the auxiliary light source 170 (e.g., LED, etc.) may be similar to the light sources 122. In one example, where the camera 106 is configured to capture images when focused at infinity for the source wavelength of the light sources 122, the auxiliary light source 170 may illuminate the detectors 132 to facilitate the camera 106 obtaining the second image of the detectors 132 while also focused at infinity for the same source wavelength.
In some examples, the light filter 108 may be configured to remain interposed between the camera 106 and the LIDAR device 104 during capture of the first image (e.g., of the light beams 124) and during capture of the second image (e.g., of the detectors 132). In these examples, the illuminating light 174 from the auxiliary light source 170 may allow the camera 106 to capture an image of the detectors 132 while the light filter 108 is interposed.
In other examples, the system 100 may be configured to move the light filter 108 to another position other than the position between the camera 106 and the LIDAR device 104 prior to the camera 106 obtaining the second image of the detectors 132. For instance, the light filter 108 may be moved by the actuator 112. In these examples, the camera 106 may rely on background light to obtain the second image of the detectors 132, or the system 100 may utilize the auxiliary light source 170 to illuminate the receive block 130.
The controller 180 may include one or more processors configured to operate some or all of the components of the system 100 in line with the discussion above. To that end, the controller 180 may be coupled to the various components via a wired or wireless interface (not shown). In some examples, the controller 180 may execute program functions stored in a computer readable medium (not shown) to cause the system 100 to perform various functions and processes of the present method.
In a first example, the controller 180 may cause a power source (not shown) to provide power to the various components of the system 100. In a second example, the controller 180 may cause the transmit block 120 of the LIDAR device 104 to emit the light beams 124. In a third example, the controller 180 may operate an actuator (not shown) to position the light filter 108 between the LIDAR device 104 and the camera 106, or to position the light filter 108 at any other position. In a fourth example, the controller 180 may operate the camera 106 to obtain the first image (e.g., of the light beams 124) and the second image (e.g., of the detectors 132) in line with the discussion above. In a fifth example, the controller 180 may operate the alignment apparatus 160 to adjust the relative position between the transmit block 120 and the receive block 130. In a sixth example, the controller 180 may operate the alignment apparatus 160 to mount (or unmount) various components (e.g., LIDAR device 104, etc.) to the mounting platform 102. In a seventh example, the controller 180 may operate the auxiliary light source 170 to illuminate the receive block 130 with light 174. In an eighth example, the controller 180 may operate the actuator 112 to move the light filter 108. Other examples are possible as well and are described in greater detail within exemplary embodiments herein.
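The controller's image-capture, offset-computation, and adjustment functions could be orchestrated as an iterative loop, sketched below. Every name here is a hypothetical placeholder, not a disclosed interface; the tolerance and iteration limit are assumptions.

```python
# Hypothetical controller loop: capture both images, compute the offset, and
# command the alignment apparatus until the offset is within tolerance.
def align(capture_first_image, capture_second_image, compute_offset,
          move_receive_block, tolerance=0.5, max_iters=10):
    """Return True if alignment converges within max_iters iterations."""
    for _ in range(max_iters):
        first = capture_first_image()    # indicative of light-source positions
        second = capture_second_image()  # indicative of detector positions
        dy, dx = compute_offset(first, second)
        if abs(dy) <= tolerance and abs(dx) <= tolerance:
            return True                  # aligned within tolerance
        move_receive_block(-dy, -dx)     # adjust the relative position
    return False
```

A human operator interacting with a GUI, as described below, could serve the same role as `compute_offset` and `move_receive_block` in this sketch.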
The system 100 may include additional, fewer, or different components than those shown, and may perform other functions as well. In one example, the system 100 may include a display (not shown) for displaying image(s) obtained using the camera 106. For instance, the display may have a graphical user interface (GUI) for displaying and/or interacting with images captured by the camera 106, and a human operator or a computer operator may interact with the GUI to adjust the relative position between the transmit block 120 and the receive block 130 by manipulating the images in the GUI. Other procedures are possible as well for controlling the system 100 in accordance with the present disclosure.
As shown, the LIDAR device 200 includes a housing 210 and a lens 250. Additionally, light beams 204 emitted by the LIDAR device 200 propagate from the lens 250 along a viewing direction of the LIDAR device 200 toward an environment of the LIDAR device 200, and reflect off one or more objects in the environment as reflected light 206.
The housing 210 included in the LIDAR device 200 can provide a platform for mounting the various components included in the LIDAR device 200. The housing 210 can be formed from any material capable of supporting the various components of the LIDAR device 200 included in an interior space of the housing 210. For example, the housing 210 may be formed from a solid material such as plastic or metal among other possibilities.
In some examples, the housing 210 can be configured to have a substantially cylindrical shape and to rotate about an axis of the LIDAR device 200. For example, the housing 210 can have the substantially cylindrical shape with a diameter of approximately 10 centimeters. In some examples, the axis is substantially vertical. By rotating the housing 210 that includes the various components, in some examples, a three-dimensional map of a 360-degree view of the environment of the LIDAR device 200 can be determined without frequent recalibration of the arrangement of the various components of the LIDAR device 200. Additionally or alternatively, in some examples, the LIDAR device 200 can be configured to tilt the axis of rotation of the housing 210 to control the field of view of the LIDAR device 200.
The lens 250 mounted to the housing 210 can have an optical power to both collimate the emitted light beams 204, and focus the reflected light 206 from one or more objects in the environment of the LIDAR device 200 onto detectors in the LIDAR device 200. In one example, the lens 250 has a focal length of approximately 120 mm. By using the same lens 250 to perform both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
The LIDAR device 200 can be mounted on a mounting structure 260 that rotates about an axis to provide a 360-degree view of the environment surrounding the LIDAR device 200. In some examples, the mounting structure 260 may comprise a movable platform that may tilt in one or more directions to change the axis of rotation of the LIDAR device 200.
The transmit block 220 includes a plurality of light sources 222a-c arranged along a curved focal surface 228 defined by the lens 250. The plurality of light sources 222a-c can be configured to emit, respectively, the plurality of light beams 202a-c having wavelengths within a wavelength range. For example, the plurality of light sources 222a-c may comprise laser diodes that emit the plurality of light beams 202a-c having the wavelengths within the wavelength range. The plurality of light beams 202a-c are reflected by mirror 224 through an exit aperture 226 into the shared space 240 and towards the lens 250.
The light sources 222a-c can include laser diodes, light emitting diodes (LED), vertical cavity surface emitting lasers (VCSEL), organic light emitting diodes (OLED), polymer light emitting diodes (PLED), light emitting polymers (LEP), liquid crystal displays (LCD), microelectromechanical systems (MEMS), or any other device configured to selectively transmit, reflect, and/or emit light to provide the plurality of emitted light beams 202a-c. In some examples, the light sources 222a-c can be configured to emit the emitted light beams 202a-c in a wavelength range that can be detected by detectors 232a-c included in the receive block 230. The wavelength range could, for example, be in the ultraviolet, visible, and/or infrared portions of the electromagnetic spectrum. In some examples, the wavelength range can be a narrow wavelength range, such as provided by lasers. In one embodiment, the wavelength range includes a source wavelength of 905 nm. Additionally, the light sources 222a-c can be configured to emit the emitted light beams 202a-c in the form of pulses. In some examples, the plurality of light sources 222a-c can be disposed on one or more substrates (e.g., printed circuit boards (PCB), flexible PCBs, etc.) and arranged to emit the plurality of light beams 202a-c towards the exit aperture 226.
Although
Due to the arrangement of the plurality of light sources 222a-c along the curved focal surface 228, the plurality of light beams 202a-c, in some examples, may converge towards the exit aperture 226. Thus, in these examples, the exit aperture 226 may be minimally sized while being capable of accommodating vertical and horizontal extents of the plurality of light beams 202a-c. Additionally, in some examples, the curved focal surface 228 can be defined by the lens 250. For example, the curved focal surface 228 may correspond to a focal surface of the lens 250 due to shape and composition of the lens 250. In this example, the plurality of light sources 222a-c can be arranged along the focal surface defined by the lens 250 at the transmit block.
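The curved focal surface described above can be approximated geometrically as an arc whose radius equals the lens focal length. The following sketch is an illustrative assumption, not the disclosed design; it places sources evenly over a vertical field of view using the 120 mm focal length mentioned earlier as an example.

```python
# Geometric sketch (an assumption for illustration): source positions along an
# arc of radius equal to the lens focal length, spanning the vertical FOV.
import math

def arc_positions(focal_length_mm, fov_deg, n_sources):
    """(z, y) coordinates of sources spaced evenly over the vertical FOV,
    with z along the optical axis and y the vertical offset."""
    positions = []
    for i in range(n_sources):
        angle = math.radians(-fov_deg / 2 + i * fov_deg / (n_sources - 1))
        z = focal_length_mm * math.cos(angle)
        y = focal_length_mm * math.sin(angle)
        positions.append((z, y))
    return positions

# Three sources over a 20-degree vertical FOV with a 120 mm focal length.
pts = arc_positions(120.0, 20.0, 3)
```

Because every source sits at the focal distance from the lens, each emitted beam leaves the lens collimated while the beams converge near the exit aperture, consistent with the discussion above.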
The plurality of light beams 202a-c propagate in a transmit path that extends through the transmit block 220, the exit aperture 226, and the shared space 240 towards the lens 250. The lens 250 collimates the plurality of light beams 202a-c to provide collimated light beams 204a-c into an environment of the LIDAR device 200. The collimated light beams 204a-c correspond, respectively, to the plurality of light beams 202a-c. In some examples, the collimated light beams 204a-c reflect off one or more objects in the environment of the LIDAR device 200 as reflected light 206. The reflected light 206 may be focused by the lens 250 into the shared space 240 as focused light 208 traveling along a receive path that extends through the shared space 240 onto the receive block 230. For example, the focused light 208 may be reflected by the reflective surface 242 as focused light 208a-c propagating towards the receive block 230.
The lens 250 may be capable of both collimating the plurality of light beams 202a-c and focusing the reflected light 206 along the receive path 208 towards the receive block 230 due to the shape and composition of the lens 250. For example, the lens 250 can have an aspheric surface 252 facing outside of the housing 210 and a toroidal surface 254 facing the shared space 240. By using the same lens 250 to perform both of these functions, instead of a transmit lens for collimating and a receive lens for focusing, advantages with respect to size, cost, and/or complexity can be provided.
The exit aperture 226 is included in a wall 244 that separates the transmit block 220 from the shared space 240. In some examples, the wall 244 can be formed from a transparent material (e.g., glass) that is coated with a reflective material 242. In this example, the exit aperture 226 may correspond to the portion of the wall 244 that is not coated by the reflective material 242. Additionally or alternatively, the exit aperture 226 may comprise a hole or cut-away in the wall 244.
The focused light 208 is reflected by the reflective surface 242 and directed towards an entrance aperture 234 of the receive block 230. In some examples, the entrance aperture 234 may comprise a filtering window configured to allow wavelengths in the wavelength range of the plurality of light beams 202a-c (e.g., source wavelength) emitted by the plurality of light sources 222a-c and attenuate other wavelengths. In some examples, the entrance aperture 234 may comprise a half-mirror configured to reflect a portion of the focused light 208a-c and allow another portion of the focused light 208a-c to propagate toward the detectors 232a-c. The focused light 208a-c then propagates, respectively, onto the plurality of detectors 232a-c.
The plurality of detectors 232a-c can be arranged along a curved focal surface 238 of the receive block 230. Although
The detectors 232a-c may comprise photodiodes, avalanche photodiodes, phototransistors, cameras, active pixel sensors (APS), charge coupled devices (CCD), cryogenic detectors, or any other sensor of light configured to receive focused light 208a-c having wavelengths in the wavelength range of the emitted light beams 202a-c (e.g., the source wavelength).
Each of the focused light 208a-c corresponds, respectively, to the emitted light beams 202a-c and is directed onto, respectively, the plurality of detectors 232a-c. For example, the detector 232a is configured and arranged to receive focused light 208a that corresponds to collimated light beam 204a reflected off the one or more objects in the environment of the LIDAR device 200. In this example, the collimated light beam 204a corresponds to the light beam 202a emitted by the light source 222a. Thus, the detector 232a receives light that was emitted by the light source 222a, the detector 232b receives light that was emitted by the light source 222b, and the detector 232c receives light that was emitted by the light source 222c.
By comparing the received light 208a-c with the emitted light beams 202a-c, at least one aspect of the one or more objects in the environment of the LIDAR device 200 may be determined. For example, by comparing a time when the plurality of light beams 202a-c were emitted by the plurality of light sources 222a-c and a time when the plurality of detectors 232a-c received the focused light 208a-c, a distance between the LIDAR device 200 and the one or more objects in the environment of the LIDAR device 200 may be determined. In some examples, other aspects such as shape, color, material, etc. may also be determined.
In some examples, the LIDAR device 200 may be rotated about an axis to determine a three-dimensional map of the surroundings of the LIDAR device 200. For example, the LIDAR device 200 may be rotated about a substantially vertical axis as illustrated by arrow 290. Although illustrated that the LIDAR device 200 is rotated counter clock-wise about the axis as illustrated by the arrow 290, additionally or alternatively, the LIDAR device 200 may be rotated in the clockwise direction. In some examples, the LIDAR device 200 may be rotated 360 degrees about the axis. In other examples, the LIDAR device 200 may be rotated back and forth along a portion of the 360 degree view of the LIDAR device 200. For example, the LIDAR device 200 may be mounted on a platform that wobbles back and forth about the axis without making a complete rotation.
Thus, the arrangement of the light sources 222a-c and the detectors 232a-c may allow the LIDAR device 200 to have a particular vertical field-of-view. In one example, the vertical FOV of the LIDAR device 200 is 20°. Additionally, the rotation of the LIDAR device 200 allows the LIDAR device 200 to have a 360° horizontal FOV. Further, the rate of rotation may allow the device to have a particular refresh rate. In one example, the refresh rate is 10 Hz. The refresh rate along with the arrangement of the light sources 222a-c and the detectors 232a-c may also allow the LIDAR device 200 to have a particular angular resolution. In one example, the angular resolution is 0.2°×0.3°. However, the various parameters such as the refresh rate and the angular resolution may vary according to the configuration of the LIDAR device 200. Further, in some examples, the LIDAR device 200 may include additional, fewer, or different components than those shown in
Although the plurality of light sources 322a-c can be arranged along a focal surface 328 that is curved in a vertical plane, additionally or alternatively, the plurality of light sources 322a-c can be arranged along a focal surface that is curved in a horizontal plane or a focal surface that is curved both vertically and horizontally. For example, the plurality of light sources 322a-c can be arranged in a curved three dimensional grid pattern. For example, the transmit block 320 may comprise a plurality of printed circuit boards (PCBs) vertically mounted such that a column of light sources such as the plurality of light sources 322a-c are along the vertical axis of each PCB and each of the plurality of PCBs can be arranged adjacent to other vertically mounted PCBs along a horizontally curved plane to provide the three dimensional grid pattern. Alternatively, in some examples, the light sources 322a-c may be arranged along any other surface such as a linear surface. Further, although the transmit block 320 is shown to include multiple light sources 322a-c, in some examples, the transmit block 320 may include only one light source or a different number of light sources than those shown in
As shown in
The light emitted by the light sources 222a-c may be partially collimated to fit through the exit aperture 224.
In one specific example, laser diode 402 is an Osram SPL DL90_3 nanostack pulsed laser diode that emits pulses of light with a range of wavelengths from about 896 nm to about 910 nm (a nominal wavelength of 905 nm). In this specific example, the aperture has a shorter dimension of about 10 microns, corresponding to its fast axis, and a longer dimension of about 200 microns, corresponding to its slow axis. The divergence of the laser beam in this specific example is about 25 degrees along the fast axis and about 11 degrees along the slow axis. It is to be understood that this specific example is illustrative only. Laser diode 402 could have a different configuration, different aperture sizes, different beam divergences, and/or emit different wavelengths.
As shown in
In one example, cylindrical lens 404 is a microrod lens with a diameter of about 600 microns that is placed about 250 microns in front of aperture 406. The material of the microrod lens could be, for example, fused silica or a borosilicate crown glass, such as Schott BK7. Cylindrical lens 404 could also be used to provide magnification along fast axis 408. For example, if the dimensions of aperture 406 are 10 microns by 200 microns, as previously described, and cylindrical lens 404 is a microrod lens as described above, then cylindrical lens 404 may magnify the shorter dimension (corresponding to fast axis 408) by about 20 times. This magnification effectively stretches out the shorter dimension of aperture 406 to about the same as the longer dimension. As a result, when light from laser beam 416 is focused, for example, onto a detector, the focused spot could have a substantially square shape instead of the rectangular slit shape of aperture 406.
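The magnification arithmetic in the example above can be verified with a short calculation. The values are taken from the example; the snippet itself is an illustrative aid only and is not part of the disclosure.

```python
# Illustrative check of the fast-axis magnification example above:
# a 10 um x 200 um aperture and a ~20x fast-axis magnification.
fast_axis_um = 10.0
slow_axis_um = 200.0

# Magnification needed to stretch the fast-axis dimension to match
# the slow-axis dimension, yielding a substantially square focused spot.
magnification = slow_axis_um / fast_axis_um
effective_fast_axis_um = fast_axis_um * magnification
```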
The receive block 530 comprises a flexible substrate 580 on which the plurality of detectors 532a-c are arranged along the curved surface 538. The flexible substrate 580 conforms to the curved surface 538 by being mounted to a receive block housing 590 having the curved surface 538. As illustrated in
In some embodiments, the number and arrangement of the detectors 532a-c may be different than those shown in
As noted above in the description of
As shown, the LIDAR device 600 includes an optics assembly 610, a transmit lens 652, a receive lens 654, a mirror 660, a pin 662, and a motor 670. For purposes of illustration,
In some examples, the LIDAR device 600 may emit light that propagates away from the mirror 660 along a viewing direction of the LIDAR device 600 (e.g., parallel to z-axis shown in
Accordingly, the optics assembly 610 may be configured to emit light pulses towards the mirror 660 that are then reflected by the mirror 660 towards the environment. Further, the optics assembly 610 may be configured to receive reflected light that is reflected off the mirror 660. In one embodiment, the optics assembly 610 may include a single laser emitter that is configured to provide a narrow beam having a wavelength of 905 nm. In other embodiments, the optics assembly 610 may include multiple light sources similarly to the LIDAR device 200 of
As shown, the mirror 660 may be arranged to steer emitted light from the transmit lens 652 towards the viewing direction of the LIDAR device 600. Further, for example, the mirror 660 may be arranged to steer reflected light from the environment towards the receive lens 654. In some examples, the mirror 660 may be a triangular mirror that performs complete rotations about an axis defined by the pin 662. In one embodiment, the vertical FOV of the LIDAR device 600 is 110°.
The pin 662 may be configured to mount the mirror 660 to the LIDAR device 600. In turn, the pin 662 can be formed from any material capable of supporting the mirror 660. For example, the pin 662 may be formed from a solid material such as plastic or metal among other possibilities. In some examples, the LIDAR device 600 may be configured to rotate the mirror 660 about the pin 662 for complete rotations to steer emitted light from the optics assembly 610 vertically. In other examples, the LIDAR device 600 may be configured to rotate the mirror 660 about the pin 662 over a given range of angles to steer the emitted light. Thus, in some examples, various vertical FOVs are possible by adjusting the rotation of the mirror 660 about the pin 662.
The motor 670 may include any motor such as a stepper motor, an electric motor, a combustion motor, a pancake motor, and/or a piezoelectric actuator among other possibilities. In some examples, the motor 670 may be configured to rotate various components of the LIDAR device 600 (e.g., optics assembly 610, mirror 660, pin 662, etc.) about an axis of the LIDAR device 600. For example, the axis may be substantially vertical similarly to the y-axis shown in
As shown, the optics assembly 610 comprises a transmit block 620 that includes one light source 622. In some examples, the transmit block 620 may alternatively include more than one light source similarly to the transmit block 220 of the LIDAR device 200. However, for the sake of example, the transmit block 620 includes one light source 622. The light source 622 may be configured to emit one or more light pulses (e.g., laser beams, etc.) towards the transmit lens 652. For example, as shown, emitted light 602a propagates away from the light source 622 towards the transmit lens 652. In some examples, the light source 622 may be similar to the light sources 222a-c of the LIDAR device 200 of
In line with the discussion above, the transmit lens 652 may be configured to collimate the emitted light 602a into one or more collimated light beams 602b and/or may be configured to focus the emitted light 602a as the focused light 602b onto the mirror 660.
As shown, the mirror 660 may be a triangular mirror that has three reflective surfaces 660a, 660b, 660c. However, in other examples, the mirror 660 may alternatively include a greater or fewer number of reflective surfaces. As shown, the collimated light 602b may then reflect off the reflective surface 660a and into the environment of the LIDAR 600 as emitted light 602c. For example, a direction of the emitted light 602c is illustrated by arrow 604. In practice, as the mirror 660 is rotated about an axis defined by the pin 662, the emitted light 602c may be steered to have a different direction than that illustrated by arrow 604. For example, the direction 604 of the emitted light 602c may instead correspond to a different direction along arrow 606. Thus, by rotating the mirror 660 about the pin 662, the LIDAR device 600 may be configured to have a vertical FOV, for example.
Consider by way of example a scenario where the mirror 660 is configured to rotate about an axis defined by the pin 662 continuously in a clockwise direction. In this scenario, the direction 604 of the emitted light 602c may thereby be adjusted also in a clockwise direction as illustrated by the arrow 606 until the focused light 602b is reflecting off an edge of the reflective surface 660a. At this point, the emitted light 602c would be directed towards a maximum extent of the vertical FOV of the LIDAR device 600. Continuing with the scenario, as the mirror 660 continues to rotate, the collimated light 602b may then be focused onto the reflective surface 660b instead of the reflective surface 660a. At this point, the reflected light 602c may be steered to a direction that is towards a minimum extent of the vertical FOV of the LIDAR device 600. Continuing with the scenario, as the mirror 660 continues to rotate, the direction of the emitted light 602c may then be adjusted in a clockwise direction towards the maximum extent of the vertical FOV that corresponds to the light 602b being focused onto another edge of the reflective surface 660b. Similarly, continuing with the scenario, the direction of the emitted light 602c may then be adjusted to scan the vertical FOV of the LIDAR device 600 by reflecting the light 602b off the reflective surface 660c instead of the reflective surface 660b. Through this process, for example, the LIDAR device 600 may continuously scan the vertical FOV. As a variation of the scenario above by way of example, the mirror 660 may be alternatively configured to rotate within a given range of angles (e.g., wobble, etc.) to define a narrower vertical field-of-view than that of the scenario described above. Other configurations for rotation of the mirror 660 are possible as well.
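The scan pattern in the scenario above may be sketched as a mapping from mirror rotation angle to beam direction. The geometry is deliberately simplified, and the function name, facet count, and FOV value are assumptions for illustration only, not part of the disclosure.

```python
# Simplified sketch of how continuous rotation of an N-faceted mirror
# repeatedly sweeps the emitted beam across a vertical FOV; names and
# values are assumed for illustration, not part of the disclosure.
def beam_angle_deg(mirror_angle_deg, num_facets=3, fov_deg=110.0):
    """Map mirror rotation to a repeating sweep of the vertical FOV."""
    facet_span_deg = 360.0 / num_facets          # rotation covered per facet
    phase_deg = mirror_angle_deg % facet_span_deg
    # Each facet sweeps the beam once from the minimum extent of the
    # vertical FOV to the maximum extent, then the next facet takes over.
    return -fov_deg / 2.0 + fov_deg * (phase_deg / facet_span_deg)
```

In this simplified model, a three-faceted mirror sweeps the full vertical FOV three times per revolution.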
As shown, the optics assembly 610 comprises transmit block 620 that includes one light source 622. The light source 622 may be similar to the light sources 222a-c of the LIDAR device 200. In line with the discussion above, in some examples, the transmit block 620 may alternatively include more than one light source. The light source 622 is configured to emit light 602a toward the transmit lens 652. Further, the optics assembly 610 comprises receive block 630 that includes one detector 632. The detector 632 may be similar to the detectors 232a-c of the LIDAR device 200. Again, in some examples, the receive block 630 may alternatively include more than one detector. The detector 632 may be configured to receive light 608 focused by the receive lens 654.
As shown, the optics assembly 610 also includes an optical shield 612. The optical shield 612 may be configured to provide optical shielding between the transmit block 620 and the receive block 630, at least for the light having the source wavelength of the emitted light 602a. In turn, the optical shield 612 may mitigate interference with light 608 detected by the detector 632 of the receive block 630. The optical shield 612 may be formed, for example, as a wall coated by a metal, metallic ink, or metallic foam to provide the shielding. Example metals may include copper or nickel. Other configurations and/or materials are possible as well for the optical shield 612.
As noted in the description of the system 100 of
The optical components (i.e., the transmit block 620 and the receive block 630) of the optics assembly 610 may each have six degrees of freedom (DOF). Three of the six DOF are translational: forward/backward (e.g., linearly along an axis of the optical component that is parallel to the y-axis shown in
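The six DOF enumerated above may be represented, purely for illustration, as a simple pose record. The field names are assumptions chosen for readability and are not part of the disclosure.

```python
# Hypothetical representation of the six degrees of freedom of an
# optical component; field names are assumed for illustration only.
from dataclasses import dataclass

@dataclass
class BlockPose:
    forward: float  # translation along the optical axis
    up: float       # vertical translation
    right: float    # lateral translation
    pitch: float    # rotation tilting the component up/down
    yaw: float      # rotation swiveling the component left/right
    roll: float     # rotation about the optical axis
```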
In line with the discussion above, systems and methods herein allow optics alignment for LIDAR devices having various different configurations. Thus, the configurations of the LIDAR devices 200 and 600 are presented for exemplary purposes only. Other configurations and LIDAR devices are possible as well for the systems and methods herein.
The mounting structure 710 may be formed from any solid material (e.g., metal, plastic, etc.) similarly to the mounting platform 702, and may be shaped to facilitate coupling one or more of the components of the system 700 to the mounting platform 702. As shown, for example, the mounting structure 710 couples the camera 706 and the light filter 708 to the mounting platform 702. However, in some examples, a separate mounting structure may be utilized for each of the camera 706 and the light filter 708.
The actuator 712 may be configured to adjust a position of the light filter 708. Example actuators may include motors, stepper motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and piezoelectric actuators among other possibilities.
The manipulator 762 may include any structure configured to couple the alignment apparatus 760 with one or more components of the LIDAR device 704. In line with the discussion above, the system 700 may adjust a relative position between a transmit block (not shown) and a receive block (not shown) of the LIDAR device 704. For instance, the alignment apparatus 760 (e.g., robotic arm, etc.) may adjust the relative position between the transmit block and the receive block by adjusting the position of the manipulator 762 or changing the orientation of the manipulator 762, among other possibilities.
As shown, the mounting platform 702 includes a plurality of holes, exemplified by holes 702a and 702b. In some embodiments, the various components such as the LIDAR device 704, the camera 706, and/or the light filter 708 may be mounted to the mounting platform by coupling the various components to such holes in the mounting platform (e.g., by fastening a bolt through the holes). In other embodiments, the various components may be mounted to the mounting platform 702 via other processes or devices. In one example, the various components may be mounted to the mounting platform 702 via an application of an adhesive among other possibilities. In another example, a mounting structure may couple one or more of the components to the mounting platform 702. For instance, as shown in
Further, as shown, the LIDAR device 704 has a configuration similar to the LIDAR device 200 of
An example scenario for operation of the system 700 is as follows. First, as shown, the LIDAR device 704 may be mounted to the mounting platform 702 to have a field-of-view (FOV) through which light emitted by a transmit block (not shown) of the LIDAR device 704 and light received by a receive block (not shown) of the LIDAR device 704 propagates. For instance, the LIDAR device 704 may be mounted to the mounting platform 702 using the alignment apparatus 760 (e.g., robotic arm, etc.) or any other device (e.g., mechanical tool, etc.). Next, the camera 706 may be located at a given position relative to the LIDAR device 704 such that the camera 706 may obtain images of the receive block (not shown) of the LIDAR device 704 and/or light emitted by the transmit block (not shown) of the LIDAR device 704.
Continuing with the scenario, the camera 706 may then be focused to infinity for the source wavelength (e.g., 905 nm) of the light emitted by the LIDAR device 704. Next, the light filter 708 may be positioned at a first position to be interposed between the LIDAR device 704 and the camera 706 along the FOV of the LIDAR device 704. For instance, the actuator 712 may be configured to move the light filter 708 to the first position shown in
Continuing with the scenario, the system 700 may then cause the transmit block (not shown) of the LIDAR device 704 to emit one or more light beams through the light filter 708 and toward the camera 706. Referring back to
Continuing with the scenario, the system may then cause the actuator 712 to move the light filter 708 to a second position where the light filter 708 is not interposed between the camera 706 and the LIDAR 704.
Continuing with the scenario, the system 700 may then determine at least one offset based on the first image and the second image. In one instance, the at least one offset may include distances between adjacent light sources and/or adjacent detectors of the LIDAR device 704. In another instance, the at least one offset may include distances between light beams emitted by light sources in the LIDAR device 704 and corresponding detectors of the LIDAR device 704. Other offsets are possible as well and are described in greater detail within exemplary embodiments herein.
In line with the discussion above, the alignment apparatus 760 (e.g., robotic arm, etc.) may couple to the transmit block (not shown) and/or the receive block (not shown) of the LIDAR device 704 via the manipulator 762. Thus, in the scenario, the alignment apparatus 760 may then adjust the relative position between the transmit block and the receive block in accordance with the determined at least one offset.
As shown in
Similarly to the description of the receive block 630 of
Continuing with the scenario, the system 700 may adjust the position of the receive block 730, when coupled to the manipulator 762, by adjusting the forward/backward position, the up/down position, the left/right position, the pitch, the yaw, and/or the roll of the receive block 730 in line with the discussion above. In turn, the system 700 may adjust the relative position between the transmit block (not shown) of the LIDAR 704 and the receive block 730. In some embodiments, additionally or alternatively to the adjustments to the position/orientation of the receive block 730, the manipulator 762 may adjust the position/orientation of the transmit block (not shown) of the LIDAR device 704 in a similar manner. Through this process, for example, the system 700 may align one or more light sources of the LIDAR device 704 with one or more detectors of the LIDAR device 704.
In some embodiments, the system 700 may then decouple the manipulator 762 from the receive block 730 (or the transmit block), and the receive block 730 may be configured to remain at the alignment (e.g., relative position) performed by the alignment apparatus 760. By way of example, the system 700 may apply an epoxy or other adhesive to a periphery of the receive block 730 to maintain the receive block 730 at the aligned relative position to the transmit block of the LIDAR device 704. In one instance, the manipulator 762 may remain coupled to the receive block 730 until the adhesive is cured. However, other processes are possible as well for maintaining the relative position between the transmit block and the receive block 730 of the LIDAR device 704. For instance, the receive block 730 may be fastened to a housing of the LIDAR device 704 using bolts, screws, or any other device among other possibilities.
It is noted that the scenario presented above is for exemplary purposes only. Other scenarios are possible as well that may include some or all of the components of the system 700, or may include other processes than those described. A non-exhaustive list of example variations of the scenario is presented below.
In a first example, the system 700 may not include the light filter 708. For instance, the camera 706 may be configured to obtain the first image and the second image without the light filter 708 being interposed between the camera 706 and the LIDAR device 704.
In a second example, the camera 706 and the light filter 708 may be mounted separately from the LIDAR device 704. For instance, the system 700 may be implemented in an assembly line where multiple LIDAR devices such as the LIDAR device 704 are located on an assembly belt. In this instance, the camera 706, the light filter 708 and the robotic arm 760 may be mounted independently adjacent to the assembly belt, and may perform similar functions to the scenario described above to calibrate each LIDAR device in the assembly line.
In a third example, the receive block 730 may be illuminated by an auxiliary light source (not shown) similar to the auxiliary light source 170. For instance, the light filter 708 may remain interposed between the LIDAR device 704 and the camera 706 while the first image of the laser beams emitted by the LIDAR device 704 is captured by the camera 706, and while the second image of the receive block 730 is captured as well. In this instance, the receive block 730 would be visible to the camera 706 through the light filter 708 due to the illumination by the auxiliary light source with light at the source wavelength that is viewable through the light filter 708.
In a fourth example, the LIDAR device 704 may be configured to continue emitting light while the camera 706 captures the first image and the second image. For instance, the light filter 708 may alternatively attenuate the light beams having the source wavelength emitted by light sources of the LIDAR device 704. Thus, in this instance, the light filter 708 may be positioned as shown in
In a fifth example, the camera 706 may be configured to capture a single image instead of the first image and the second image. The single image may be indicative of both the light source positions of light sources in the LIDAR device 704, and the detector positions of detectors in the receive block 730. Referring back to
As shown, the LIDAR device 804 has a configuration similar to the LIDAR device 600 of
As shown, the light filter 808 is interposed between the LIDAR device 804 and the camera 806. Additionally, the camera 806 is located at a given position at which the camera 806 can image light beam(s) emitted by the LIDAR device 804 and can image detector(s) (not shown) in the LIDAR device 804. Referring back to
However, as shown, the system 800 does not include an actuator (e.g., actuator 712 of system 700) to move the light filter 808. Instead, the system 800 includes the auxiliary light source 870 to illuminate a receive block (not shown) of the LIDAR device 804 with light having the source wavelength of light emitted by light source(s) (not shown) in the LIDAR device 804. For instance, such illumination may allow detectors in the receive block of the LIDAR device 804 to be viewable by the camera 806 through the light filter 808. Although the auxiliary light source 870 is shown to be mounted separately from other components of the system 800, in some examples, the auxiliary light source 870 may be alternatively mounted to any of the components of the system 800, such as the alignment apparatus 860, the light filter 808, the mounting platform 802, etc. In one example, the auxiliary light source 870 may be alternatively mounted inside the LIDAR device 804. Referring back to
As shown, the manipulator 862 has a different shape and structure than the manipulator 762 of the system 700. Referring back to
Thus, in some examples, the system 800 may perform similar functions as those described for the system 100 and the system 700, but may include some variations suitable for other LIDAR device configurations, such as the configuration of the LIDAR device 804.
In addition, for the method 900 and other processes and methods disclosed herein, the flowchart shows functionality and operation of one possible implementation of present embodiments. In this regard, each block may represent a module, a segment, a portion of a manufacturing or operation process, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive. The computer readable medium may include non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM). The computer readable medium may also include non-transitory media, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media may also be any other volatile or non-volatile storage systems. The computer readable medium may be considered a computer readable storage medium, for example, or a tangible storage device.
In addition, for the method 900 and other processes and methods disclosed herein, each block in
At block 902, the method 900 involves mounting a transmit block and a receive block in a light detection and ranging (LIDAR) device to provide a relative position between the transmit block and the receive block. The transmit block may include one or more light sources configured to emit light at a source wavelength. The receive block may include one or more detectors configured to detect light at the source wavelength. In one embodiment, the source wavelength is 905 nm. In other embodiments, the source wavelength may be any other wavelength (e.g., infrared, ultraviolet, x-ray, visible, etc.).
By way of example, the transmit block and the receive block may be mounted by a robotic arm (e.g., alignment apparatuses 160, 760, 860, etc.) into a housing (e.g., housing 210, optics assembly 610, etc.) to provide the relative position. In this example, the relative position may be similar to the relative position between transmit block 220 and receive block 230 of
In some examples, the LIDAR device may include a lens configured to (i) collimate light emitted from the one or more light sources and (ii) focus light onto the one or more detectors, similarly to the lens 250 of the LIDAR device 200. In other examples, the LIDAR device may include a transmit lens for collimation of emitted light and a receive lens for focusing received light, similarly to the transmit lens 652 and the receive lens 654 of the LIDAR device 600.
At block 904, the method 900 involves locating a camera at a given position at which the camera, when focused at infinity, can image light beams emitted by the one or more light sources and can image the one or more detectors. By way of example, the given position may be similar to the position of the camera 706 of the system 700. For instance, a field-of-view of the camera may be aligned with a FOV of the LIDAR device where light emitted from the one or more light sources propagates away from the LIDAR device.
At block 906, the method 900 involves obtaining a first image indicative of light source positions of the one or more light sources. The first image, for example, may be obtained using the camera located at the given position and focused at infinity. Referring back to
In some examples, the method 900 may also involve obtaining the first image while the one or more light sources are emitting light at the source wavelength. For instance, a system performing the method 900 may provide power and/or instructions to the LIDAR device to emit the light at the source wavelength, and may provide instructions to the camera to capture the first image while the one or more light sources are emitting the light. Further, in some instances, the system may provide instructions to the camera to adjust the focus to infinity for the source wavelength. Alternatively, for instance, the camera may be configured to have the focus prior to capturing the first image.
In some examples, the method 900 may also involve obtaining the first image while a light filter is interposed between the camera and the one or more light sources. The light filter may be similar to the light filters 108, 708, or 808. In one example, the light filter may be configured to attenuate light having wavelengths other than the source wavelength. In this example, the first image may be more suitable for representing features of the emitted light beams at the source wavelength. In another example, the light filter may be configured to attenuate light within a wavelength range that includes the source wavelength. In this example, the light filter may reduce the intensity of the emitted light to protect components of the camera. Additionally or alternatively, in this example, the light filter may reduce an amount of light propagating toward the camera that has wavelengths proximal to the source wavelength. In turn, for instance, pixels in the first image representing the emitted light beams having the source wavelength may be easily contrasted from surrounding pixels having proximal wavelengths.
In some examples, the method 900 may also involve detecting a defect in a light source based on the first image. In one example, the first image may indicate that one or more of the light sources in the transmit block has a different intensity, brightness, color, or other characteristic compared to other light sources. For instance, a system performing method 900 may compare pixel properties (e.g., brightness, intensity, color, etc.) associated with one light source against other pixel properties in the first image associated with another light source. Alternatively, for instance, the system may compare the pixel properties of the light source with pre-determined pixel properties or pixel properties in a stored image among other possibilities.
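The comparison of pixel properties described above may be sketched as follows. The 20% tolerance, data layout, and function name are assumptions for illustration only and are not part of the disclosure.

```python
# Hypothetical sketch of the defect check described above: compare each
# light source's mean pixel brightness against the mean over all sources.
# The 20% tolerance is an assumed value, not part of the disclosure.
def flag_defective_sources(brightness_by_source, tolerance=0.2):
    """Return indices of sources whose brightness deviates from the mean."""
    mean = sum(brightness_by_source) / len(brightness_by_source)
    return [i for i, b in enumerate(brightness_by_source)
            if abs(b - mean) > tolerance * mean]
```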
In some examples, the method 900 may also involve detecting an aberration in an optical element optically coupled to a light source based on the first image. Referring back to
At block 908, the method 900 involves obtaining a second image indicative of detector positions of the one or more detectors. Similarly to the first image, the second image may be obtained using the camera located at the given position and focused at infinity. Referring back to
In some examples, the method 900 may also involve obtaining the second image while the one or more light sources are not emitting light at the source wavelength. For instance, a system performing the method 900 may reduce power to the one or more light sources and/or provide instructions to the LIDAR device to stop emitting light beams.
However, in other examples, the method 900 may involve obtaining the second image while the one or more light sources are emitting the light at the source wavelength. In one instance, a light filter may be interposed between the camera and the LIDAR device while the camera is capturing the second image, and the light filter may be configured to attenuate light at the source wavelength emitted by the one or more light sources. In this instance, the system performing the method 900 may then obtain the second image indicative of the detector positions while the light filter attenuates the light beams emitted by the one or more light sources. In another instance, the second image may indicate both the light source positions and the detector positions since the one or more light sources are emitting the light at the source wavelength when the second image is obtained. Other examples are possible as well.
In some examples, the method 900 may also involve obtaining the second image while the one or more detectors are illuminated with light at the source wavelength from an auxiliary light source. The auxiliary light source may be similar to the auxiliary light sources 170 and 870 included, respectively, in the systems 100 and 800. In one example, the camera may be focused at infinity for the source wavelength. In another example, a light filter interposed between the camera and the LIDAR device may be configured to attenuate light having wavelengths other than the source wavelength. In both examples, the auxiliary light source may illuminate the one or more detectors such that reflections of the illuminating light having the source wavelength are viewable by the camera when capturing the second image.
At block 910, the method 900 involves determining at least one offset based on the light source positions indicated by the first image and the detector positions indicated by the second image. In one example, the at least one offset may include distances between adjacent regions of the first image that are associated with particular light sources. In another example, the at least one offset may include distances between adjacent regions of the second image that are associated with particular detectors. In yet another example, the at least one offset may include an offset between a region of the first image associated with a given light source, and a corresponding region of the second image associated with a given detector. The offset in the third example may have a horizontal component and a vertical component, or may just be a distance between the respective regions (e.g., number of pixels). As a variation of the third example, the offset may also include a depth component where the camera is configured to obtain 3D images, for instance. Other offsets are possible as well.
In some examples, the method 900 may also involve generating a composite image based on overlaying the first image and the second image. In these examples, the at least one offset may be determined based on separation between one or more pixels in the composite image associated with a light source and one or more pixels in the composite image associated with a corresponding detector.
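The offset determination described at blocks 910 and above can be sketched in code. The following is a minimal illustration, not the patented method itself: it assumes each image is a grayscale array in which a light source or detector appears as a bright spot, finds the intensity-weighted centroid of that spot in each image, and reports the pixel separation; a composite image is formed by a simple per-pixel maximum of the two images. All function names are hypothetical.

```python
import numpy as np

def spot_centroid(image, threshold=0.5):
    """Return the (row, col) intensity-weighted centroid of the bright
    region in a grayscale image array, or None if no pixel exceeds the
    given fraction of the image maximum."""
    mask = image >= threshold * image.max()
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    weights = image[rows, cols]
    return (np.average(rows, weights=weights),
            np.average(cols, weights=weights))

def pixel_offset(first_image, second_image):
    """Offset (d_row, d_col) between the light-source spot in the first
    image and the detector spot in the second image, in pixels."""
    src = spot_centroid(first_image)
    det = spot_centroid(second_image)
    return (det[0] - src[0], det[1] - src[1])

def composite(first_image, second_image):
    """Overlay the two images so the separation between the source spot
    and the detector spot is visible in a single image."""
    return np.maximum(first_image, second_image)
```

The same offset could equally be computed from the composite image by locating both spots within it, in line with the pixel-separation discussion above.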
At block 912, the method 900 includes adjusting a relative position between the transmit block and the receive block based at least in part on the at least one offset. By way of example, a robotic arm or other device (e.g., alignment apparatuses 160, 760, 860, etc.) may couple to the transmit block and/or the receive block to adjust the relative position. The robotic arm, for instance, may translate the coupled component linearly and/or rotate the coupled component about an axis in line with the discussion for the systems 100, 700, and 800.
Accordingly, in some examples, the method 900 may also involve adjusting the relative position between the transmit block and the receive block by rotating the receive block about an axis. Further, in some examples, the method 900 may also involve adjusting the relative position by rotating the transmit block about an axis.
In some examples, adjusting the relative position between the transmit block and the receive block at block 912 reduces the at least one offset. For instance, where the at least one offset includes an offset between a light source and a corresponding detector, the adjustment of the relative position may reduce the offset to align the light source with the detector. In other examples, adjusting the relative position between the transmit block and the receive block at block 912 causes the at least one offset to correspond to a particular offset. For instance, where the at least one offset includes an offset between two adjacent light sources, the adjustment of the relative position may cause the offset to correspond to the particular offset.
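The measure-then-adjust behavior at block 912 can be viewed as a closed loop: measure the residual offset from the images, command the alignment apparatus to move opposite the residual, and repeat until the offset is within tolerance of the target. The sketch below assumes a caller-supplied `measure_offset()` that returns the current pixel offset and a `translate(dx, dy)` that commands the apparatus; both names and the pixel-space control convention are illustrative, not from the source.

```python
def align(measure_offset, translate, target=(0.0, 0.0), tol=0.5, max_iters=20):
    """Iteratively adjust the relative position until the measured offset
    is within `tol` pixels of the target offset (the target may be
    nonzero, per the 'particular offset' case described above).
    Returns True if alignment converged within max_iters iterations."""
    for _ in range(max_iters):
        dx, dy = measure_offset()
        ex, ey = dx - target[0], dy - target[1]
        if abs(ex) <= tol and abs(ey) <= tol:
            return True          # residual offset within tolerance
        translate(-ex, -ey)      # move opposite the residual error
    return False
```

In practice the mapping from pixel offset to physical translation would include a calibration scale factor; the sketch assumes a unit gain for brevity.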
However, in some examples, light sources and detectors in a LIDAR device may not be aligned with one another due to manufacturing/assembly variability or other factors.
Although images 1200-1800 illustrate composite images overlaying a first image (e.g., image 1000, etc.) with a second image (e.g., image 1100), in some examples, the present method may determine the various offsets described above for images 1200-1800 without overlaying the two images. For instance, a computing device herein may determine the various offsets by comparing pixel locations in the first image 1000 with corresponding pixel locations in the second image 1100. Other image processing techniques are possible as well (e.g., filtering, transforms, etc.) for determining the at least one offset described at block 910 of the method 900.
Further, although images 1300-1800 illustrate scenarios where the light sources and the detectors are offset in only one DOF, in some examples, the light sources and the detectors may be offset in more than one DOF. For instance, a LIDAR device may have light sources and detectors that are offset in both the forward/backward DOF described in image 1500 and the roll DOF described in image 1600. Other offset combinations are possible as well.
Further, although images 1000-1800 represent light sources and detectors of a LIDAR device having a similar configuration to the LIDAR device 200, in some examples, similar images may be generated for any other LIDAR device configuration, such as the configuration of the LIDAR device 600 among other possibilities.
In addition to alignment of light sources and detectors of a LIDAR device, in some examples, the present method may facilitate diagnosis of the various components of the LIDAR device. As an example,
At block 2002, the method 2000 involves obtaining one or more images using a camera located at a given position at which the camera can image light beams emitted by one or more light sources in a transmit block of a LIDAR device and can image one or more detectors in a receive block of the LIDAR device. In some examples, the one or more images may include a single image that indicates both light source positions of the one or more light sources and detector positions of the one or more detectors. For instance, the single image may be similar to images 1200-1800 shown in
In some examples, the method 2000 may also involve causing an actuator to move a light filter to a first position where the light filter is interposed between the camera and the LIDAR device, obtaining a first image indicative of light source positions of the one or more light sources while the light filter is at the first position, causing the actuator to move the light filter to a second position where the light filter is outside a field-of-view of the camera, and obtaining a second image indicative of detector positions of the one or more detectors while the light filter is at the second position. Referring back to
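The filter-actuation sequence above (move filter in, capture first image, move filter out, capture second image) can be sketched as a small control routine. The `FilterActuator` class and `capture()` callback here are mock stand-ins for the actuator and camera hardware, with position 1 denoting the filter interposed between the camera and the LIDAR device and position 2 denoting the filter outside the camera's field of view; all names are hypothetical.

```python
class FilterActuator:
    """Mock actuator that moves a light filter between position 1 (in
    front of the camera) and position 2 (outside its field of view)."""
    def __init__(self):
        self.position = 2  # filter starts out of the field of view

    def move_to(self, position):
        self.position = position

def capture_alignment_images(actuator, capture):
    """Capture the first image (light source positions, filter interposed)
    and the second image (detector positions, filter removed).
    `capture()` is assumed to return the camera's current frame."""
    actuator.move_to(1)       # filter between camera and LIDAR device
    first_image = capture()   # attenuated beams -> light source positions
    actuator.move_to(2)       # filter outside the field of view
    second_image = capture()  # detector positions visible
    return first_image, second_image
```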
At block 2004, the method 2000 involves adjusting a relative position between the transmit block and the receive block based on the one or more images. For instance, the adjustment at block 2004 may be similar to the adjustments described at block 912 of the method 900, along some or all of the six DOF (e.g., up/down, left/right, forward/backward, roll, yaw, pitch) of the transmit block, the receive block, or both. Through this process, for instance, the present method may align the one or more light sources with the one or more detectors.
In some scenarios, the process described for yaw or pitch alignment in the description of images 1700 and 1800 is less suitable than other methods herein. As an example, the variation in distances between adjacent light sources indicated by a first image (e.g., image 1000) and the variation in distances between adjacent detectors indicated by a second image (e.g., image 1100) may be insufficient for detection of yaw or pitch offsets between the transmit block and the receive block. For instance, the one or more detectors (or the one or more light sources) may be arranged close to one another. As another example, the transmit block and the receive block may only include, respectively, one light source and one detector, similarly to the transmit block 620 and the receive block 630 shown in
In one embodiment, the receive block may be coupled to a half-mirror positioned along a receive path of the receive block. Referring back to
In another embodiment, the method 2000 may additionally or alternatively involve actuating at least two probes adjacent to the receive block toward the receive block. A given probe (e.g., force sensor, proximity sensor, etc.) may be configured to provide a signal indicative of contact (or proximity) between the given probe and the receive block. In this embodiment, adjusting the relative position may comprise rotating the receive block (e.g., adjusting yaw, pitch, or roll of the receive block) such that at least two signals from the at least two probes are provided at a substantially same time. For instance, the at least two probes may have a particular yaw or pitch substantially similar to the yaw or pitch of the transmit block, and thus by providing the at least two signals at the substantially same time, the yaw or pitch of the receive block may also correspond to the yaw or pitch of the transmit block.
As shown, the LIDAR device 2104 has a configuration similar to the LIDAR device 200. For instance, as shown, the LIDAR device 2104 includes a transmit block 2120, light sources 2122a-c, receive block 2130, detectors 2132a-c, an exit aperture 2134, and a lens 2150 that are similar, respectively, to the transmit block 220, the light sources 222a-c, the receive block 230, the detectors 232a-c, the exit aperture 234, and the lens 250 of the LIDAR device 200. However, in some examples, the system 2100 may be adapted for use with other LIDAR device configurations such as the configuration of the LIDAR device 600, etc. Further, as shown, the system 2100 includes an actuator 2190 and probes 2192-2194.
The actuator 2190 may be configured to move the probes 2192 and 2194 toward the receive block 2130 in line with the discussion above at block 2004 of the method 2000. Example actuators may include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and piezoelectric actuators among other possibilities.
The probes 2192 and 2194 may include any probes suitable for detection of the receive block 2130 in line with the discussion above at block 2004 of the method 2000. In one example, the probes 2192-2194 may include force sensors that provide a signal if the probes 2192-2194 contact the receive block 2130. In another example, the probes 2192-2194 may include proximity sensors (e.g., IR range sensors, etc.) that provide a signal if the probes 2192-2194 are within a threshold distance to the receive block 2130.
In the illustration of
In line with the discussion above, the system 2100 provides an example embodiment for the rotational position alignment described at block 2004 of the method 2000. Consider an example scenario where the probes 2192-2194 have a particular yaw (i.e., amount of rotation about an axis pointing out of the page) as shown in
Continuing with the example scenario, the actuator 2190 may move the probes 2192-2194 toward the receive block 2130 in line with the discussion at block 2004 of the method 2000. On one hand, if the two probes 2192-2194 detect the receive block 2130 (e.g., provide a signal, etc.) at a substantially similar time, then the system 2100 may determine that the yaw of the receive block 2130 corresponds to the yaw of the probes 2192-2194, and therefore also corresponds to the yaw of the transmit block 2120. On the other hand, if the two probes 2192-2194 detect the receive block 2130 at substantially different times, then the system 2100 may determine that the yaw of the receive block 2130 does not correspond to the yaw of the probes 2192-2194, and therefore does not correspond to the yaw of the transmit block 2120. In this case, the system 2100 may then perform the adjustment of the relative position between the transmit block 2120 and the receive block 2130 in line with the discussion at block 912 of the method 900. Through this process, for example, the transmit block 2120 and the receive block 2130 may be aligned with one another, at least with regard to the yaw DOF.
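The two-probe contact-time test above admits a simple numerical sketch. Under the illustrative assumptions that the probes advance at a known speed and are spaced a known distance apart along the face of the receive block, a difference in contact times implies the face is rotated by roughly atan(speed x dt / separation); both helper names and the tolerance value are hypothetical, not from the source.

```python
import math

def yaw_matches(t_a, t_b, tolerance_s=1e-3):
    """True if the two probe contact signals arrive at substantially the
    same time, i.e. the receive block's yaw corresponds to the probes'
    (and hence the transmit block's) yaw."""
    return abs(t_b - t_a) <= tolerance_s

def yaw_error_from_probes(t_a, t_b, probe_speed, probe_separation):
    """Estimate the yaw error (radians) of the receive block face from
    the probe contact times. Probes advance at probe_speed (m/s) and are
    spaced probe_separation (m) apart; positive result means probe A
    contacted first."""
    dt = t_b - t_a
    return math.atan2(probe_speed * dt, probe_separation)
```

The same geometry applies to pitch when the probes are arranged vertically, as in the variation described below.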
As a variation of the example scenario, the probes 2192-2194 may be arranged vertically (e.g., along an axis pointing out of the page) instead of the arrangement shown in
As a further variation of the example scenarios above, the probes 2192-2194 may be alternatively moved by the actuator 2190 toward the transmit block 2120 instead of the receive block 2130. In this scenario, the system 2100 may adjust the position of the transmit block 2120 to align the pitch and/or yaw of the transmit block 2120 with the pitch and/or yaw of the receive block 2130.
As shown, the camera 2206 includes a lens 2210 and light sources 2212, 2214, 2216, and 2218. However, in some examples, the camera 2206 may include additional, fewer, or different components than those shown. In one example, the camera 2206 may alternatively not include the lens 2210. For instance, the camera 2206 in this example may be an image sensor configured to capture images without use of the lens 2210, among other possibilities. In another example, the camera 2206 may be alternatively configured not to include the light sources 2212-2218, or may be configured to include additional or fewer light sources than the four light sources 2212-2218 shown, in accordance with the discussion at block 2004 of the method 2000.
The lens 2210 may include one or more optical elements (e.g., convex lens, concave lens, Fresnel lens, mirror, etc.) arranged to modify, condition, focus, and/or redirect light incident on the lens 2210 toward imaging sensors (not shown) of the camera 2206. In some examples, the lens 2210 may be configured to provide an infinity focus for incident light having a source wavelength. In one embodiment, the source wavelength is 905 nm. However, other source wavelengths are possible as well (e.g., infrared, ultraviolet, x-ray, visible, etc.).
The light sources 2212-2218 may include laser diodes, light emitting diodes (LEDs), vertical cavity surface emitting lasers (VCSELs), organic light emitting diodes (OLEDs), polymer light emitting diodes (PLEDs), light emitting polymers (LEPs), liquid crystal displays (LCDs), microelectromechanical systems (MEMS), filament light sources, or any other device configured to selectively transmit, reflect, and/or emit light propagating away from the camera 2206. In some examples, the light sources 2212-2218 may be configured to emit light at the source wavelength of light emitted by the LIDAR device (not shown) imaged by the camera 2206. Referring back to
As shown, the light sources 2212-2218 are positioned along a periphery of the camera lens 2210. However, in some examples, the light sources 2212-2218 may be alternatively positioned at a different location. In one example, referring back to
Thus, in some examples, the light sources 2212-2218 may be utilized by a system of the present method to determine the at least one offset in line with the discussion at block 910 of the method 900, and therefore facilitate the adjustment of the relative position between a transmit block and a receive block of a LIDAR device in line with the discussion at block 912 of the method 900. In one example, the light sources 2212-2218 may facilitate alignment of the rotational position (e.g., yaw, pitch, roll) of a transmit block and a receive block in line with the discussion at block 2004 of the method 2000.
As an example scenario for rotational position alignment using the camera 2206, suppose that the camera 2206 corresponds to the camera 706 of the system 700. In the scenario, the light sources 2212-2218 may be configured to emit light that propagates through the light filter 708 (e.g., if the light filter 708 is interposed between the camera 2206 and the LIDAR device 704 as illustrated in
Continuing with the scenario, a system of the present method may then analyze the image, in line with the discussion for images 1700-1800, using the properties (e.g., position, shape, intensity, etc.) of the light beams originating from the light sources 2212-2218 to determine yaw or pitch offsets between the transmit block and the receive block.
As a variation of the scenario above, suppose that the light sources 2212-2218 are alternatively positioned along a periphery of the transmit block 220 of the LIDAR device 200. In this scenario, four mirrors may be positioned at the positions shown for the light sources 2212-2218. In turn, the four mirrors may reflect the light from the light sources 2212-2218 towards the receive block 230 of the LIDAR device 200, and the camera 2206 may capture an image of reflections of that light off the exit aperture 234 (e.g., half-mirror) to determine a possible yaw/pitch offset between the transmit block 220 and the receive block 230.
As another variation of the scenario, suppose that the light sources 2212-2218 are alternatively positioned along a periphery of the receive block 230 of the LIDAR device 200. In this scenario, a system of the present method may analyze an image of light from the light sources 2212-2218 in line with the discussion for images 1600-1800 to determine the rotational position (e.g., roll, yaw, pitch) of the receive block.
As yet another variation of the scenario, the light sources 2212-2218 may be alternatively positioned along a periphery of the transmit block 220 of the LIDAR device 200, and four additional similar light sources may be positioned along the periphery of the receive block 230 of the LIDAR device 200. Similarly here, one or more images of the light from the light sources 2212-2218 and the four additional light sources may be obtained from the camera 2206 and analyzed by the system of the present method to determine rotational offsets (e.g., roll, yaw, pitch) between the transmit block and the receive block. Thus, various configurations and positions are possible for the light sources 2212-2218 to facilitate optics alignment in line with the discussion above.
It should be understood that arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g. machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location, or other structural elements described as independent structures may be combined.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
Claims
1. A system comprising:
- one or more light sources of a light detection and ranging (LIDAR) device configured to emit one or more light beams;
- one or more detectors of the LIDAR device configured to receive a light beam emitted from a corresponding light source of the one or more light sources upon the light beam being reflected from an object in an environment of the LIDAR device;
- a transmit lens of the LIDAR device configured to focus light beams emitted by the one or more light sources of the LIDAR device;
- a receive lens of the LIDAR device configured to focus light beams reflected from objects in the environment of the LIDAR device onto the one or more detectors of the LIDAR device; and
- an alignment apparatus configured to: adjust a position of at least one of the one or more light sources relative to the transmit lens or the receive lens by effecting a relative rotation between the at least one light source and the transmit lens or the receive lens in one of three rotational degrees of freedom; or adjust a position of at least one of the one or more detectors relative to the transmit lens or the receive lens by effecting a relative rotation between the at least one detector and the transmit lens or the receive lens in one of three rotational degrees of freedom.
2. The system of claim 1, further comprising a mounting platform, wherein the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device are mounted to the mounting platform.
3. The system of claim 2, wherein the mounting platform comprises plastics or metals.
4. The system of claim 2, wherein the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device are removably mounted to the mounting platform.
5. The system of claim 4, wherein the alignment apparatus is further configured to mount and unmount the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device to the mounting platform.
6. The system of claim 1, wherein the alignment apparatus comprises a robotic arm.
7. The system of claim 6, wherein the alignment apparatus is further configured to:
- translate at least one of the one or more light sources forward or backward along an axis perpendicular to the transmit lens or the receive lens; or
- translate at least one of the one or more detectors forward or backward along an axis perpendicular to the transmit lens or the receive lens.
8. The system of claim 1, wherein the alignment apparatus is further configured to:
- translate at least one of the one or more light sources up or down along an axis parallel to the transmit lens or the receive lens; or
- translate at least one of the one or more detectors up or down along an axis parallel to the transmit lens or the receive lens.
9. The system of claim 1, wherein the alignment apparatus is further configured to:
- translate at least one of the one or more light sources left or right along an axis parallel to the transmit lens or the receive lens; or
- translate at least one of the one or more detectors left or right along an axis parallel to the transmit lens or the receive lens.
10. The system of claim 1, wherein the alignment apparatus is configured to:
- independently translate at least one of the one or more light sources forward or backward along an axis perpendicular to the transmit lens or the receive lens; and
- independently translate at least one of the one or more detectors forward or backward along an axis perpendicular to the transmit lens or the receive lens.
11. A method comprising:
- emitting, by one or more light sources of a light detection and ranging (LIDAR) device, one or more light beams;
- focusing, by a transmit lens of the LIDAR device, light beams emitted by the one or more light sources of the LIDAR device;
- receiving, by one or more detectors of the LIDAR device, a light beam emitted from a corresponding light source of the one or more light sources upon the light beam being reflected from an object in an environment of the LIDAR device;
- focusing, by a receive lens of the LIDAR device, light beams reflected from objects in the environment of the LIDAR device onto the one or more detectors of the LIDAR device; and
- adjusting, by an alignment apparatus: a position of at least one of the one or more light sources relative to the transmit lens or the receive lens by effecting a relative rotation between the at least one light source and the transmit lens or the receive lens in one of three rotational degrees of freedom; or a position of at least one of the one or more detectors relative to the transmit lens or the receive lens by effecting a relative rotation between the at least one detector and the transmit lens or the receive lens in one of three rotational degrees of freedom.
12. The method of claim 11, further comprising mounting the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device to a mounting platform.
13. The method of claim 12, wherein the mounting platform comprises plastics or metals.
14. The method of claim 12, further comprising removably mounting the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device to the mounting platform.
15. The method of claim 14, further comprising mounting or unmounting, by the alignment apparatus, the one or more light sources of the LIDAR device and the one or more detectors of the LIDAR device to the mounting platform.
16. The method of claim 11, wherein the alignment apparatus comprises a robotic arm.
17. The method of claim 16, further comprising translating, by the alignment apparatus:
- at least one of the one or more light sources forward or backward along an axis perpendicular to the transmit lens or the receive lens; or
- at least one of the one or more detectors forward or backward along an axis perpendicular to the transmit lens or the receive lens.
18. The method of claim 11, further comprising translating, by the alignment apparatus:
- at least one of the one or more light sources up or down along an axis parallel to the transmit lens or the receive lens; or
- at least one of the one or more detectors up or down along an axis parallel to the transmit lens or the receive lens.
19. The method of claim 11, further comprising translating, by the alignment apparatus:
- at least one of the one or more light sources left or right along an axis parallel to the transmit lens or the receive lens; or
- at least one of the one or more detectors left or right along an axis parallel to the transmit lens or the receive lens.
20. The method of claim 11, further comprising independently translating, by the alignment apparatus:
- at least one of the one or more light sources forward or backward along an axis perpendicular to the transmit lens or the receive lens; and
- at least one of the one or more detectors forward or backward along an axis perpendicular to the transmit lens or the receive lens.
Type: Application
Filed: Oct 16, 2023
Publication Date: Feb 8, 2024
Inventors: Gaetan Pennecot (San Francisco, CA), Pierre-Yves Droz (Los Altos, CA)
Application Number: 18/487,538