VEHICLE MONITORING CAMERA AND VEHICLE MONITORING CAMERA SYSTEM
A monitoring camera for a vehicle acquires images of a distant area, a vicinity area and a near area in front or rear of the vehicle. The monitoring camera can include an imaging element and an image forming lens comprising multiple lenses each having a different focal length. Each image of an object passing through a different one of the lenses is formed on a corresponding different region of an imaging surface of the imaging element.
This application claims priority from Japanese Patent Application No. 2009-110450, filed on Apr. 30, 2009, the entire contents of which are incorporated herein by reference.
BACKGROUND
1. Technical Field
The present disclosure relates to a monitoring camera which detects an object such as another vehicle or an obstacle in the vicinity of a vehicle, and a vehicle monitoring camera system which is capable of controlling a light distribution of a lamp or a vehicle traveling safety assistant based on the detected object.
2. Related Art
Recently, a technology has been proposed that obtains an image of a peripheral area of a subject vehicle using a monitoring camera, detects an object such as another vehicle or an obstacle in the vicinity of the subject vehicle, and displays the object on a monitoring device or controls a light intensity or a light distribution of a vehicle lamp, such as a head lamp or a rear lamp, in accordance with the detected objects. In this kind of monitoring camera, it is desirable to monitor a wide range in the vicinity of the vehicle, but doing so requires mounting multiple monitoring cameras, each of which obtains images of a different respective area around the vehicle. One problem is that the manufacturing cost of such a configuration is relatively high. For this reason, there has been proposed a monitoring camera capable of monitoring multiple areas using one camera. For example, JP-A-2007-13549 proposes a monitoring camera that simultaneously obtains images of three areas (i.e., a left area, a right area, and a down area) in the vicinity of the vehicle by using one imaging element. The monitoring camera is located in a rear portion of the vehicle, and an imaging lens and an imaging element are provided in the monitoring camera so as to obtain images of objects. Also, a mirror is provided in the monitoring camera so as to divide the imaging area by three reflection surfaces of the mirror. The three reflection surfaces of the mirror have reflection optical axes respectively facing a left direction, a right direction, and a downward direction.
In the monitoring camera described in JP-A-2007-13549, it is possible to obtain images of three areas using one imaging element. However, since an image of each area is formed on the imaging element using one imaging lens, the imaging magnification of each area is the same (i.e., a close object appears large, and a distant object appears small). Thus, when the images of the objects are displayed on the monitoring device, it is difficult to recognize the distant object, which appears too small. In addition, when the object is image-recognized based on an imaging signal of the object, since the distant object appears small, the effective resolution of the imaging element for that object is low. Hence, in some cases, the distant object may not be detected through the image recognition. As a result, the distant area often cannot be monitored sufficiently, and the monitoring camera cannot fully perform its function.
On the other hand, when an imaging lens having a long focal length is used, it is possible to obtain an image of the distant object so that it appears large, and also to increase the effective resolution of the imaging element. Accordingly, it is possible to recognize or detect the distant object. However, in this case, since the imaging viewing angle is narrow, an object at the side of the monitoring camera, and particularly an object close to the monitoring camera, is not included in the imaging viewing angle. Accordingly, it is hard to detect such objects through an imaging operation. For this reason, in a system in which the lighting operation of the vehicle is controlled by detecting the objects, illumination light emitted from the vehicle may cause glare to an adjacent vehicle, and hence the possibility of causing a critical accident may increase.
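The trade-off described above follows directly from thin-lens geometry: the horizontal viewing angle narrows as the focal length grows. The following sketch illustrates this; the focal lengths and sensor width are illustrative assumptions, not values taken from this disclosure.

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal viewing angle of an ideal thin-lens camera, in degrees."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

sensor_width = 4.8  # mm, hypothetical small imaging element

# A short focal length gives a wide viewing angle (vicinity/near areas);
# a long focal length gives a narrow viewing angle (distant area only).
for f in (4.0, 25.0):
    print(f"f = {f:5.1f} mm -> viewing angle = {horizontal_fov_deg(f, sensor_width):5.1f} deg")
```

With these assumed values the short lens covers roughly a 60-degree viewing angle while the long lens covers only about 11 degrees, which is why a single long-focal-length lens misses objects beside or close to the camera.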
SUMMARY
Exemplary embodiments of the present invention address the above problems, as well as other problems not described above. However, the present invention is not required to overcome the problems described above, and thus some embodiments of the present invention may not overcome any of the problems described above.
In some implementations of the invention, a vehicle monitoring camera is capable of obtaining images of objects in a vicinity area or a near area at a wide range while obtaining images of objects in a distant area at a large size by using a single camera.
In some implementations of the invention, a vehicle monitoring camera system is capable of controlling a lighting operation of a vehicle lamp or a vehicle traveling safety assistant based on objects detected from an image acquired by a monitoring camera.
According to one or more aspects of the present invention, a monitoring camera for a vehicle obtains images of a distant area, a vicinity area and a near area in front of or behind the vehicle. The monitoring camera includes an imaging element and an image forming lens. The lens comprises a plurality of lenses each having a different focal length. Each image of the objects passing through a corresponding one of the lenses is formed on a different region of an imaging surface of the imaging element.
According to one or more aspects of the present invention, the image forming lens includes a first lens having a first focal length that obtains images of the distant area in front of or behind the vehicle, and a second lens having a second focal length that obtains images of the vicinity area or the near area in front of or behind the vehicle. The second focal length is shorter than the first focal length.
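The benefit of the two focal lengths can be quantified with the thin-lens approximation: for an object far beyond the focal length, the size of its image on the imaging surface is roughly proportional to the focal length. The numbers below (focal lengths, object size, distances) are illustrative assumptions, not values from this disclosure.

```python
def image_height_mm(focal_length_mm: float, object_height_m: float, distance_m: float) -> float:
    # Thin-lens approximation for an object far beyond the focal length:
    # image height h' ~ f * H / d
    return focal_length_mm * object_height_m / distance_m

# A 1.5 m tall vehicle at 100 m, imaged through each lens:
on_first_lens  = image_height_mm(25.0, 1.5, 100.0)  # long first focal length
on_second_lens = image_height_mm(4.0, 1.5, 100.0)   # short second focal length

print(on_first_lens, on_second_lens)
```

Under these assumptions the long-focal-length first lens projects the distant vehicle onto the sensor at roughly six times the height that the short-focal-length second lens would, which is what allows the distant object to be resolved with enough pixels for image recognition.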
According to one or more aspects of the present invention, a monitoring camera system includes a monitoring camera, a detecting unit that detects objects in front of or behind the vehicle based on images obtained by the monitoring camera, and a lighting control unit that controls a lighting state of a vehicle lamp based on the detected objects.
Even in the case of one imaging device including one imaging element, an image of a distant object is captured by the lens having a long focal length so that the image is obtained at a large magnification, whereas images of the vicinity or near area can be obtained over a wide range by using a lens having a short focal length. Accordingly, it is possible to obtain an image of the distant object at a high resolution and to detect the object reliably. In addition, it is possible to reliably detect an object in the vicinity or near area, and thus to provide a monitoring camera which can reliably detect an object such as another vehicle in the vicinity of the vehicle.
As the monitoring camera makes it possible to reliably detect an object such as another vehicle in the vicinity of the vehicle, the lighting state of the rear lamp or the head lamp of the vehicle can be controlled based on the detection. Accordingly, it is possible to improve the visibility of the vehicle with respect to another vehicle, or to prevent (or reduce) glare to another vehicle. It is also possible to control a vehicle traveling safety assistant, such as a speed control operation or a steering operation in a traveling mode of the vehicle, based on the detection. Accordingly, control can be achieved such that a safer traveling state is ensured.
Other aspects and advantages of the present invention will be apparent from the following description, the drawings and the claims.
An exemplary embodiment of the invention is described with reference to the drawings.
First Embodiment
In the imaging device CAM1, images of the objects passing through the lenses Lf, Lm and Ln of the multi image forming lens 33 are focused on the imaging surface of the imaging element 32 by the focal lens 34, and images of the objects are formed, thereby providing an imaging signal. As shown in the imaging area (PA) of the imaging element 32 of
The image captured by the imaging device CAM1 is displayed on a monitoring device MON connected to the object detecting unit 11, as shown in
The imaging signal of the object imaged by the imaging device CAM1 is image-recognized by the object detecting unit 11, so that the objects respectively in the distant area Af, the vicinity area Am, and the near area An are detected. In some known techniques, the object in the distant area Af was imaged as a small image. However, since the distant imaging area PAf of the imaging element obtains an image of the object in the distant area Af at a large imaging magnification by using the long-focal-length far area lens Lf, the number of pixels of the imaging element 32 used to image the object in the distant area increases. Hence, it is possible to detect the object through high resolution image recognition. On the other hand, in the vicinity imaging area PAm and the near imaging area PAn of the imaging element 32, since it is possible to obtain an image of an object at a wide viewing angle by using the lenses Lm and Ln having a standard focal length or a focal length shorter than the standard focal length, it is possible to detect an object at the side of the subject vehicle, as well as an object behind the vehicle, through the image recognition. At this time, since the object is in the vicinity of the vehicle, even when the object is imaged by the short-focal-length lens, it is imaged at a size sufficient for image recognition at a high resolution. Accordingly, it is possible to obtain an image of the objects in the periphery of the vehicle, from the distant area Af to the vicinity area Am or the near area An, at a high resolution. As a result, it is possible to detect the objects through the image recognition.
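The per-lens regions of the imaging surface can be thought of as fixed slices of one pixel array. The sketch below illustrates such a split; the rectangular layout is a hypothetical simplification (the region names PAf, PAm, PAn follow the text, but the actual geometry in the specification may differ).

```python
# Hypothetical fixed layout of the imaging surface: one rectangular region per
# lens of the multi image forming lens. A frame is a list of pixel rows.
def split_imaging_area(frame):
    """Split one captured frame into the per-lens regions PAf, PAm, PAn."""
    h, w = len(frame), len(frame[0])
    return {
        "PAf": [row[:] for row in frame[: h // 2]],         # distant area via lens Lf
        "PAm": [row[: w // 2] for row in frame[h // 2 :]],  # vicinity area via lens Lm
        "PAn": [row[w // 2 :] for row in frame[h // 2 :]],  # near area via lens Ln
    }

frame = [[0] * 640 for _ in range(480)]  # dummy 640x480 sensor readout
regions = split_imaging_area(frame)
```

Each region can then be passed independently to the image-recognition step, so that the distant-area pixels (imaged at large magnification) and the wide-angle vicinity/near pixels are processed separately.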
When the object detecting unit 11 detects the object so as to perform image recognition based on the image acquired by the imaging operation from the distant area Af to the near area An, the lighting control unit 12 controls a lighting operation of the rear lamp RL or the high mount stop lamp HMSL based on detection of the object. The lighting control unit 12 is configured to turn on the tail lamp of the rear lamp RL when the lamp switch is turned on by a driver, and to control a lighting operation of the stop lamp of the rear lamp RL and the high mount stop lamp HMSL when the brake pedal is operated by the driver. In addition, the lighting control unit 12 is configured to flicker a hazard lamp (e.g., a turn signal lamp) of the rear lamp RL when a hazard lamp switch is operated by the driver.
When the object detecting unit 11 detects another vehicle based on the acquired image, the lighting control unit 12 automatically controls the lighting operation of the rear lamp RL or the high mount stop lamp HMSL. See, for example,
If the object detecting unit 11 detects another vehicle in the vicinity area Am when the rear lamp RL is turned on in a high speed traveling mode or a middle speed traveling mode (S13), the lighting control unit 12 turns on the stop lamp of the rear lamp RL or the high mount stop lamp HMSL (S14). Accordingly, the presence of the vehicle is more easily detected by another vehicle coming close. In addition, if the object detecting unit 11 detects another vehicle coming close to the near area An even after the above-described lighting control is performed (S15), the lighting control unit 12 increases the light intensity of the stop lamp of the rear lamp RL or the high mount stop lamp HMSL (S16). In particular, when another vehicle is detected in the near area An immediately behind the vehicle in a high speed traveling mode, a control operation is performed to increase the light intensity of the lamps and simultaneously flicker the hazard lamp of the rear lamp RL, so as to provide an alarm to the other vehicle coming close (S16). In this way, by detecting another vehicle located anywhere from the distant area Af to the near area An in the periphery of the vehicle based on the image acquired by the imaging device CAM1, and controlling the lighting operation of the hazard lamp or the stop lamp of the rear lamp RL or the high mount stop lamp HMSL in accordance with the distance between the vehicles, it is possible to ensure safer traveling by increasing the recognition level of the vehicle with respect to another vehicle so as to prevent an accident.
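The S13-S16 decision flow above can be sketched as a small control function. This is an assumed simplification for illustration, not the exact control logic of the specification.

```python
def rear_lamp_control(other_in_vicinity: bool, other_in_near: bool, high_speed: bool) -> dict:
    """Simplified sketch of the S13-S16 rear-lamp decision flow."""
    state = {"stop_lamp_on": False, "intensity": "normal", "hazard_flicker": False}
    if other_in_vicinity or other_in_near:  # S13 -> S14: another vehicle approaches
        state["stop_lamp_on"] = True
    if other_in_near:                       # S15 -> S16: the vehicle has come close
        state["intensity"] = "high"
        if high_speed:                      # alarm the closing vehicle via hazard flicker
            state["hazard_flicker"] = True
    return state

# Example: a vehicle detected in the near area while traveling at high speed.
print(rear_lamp_control(other_in_vicinity=True, other_in_near=True, high_speed=True))
```

The escalation (lamp on, then increased intensity, then hazard flicker) mirrors the shrinking distance between the vehicles described in the text.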
According to the first embodiment, in the imaging device CAM1, an image of the object of the distant area Af is obtained by the multi image forming lens 33 at an enlarged size, and an image of the object in the wide range from the vicinity area Am to the near area An is obtained. Accordingly, based on the images acquired by the imaging device CAM1, it is possible to detect the object in the distant rear side, the near rear side, or the side of the vehicle in a high speed traveling mode, a middle speed traveling mode, or a low speed traveling mode at a high resolution without missing the object, and thus reliably to detect the object (e.g., another vehicle). Accordingly, it is possible to ensure safer traveling of the vehicle in such a manner that the presence of the vehicle is recognized by another vehicle through the lighting operation of the vehicle, or an alarm is provided to another vehicle.
Second Embodiment
The imaging device CAM2 is configured to obtain an image of each of two areas, that is, a distant area Af and a vicinity/near area Amn in front of the vehicle CAR, as shown in
In the imaging device CAM2 according to the second embodiment, as shown in
Assuming that the image acquired by the imaging device CAM2 is displayed on the monitoring device, as shown in
The imaging signal of the object acquired by the imaging device CAM2 is image-recognized by the object detecting unit 11, so that the objects present respectively in the distant area Af and the vicinity/near area Amn are detected. In some known techniques, the object in the distant area Af was imaged as a small image. However, since the distant imaging area PAf of the imaging element 42 is able to capture the image of the object in the distant area Af at a large imaging magnification, it is possible to detect the distant object through the high resolution image recognition. On the other hand, in the vicinity/near imaging area PAmn of both sides of the imaging element 42, it is possible to detect the object over a wide area from the front side of the vehicle to the side thereof through image recognition.
Likewise, the object detecting unit 11 detects the object by performing the image recognition based on the image obtained from the distant area Af to the vicinity/near area Amn. The lighting control unit 12 controls a lighting operation of the low beam lamp or the high beam lamp of the left and right head lamps HL of the vehicle when the lamp switch is turned on by the driver. In addition, for example, as shown in the lighting control flowchart of
On the other hand, when the object detecting unit 11 detects another vehicle in the distant area Af (S24), the lighting control unit 12 performs a control operation to decrease the light intensity of the high beam lamp (S25). Accordingly, it is possible to maximize illumination over a wide range in front of the vehicle without producing glare to the oncoming vehicle or the preceding vehicle in the distant area Af. Further, if the object detecting unit 11 detects another vehicle in the vicinity/near area Amn when the high beam lamp is turned on (S26), the lighting control unit 12 performs a control operation to turn off the high beam lamp (S27). Accordingly, it is possible to prevent causing glare to the oncoming vehicle or the preceding vehicle.
According to the second embodiment, in the imaging device CAM2, the object in the distant area Af is enlarged and imaged by the multi image forming lens 43, the object present over a wide range of the vicinity/near area Amn is imaged, and then the object in the distant front side, the near front side, or the side of the vehicle is imaged at a high resolution, thereby reliably detecting the object through image recognition. Accordingly, it is possible to ensure safer traveling of the vehicle by preventing (or reducing) glare to the other detected vehicle by the lighting operation of the lamp of the vehicle.
In the first and second embodiments, all images acquired by the imaging element are displayed simultaneously on the monitoring device, or the object is detected through the image recognition. However, the object detecting unit can be configured to detect the object by extracting the imaging signal of only a part of all pixels of the imaging element. For example, in the first embodiment, the imaging signal can be extracted from only the distant imaging area PAf of the entire imaging area PA of the imaging element 32, so that only the image of the distant area Af is acquired and only the object in the distant area Af is detected. The same applies to the vicinity imaging area PAm or the near imaging area PAn. When such an independent imaging operation is performed for each area, it is possible to detect the position or state of the object more precisely.
In the first embodiment, the lighting state of the rear lamp of the vehicle is controlled by detecting another vehicle in an area behind the vehicle. In the second embodiment, the lighting state of the head lamp of the vehicle is controlled by detecting another vehicle in an area in front of the vehicle. Both the configuration of the first embodiment and the configuration of the second embodiment can be provided in the vehicle. In addition, in the first embodiment, the imaging device is provided in the high mount stop lamp, and in the second embodiment, the imaging device is provided in the head lamp. In some cases, the imaging device is disposed in both the rear portion and the front portion of the vehicle independently of the lamp. In the case where the imaging device is provided in the lamp, if the imaging optical axis is aligned with the lamp optical axis, performing the optical axis adjustment of the lamp also adjusts the imaging optical axis, so that no separate adjustment of the imaging optical axis is necessary. Further, in the case where the imaging device is provided in the lamp, the lamp is not limited to the high mount stop lamp of the first embodiment or the head lamp of the second embodiment.
Further, in the first and second embodiments, as an example, the image forming lens can be a multi image forming lens integrally formed with multiple lenses having different focal lengths. However, the image forming lens also can be formed as multiple independent lenses. In addition, each area of the imaging surface of the imaging element, that is, the distant imaging area, the vicinity imaging area, the near imaging area, and the vicinity/near imaging area are not limited to the particular shape of the first and second embodiments, but may be set to an arbitrary shape. Further, the acquired image can be displayed on a monitoring device, or can be displayed on a navigation screen of a navigation system as the monitoring device.
Embodiments of the invention can be applied to a monitoring camera which detects an object existing from a distant area to a near area of a vehicle using an imaging device. In addition, embodiments of the invention may be applied to a vehicle monitoring camera system that detects an object based on an image acquired by an imaging device and controls a lighting state of a vehicle based on the detected object.
Although the present invention has been shown and described with reference to certain exemplary embodiments thereof, other implementations are within the scope of the claims. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims
1. A monitoring camera for a vehicle that images a distant area, a vicinity area and a near area in front of or behind the vehicle, the monitoring camera comprising:
- an imaging element; and
- an image forming lens comprising a plurality of lenses each having a different focal length,
- wherein the monitoring camera is configured so that each image of an object passing through a different one of the lenses is formed on a corresponding different region of an imaging surface of the imaging element.
2. The monitoring camera according to claim 1, wherein the image forming lens comprises:
- a first lens having a first focal length to obtain an image of the distant area in front of or behind the vehicle; and
- a second lens having a second focal length to obtain an image of the vicinity area or the near area in front of or behind the vehicle, wherein the second focal length is shorter than the first focal length.
3. The monitoring camera according to claim 2, wherein the monitoring camera is mounted in a vehicle lamp.
4. A monitoring camera system comprising:
- a monitoring camera for a vehicle that images a distant area, a vicinity area and a near area in front of or behind the vehicle, the monitoring camera comprising: an imaging element; and an image forming lens comprising a plurality of lenses each having a different focal length, wherein the monitoring camera is configured so that each image of an object passing through a different one of the lenses is formed on a corresponding different region of an imaging surface of the imaging element,
- the monitoring camera system further comprising:
- a detecting unit configured to detect objects in front of or behind the vehicle, based on images acquired by the monitoring camera; and
- a lighting control unit configured to control a lighting state of a vehicle lamp based on the detected objects.
Type: Application
Filed: Apr 20, 2010
Publication Date: Nov 4, 2010
Patent Grant number: 8830324
Applicant: Koito Manufacturing Co., Ltd. (Tokyo)
Inventors: Osamu Endo (Shizuoka), Naoki Tatara (Shizuoka)
Application Number: 12/763,243
International Classification: B60Q 1/115 (20060101); H04N 7/18 (20060101);