IMAGE CAPTURE DEVICE

The present disclosure is intended to provide an image capture device that achieves high resolution in image reconstruction. The image capture device of the present disclosure, which is capable of recording light information including a traveling direction of light and intensity of the light in the traveling direction, includes a main lens; an image sensor; a microlens array that is placed between the main lens and the image sensor and has a predetermined vertical rotation angle relative to the image sensor; and a signal processing unit for generating a refocused image on a virtual image plane at any given focal position using the light information.

Description
BACKGROUND

1. Technical Field

The present disclosure relates to a light-field camera that extracts and records the direction of light rays using microlenses.

2. Description of Related Art

In recent years, a refocusable light-field camera has become available that has an integrated optical system and image sensor, can be focused on a desired position after shooting, and generates an image at any given focal position. A light-field camera is disclosed, for example, in the non-patent literature Ren Ng et al., "Light Field Photography with a Hand-Held Plenoptic Camera", Stanford Tech Report CTSR 2005-2.

The light-field camera is composed of a main lens, a microlens array, and an image sensor. Light from a subject passes through the main lens and then through the microlens array and is incident on the image sensor. Unlike in a typical camera, the information recorded on the light-receiving surface of the image sensor includes the traveling direction of the light as well as its intensity, because the directions of the light rays are identified and recorded by the image sensor.

As such, refocusing can be performed to generate an image at any given focal position after shooting. For example, by projecting the pixels that convert light received by the image sensor to electrical signals onto a virtual image plane along the directions of the light rays, a refocused image can be generated as if the image sensor were placed on that virtual image plane.

SUMMARY

The present disclosure provides an image capture device that improves image resolution when a refocused image at any given focal position is generated by reconstructing an image with a light-field camera.

The image capture device of the present disclosure, which is capable of recording light information including a traveling direction of light and intensity of the light in the traveling direction, includes a main lens, an image sensor, a microlens array that is placed between the main lens and the image sensor and has a predetermined vertical rotation angle relative to the image sensor, and a signal processing unit for generating a refocused image on a virtual image plane at any given focal position using the light information.

The image capture device of the present disclosure can improve the image resolution when the refocused image is generated by reconstructing an image with the light-field camera.

BRIEF DESCRIPTION OF DRAWINGS

FIG. 1 is a block diagram showing a configuration of a light-field camera according to a first exemplary embodiment.

FIG. 2 shows a distribution of ray centers on a virtual image plane.

FIG. 3 is a diagram illustrating a method of calculating a position of a ray center on the virtual image plane according to the first exemplary embodiment.

FIG. 4 is a flow chart showing a procedure for calculating a cost value using a cost function with respect to any given vertical rotation angle of a microlens array relative to an image sensor according to the first exemplary embodiment.

FIG. 5 is a diagram illustrating an example of the cost function according to the first exemplary embodiment.

FIG. 6 shows mean values of the cost function for vertical rotation angles of the microlens array relative to the image sensor according to the first exemplary embodiment.

FIG. 7 shows mean values of the cost function for refocusing distances when vertical rotation angles of the microlens array relative to the image sensor are 0 degrees and 6.6 degrees according to the first exemplary embodiment.

FIG. 8 shows a distribution of the ray centers at a refocusing distance of about +4.5 according to the first exemplary embodiment.

FIG. 9 shows a distribution of the ray centers at a refocusing distance of about +2.5 according to the first exemplary embodiment.

FIG. 10 shows a distribution of the ray centers at a refocusing distance of about −3.5 according to the first exemplary embodiment.

FIG. 11 shows mean values of the cost function for vertical rotation angles of another microlens array relative to the image sensor according to the first exemplary embodiment.

FIG. 12 shows mean values of the cost function for vertical rotation angles of yet another microlens array relative to the image sensor according to the first exemplary embodiment.

DESCRIPTION OF EMBODIMENT

An exemplary embodiment will now be described in detail with reference to the drawings as appropriate. However, unnecessarily detailed description may occasionally be omitted. For example, detailed description of well-known matters and redundant description of substantially the same configurations may occasionally be omitted. These omissions are made to prevent the following description from becoming unnecessarily redundant and to facilitate understanding by those skilled in the art.

Note that the following description and the accompanying drawings are provided to allow any person skilled in the art to fully understand the present disclosure, and are not intended to limit the subject matter described in the claims.

First Exemplary Embodiment

A first exemplary embodiment is described below with reference to FIGS. 1 to 12.

[1-1. Relationship Between Ray Centers and Image Resolution]

The relationship between ray centers and image resolution is first described. When a light-field camera processes the pixels that convert light received by an image sensor to electrical signals, reconstructs an image, and generates a refocused image, ray centers play an important role. A "ray center" is the point at which a light ray projected from the image sensor, along its traveling direction, intersects the virtual image plane on which an image is reconstructed for any given focal position. Each pixel of the image to be reconstructed is therefore interpolated using the ray centers that lie in its vicinity among the ray centers projected from the image sensor onto the virtual image plane along the directions of the light rays. Because the number of ray centers is fixed by the number of pixels of the image sensor, if the ray centers on the virtual image plane gather at certain points, the resolution of the reconstructed image is reduced in regions where the density of the ray centers, i.e., the degree of their gathering on the virtual image plane, is low.

[1-2. Configuration of Light-Field Camera]

A light-field camera as an image capture device is described in the first exemplary embodiment. FIG. 1 shows a configuration of a light-field camera according to the first exemplary embodiment. Referring to FIG. 1, light-field camera 100 captures subject 101. Light-field camera 100 includes imaging unit 310 and signal processing unit 320. Imaging unit 310 includes main lens 102, microlens array 103, and image sensor 104. Signal processing unit 320 is implemented with a processor such as an LSI (Large Scale Integration) circuit.

Light from subject 101 passes through main lens 102 and microlens array 103 and is recorded by image sensor 104; at that time, not only the intensity of the light but also its traveling direction is recorded in each pixel of image sensor 104.

The pixels, which convert the light received by image sensor 104 to electrical signals, transmit the electrical signals to signal processing unit 320. To reconstruct an image captured at any given focal position, virtual image plane 105 is set by virtually placing image sensor 104 on an arbitrary plane in space, and signal processing unit 320 calculates the positions of ray centers 106 projected from the pixels of image sensor 104 onto virtual image plane 105 along the directions of the light rays. The image is then reconstructed using ray centers 106, and a refocused image is thus generated on virtual image plane 105.

FIG. 2 shows a distribution of ray centers on the virtual image plane. In FIG. 2, the microlenses of microlens array 103 each have a diameter of about 18 pixels and are arranged in a honeycomb structure, and image sensor 104 has a Bayer array. Distribution 210 of the ray centers represents the distribution of ray centers of light-field camera 100 according to the present embodiment.

In distribution 210 of the ray centers in FIG. 2, line A-A, which connects the centers of three adjacent microlenses 201 of the microlenses 201 making up microlens array 103, is not parallel to horizontal direction B-B of image sensor 104; that is, line A-A is inclined at a predetermined angle with respect to horizontal direction B-B. In other words, microlens array 103 has a predetermined vertical rotation angle relative to image sensor 104.

In FIG. 2, distribution 220 of the ray centers represents a distribution of ray centers of a conventional light-field camera. In distribution 220 of the ray centers, line C-C, which connects the centers of three adjacent microlenses 201 of the microlenses 201 making up microlens array 103, is parallel to horizontal direction B-B of image sensor 104 and is not inclined. That is, the microlens array is not rotated vertically relative to the image sensor.

When distribution 210 of the ray centers is compared with distribution 220 of the ray centers in FIG. 2, the ray centers of distribution 210 overlap less and are more densely distributed on virtual image plane 105. A high density of ray centers on virtual image plane 105 increases the image resolution when the image is reconstructed and a refocused image is generated. That is, microlens array 103 is given a vertical rotation angle relative to image sensor 104 so that the density of the ray centers projected onto virtual image plane 105 becomes higher.

In light-field camera 100 configured as above, an optimum rotation angle of microlens array 103 relative to image sensor 104 has been calculated, which is described below.

[1-3. Optimum Rotation Angle]

[1-3-1. Position of Ray Center]

A method of calculating a position of a ray center on virtual image plane 105 is first described.

FIG. 3 is a diagram illustrating the method of calculating the position of a ray center on the virtual image plane according to the first exemplary embodiment. In FIG. 3, “b” is a distance between microlens array 103 and image sensor 104; “bb” is a distance between microlens array 103 and virtual image plane 105.

The coordinates of center position 402, i.e., the position at which a dashed line extending horizontally from the center of the i-th microlens 401 of microlens array 103 toward image sensor 104 intersects image sensor 104, are represented as follows:


(MLA[i]·Cx,MLA[i]·Cy)  [EQ 1]

Pixel 403 is any given pixel of image sensor 104, and direction vector 404 from center position 402 to pixel 403 is represented as follows:


(DirX,DirY)  [EQ 2]

where "d" is the diameter of the i-th microlens 401.

Assuming the light passes through i-th microlens 401 and travels in a straight line, coordinate 405 of the ray center projected onto virtual image plane 105 from pixel 403 of image sensor 104 is represented as follows:

(MLA[i]·Cx + (bb/b)·DirX, MLA[i]·Cy + (bb/b)·DirY)  [EQ 3]

where "dd" is the width of a light ray on virtual image plane 105. Assuming the light ray captured through diameter d of the i-th microlens 401 is collected in pixel 403 of image sensor 104, dd is equal to d. Thus, the position of the ray center and the width of the light ray projected from each pixel of image sensor 104 onto virtual image plane 105 can be calculated.
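
As an illustration of EQ 1 to EQ 3, the following Python sketch computes the position of a ray center on the virtual image plane. It is a minimal, hypothetical sketch and not part of the disclosed device; the function name ray_center and its arguments are naming assumptions introduced here, and distances are expressed in the same arbitrary units as b and bb.

```python
def ray_center(mla_center, pixel_pos, b, bb):
    """Project one sensor pixel onto the virtual image plane (EQ 1 to EQ 3).

    mla_center : (MLA[i].Cx, MLA[i].Cy), center position 402 of the i-th microlens (EQ 1)
    pixel_pos  : coordinates of pixel 403 on image sensor 104
    b          : distance between microlens array 103 and image sensor 104
    bb         : distance between microlens array 103 and virtual image plane 105
    """
    cx, cy = mla_center
    px, py = pixel_pos
    dir_x, dir_y = px - cx, py - cy          # direction vector 404 (DirX, DirY), EQ 2
    # Straight-line propagation through the i-th microlens gives EQ 3.
    return (cx + (bb / b) * dir_x, cy + (bb / b) * dir_y)

# Example: a microlens centered at (100.0, 50.0), a pixel 3 pixels to its right,
# and a virtual image plane at 2.5 times the microlens-to-sensor distance.
print(ray_center((100.0, 50.0), (103.0, 50.0), b=1.0, bb=2.5))  # -> (107.5, 50.0)
```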

[1-3-2. Calculation of Cost Value]

Evaluation is performed using a cost function in order to calculate an optimum vertical rotation angle of microlens array 103 relative to image sensor 104. Specifically, the vertical rotation angle of microlens array 103 relative to image sensor 104 is varied from 0 to 30 degrees in increments of 0.1 degrees, and a cost value is calculated for each rotation angle using the cost function. The optimum vertical rotation angle of microlens array 103 relative to image sensor 104 has been found through the evaluation using the cost values calculated.

Here, a procedure is described that calculates a cost value using the cost function with respect to any given vertical rotation angle of microlens array 103 relative to image sensor 104. FIG. 4 is a flow chart showing the procedure for calculating the cost value using the cost function with respect to any given vertical rotation angle of microlens array 103 relative to image sensor 104.

The cost values for any given vertical rotation angle of microlens array 103 relative to image sensor 104 are calculated for all of the virtual image planes envisioned, i.e., all virtual image planes 105 at refocusing distances set at predetermined intervals within a predetermined focal length from image sensor 104.

(S501) Cost values, variables, etc. of the cost function are first initialized to zero.

(S502) It is then determined whether processing for all of the virtual image planes envisioned has been completed. When all processing has been completed (when Yes), the cost values and the angles at which the cost values have been calculated are output, and the process is terminated. When processing of all the virtual image planes envisioned has not been completed (when No), the process proceeds to step S503.

(S503) A virtual image plane at a refocusing distance of interest is set at the predetermined interval.

(S504) It is then determined whether the cost values have been calculated for all pixels within a specified range on the virtual image plane that has been set. When the cost values have been calculated for all the pixels (when Yes), the process returns to step S502. When the cost values have not been calculated for all the pixels within the specified range (when No), the process proceeds to step S505.

(S505) The position of a pixel within the specified range on the virtual image plane for which the calculation has not yet been performed is obtained.

(S506) Among the positions of the ray centers that have been projected in advance from the pixels of image sensor 104 onto virtual image plane 105, step S506 searches for the ray center nearest to the position of the pixel obtained in step S505 and identifies the position of that ray center.

(S507) Step S507 obtains the distance between the position of the ray center identified in step S506 and the position of the pixel obtained in step S505 as the cost value, and the process returns to step S504.
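
The procedure of FIG. 4 can be summarized by the following Python sketch, which accumulates, for every envisioned virtual image plane and every pixel in the specified range, the distance to the nearest pre-calculated ray center. This is a simplified sketch under assumed interfaces: planes, pixels_in_range, and ray_centers_for_plane are hypothetical inputs, the brute-force nearest-neighbor search stands in for step S506, and the squared distance used here corresponds to EQ 6 in the next subsection.

```python
def cost_for_angle(planes, pixels_in_range, ray_centers_for_plane):
    """Steps S501 to S507 for one vertical rotation angle of the microlens array.

    planes                : list of envisioned virtual image planes (refocusing distances)
    pixels_in_range       : list of (x, y) pixel positions within the specified range
    ray_centers_for_plane : function mapping a plane to its pre-calculated ray centers
    """
    cost = 0.0                                   # S501: initialize the cost value
    for f in planes:                             # S502/S503: set the next virtual image plane
        centers = ray_centers_for_plane(f)       # positions projected in advance (EQ 3)
        for (xr, yr) in pixels_in_range:         # S504/S505: next pixel of interest
            # S506: search for the ray center nearest to the pixel (brute force here)
            nearest = min((xr - xc) ** 2 + (yr - yc) ** 2 for (xc, yc) in centers)
            cost += nearest                      # S507: take the distance as the cost value
    return cost
```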

[1-3-3. Example of Cost Function]

The cost function used to calculate a cost value will now be described in detail. FIG. 5 is a diagram illustrating an example of the cost function according to the first exemplary embodiment. In FIG. 5, R is a specified range on virtual image plane 105 that identifies the range over which the cost function is calculated, and the position of the r-th pixel of interest P(r) within specified range R is defined as follows:


P(r) = (x_r, y_r)  [EQ 4]

Representing a position of virtual image plane 105 as f, position Ray(f, n) of a ray center when n-th pixel of image sensor 104 is projected onto virtual image plane 105 at position f is defined as follows:


Ray(f, n) = (x_{f,n}, y_{f,n})  [EQ 5]

Here, if the distance Dist(P(r), Ray(f, n)) between position P(r) of the pixel of interest and position Ray(f, n) of the ray center is defined, for example, as a squared distance error, distance Dist(P(r), Ray(f, n)) is represented as follows:


Dist(P(r), Ray(f, n)) = (x_r − x_{f,n})² + (y_r − y_{f,n})²  [EQ 6]

Cost function Cost(focus, R, N) can be defined as follows:

Cost(focus, R, N) = Σ_{f ∈ focus} Σ_{r ∈ R} min_{n ∈ N} Dist(P(r), Ray(f, n))  [EQ 7]

where focus is a set of all the virtual image planes envisioned, R is the specified range, and N is a set of all pixels of image sensor 104.

The cost function defined by EQ 7 evaluates, for all the virtual image planes 105 envisioned, the distances between the respective positions of the pixels to be reconstructed and the positions of the ray centers used for reconstruction within specified range R, i.e., within the range of the image to be reconstructed. EQ 7 is based on the idea that the smaller the overall distances between the positions of the pixels to be reconstructed and the positions of the ray centers are, the higher the density of the ray centers on virtual image plane 105 and the higher the resolution that can be obtained.

[1-3-4. Calculation of Optimum Rotation Angle]

The optimum vertical rotation angle of microlens array 103 relative to image sensor 104 has been found using the cost function defined by EQ 7, as described below with reference to FIGS. 6 to 10.

FIG. 6 shows mean values of the cost function for vertical rotation angles of microlens array 103 relative to image sensor 104. In FIG. 6, the horizontal axis represents the vertical rotation angle of microlens array 103 relative to image sensor 104 and the vertical axis represents the mean value of the cost function.

The mean value of the cost function is obtained by dividing the calculated cost value by the number of pixels on virtual image plane 105 used for the calculation. The lower the mean value of the cost function, the higher the resolution at which the image can be reconstructed.

In FIG. 6, the mean values of the cost function are plotted in increments of 0.1 degrees over vertical rotation angles of microlens array 103 relative to image sensor 104 from 0 to 30 degrees. The mean value of the cost function is highest when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees and lowest when the vertical rotation angle is 6.6 degrees. When the vertical rotation angle of microlens array 103 relative to image sensor 104 is between about 1.7 degrees and about 28.3 degrees, the mean value of the cost function is lower than at a vertical rotation angle of 0 degrees, so the image can be reconstructed at higher resolution. In addition, the mean value of the cost function reaches local minima when the vertical rotation angle of microlens array 103 relative to image sensor 104 is about 1.7 degrees, about 6.6 degrees, about 12.4 degrees, about 17.6 degrees, or about 23.4 degrees, enabling the image to be reconstructed at still higher resolution.
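
A sweep like the one plotted in FIG. 6 could be organized as in the sketch below, which evaluates the mean value of the cost function for rotation angles from 0 to 30 degrees in 0.1-degree steps and returns the angle with the lowest mean. This is only an illustrative sketch building on the cost_for_angle sketch above: project_ray_centers is a hypothetical helper that would rotate the microlens centers by the candidate angle and apply EQ 3, and the mean is taken per evaluated pixel, following the definition given with FIG. 6.

```python
def sweep_rotation_angles(project_ray_centers, planes, pixels_in_range):
    """Evaluate mean cost values for angles from 0 to 30 degrees in 0.1-degree steps.

    project_ray_centers(angle_deg, plane) is a hypothetical helper that rotates the
    microlens centers by angle_deg and projects every sensor pixel onto the plane (EQ 3).
    Returns the (angle, mean cost) pair with the lowest mean value of the cost function.
    """
    num_evaluations = len(planes) * len(pixels_in_range)
    results = []
    for step in range(301):                              # 0.0, 0.1, ..., 30.0 degrees
        angle = step / 10.0
        cost = cost_for_angle(planes, pixels_in_range,
                              lambda f, a=angle: project_ray_centers(a, f))
        results.append((angle, cost / num_evaluations))  # mean per evaluated pixel
    return min(results, key=lambda pair: pair[1])
```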

The relationship of the mean value of the cost function with respect to the refocusing distance, i.e., a distance from image sensor 104 to virtual image plane 105, when vertical rotation angles of microlens array 103 relative to image sensor 104 are about 0 degrees and about 6.6 degrees will now be described.

FIG. 7 shows mean values of the cost function for refocusing distances when vertical rotation angles of microlens array 103 relative to image sensor 104 are 0 degrees and 6.6 degrees. The refocusing distance on the horizontal axis is a relative distance, with the distance between microlens array 103 and image sensor 104 taken as 1. FIG. 7 is a graph of the mean values of the cost function as the refocusing distance is varied from −5 to +5, where the interval between refocusing distances −1 and +1 is not evaluated. The distributions of the ray centers on virtual image plane 105 at the refocusing distances of about +4.5, about +2.5, and about −3.5 indicated in FIG. 7 are described below.

FIG. 8 shows a distribution of the ray centers at a refocusing distance of about +4.5. The left-hand side of FIG. 8 shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees; the right-hand side of FIG. 8 shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees. A mean value of the cost function is 1.4 when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees; a mean value of the cost function is 4.8 when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees.

FIG. 9 shows a distribution of the ray centers at a refocusing distance of about +2.5. The left-hand side of FIG. 9 shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees; the right-hand side of FIG. 9 shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees. A mean value of the cost function is 1.5 when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees; a mean value of the cost function is 1.1 when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees.

FIG. 10 shows a distribution of the ray centers at a refocusing distance of about −3.5. The left-hand side shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees; the right-hand side shows a distribution of the ray centers when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees. A mean value of the cost function is 1.4 when the rotation angle of microlens array 103 is 6.6 degrees; a mean value of the cost function is 3.9 when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees.

FIGS. 8 to 10 suggest the following: when the distribution of the ray centers with a mean value of the cost function of 1.4 is compared with the distribution with a mean value of 3.9, the ray centers in the former overlap less and are more densely distributed on virtual image plane 105 than those in the latter. Likewise, the ray centers in the distributions with mean values of 1.1 and 1.5 are densely distributed on virtual image plane 105, while the ray centers in the distribution with a mean value of 4.8 overlap.

Thus, it can be visually confirmed from FIGS. 8 to 10 that the lower the mean value of the cost function, the more densely the ray centers are distributed on virtual image plane 105. That is, the lower the mean value of the cost function, the closer a ray center lies to each pixel to be obtained by reconstruction when an image is reconstructed, and the higher the resolution of the refocused image.

It can also be seen from FIG. 7 that, over the refocusing distance range from −5 to +5, the mean values of the cost function are generally lower when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 6.6 degrees than when the vertical rotation angle of microlens array 103 relative to image sensor 104 is 0 degrees.

Thus, giving microlens array 103 a vertical rotation angle relative to image sensor 104 improves the resolution of the refocused image when an image is reconstructed.

While a vertical rotation angle of microlens array 103 relative to image sensor 104 of 0 degrees has been described, in practice the angle may deviate by up to about 0.2 degrees, for example, due to limited accuracy in manufacturing. In the present exemplary embodiment, however, the vertical rotation angle of microlens array 103 relative to image sensor 104 is intended to be a rotation angle greater than or equal to about 1 degree, regardless of the limited accuracy in manufacturing.

Furthermore, optimum rotation angles have also been found for microlens array 103 in other configurations. FIG. 11 shows mean values of the cost function for vertical rotation angles of another microlens array relative to the image sensor according to the first exemplary embodiment. FIG. 12 shows mean values of the cost function for vertical rotation angles of yet another microlens array relative to the image sensor according to the first exemplary embodiment.

FIG. 11 shows mean values of the cost function for vertical rotation angles of microlens array 103 relative to image sensor 104, where the microlenses of microlens array 103 each have a diameter of about 17 pixels and are arranged in a honeycomb structure, and the pixels of image sensor 104 are arranged in a Bayer array. It has been found that the mean values of the cost function reach local minima when the vertical rotation angle of microlens array 103 relative to image sensor 104 is about 6.6 degrees or about 23.4 degrees, and the resolution of the refocused image is improved.

FIG. 12 shows mean values of the cost function for vertical rotation angles of microlens array 103 relative to image sensor 104, where the microlenses of microlens array 103 each have a diameter of about 16 pixels and are arranged in a honeycomb structure, and the pixels of image sensor 104 are arranged in a Bayer array. It has been found that the mean values of the cost function reach local minima when the vertical rotation angle of microlens array 103 relative to image sensor 104 is about 3.9 degrees, about 13.8 degrees, about 16.2 degrees, or about 26.1 degrees, and the resolution of the refocused image is improved.

[1-4. Advantageous Effects]

As described above, the image capture device of the present disclosure, which is capable of recording light information including the traveling direction of light and the intensity of the light in the traveling direction, includes the main lens, the image sensor, the microlens array that is placed between the main lens and the image sensor and has a predetermined vertical rotation angle relative to the image sensor, and the signal processing unit for generating a refocused image on the virtual image plane at any given focal position using the light information.

The image capture device of the present disclosure can thereby improve the image resolution when the refocused image is generated by reconstructing an image.

The image capture device of the present disclosure is applicable to a light-field camera and, in particular, to light-field cameras for use in a vehicle camera, surveillance camera, digital camera, movie camera, wearable camera, etc.

Claims

1. An image capture device capable of recording light information including a traveling direction of light and intensity (of the light) in the traveling direction, the image capture device comprising:

a main lens;
an image sensor;
a microlens array that is placed between the main lens and the image sensor and has a predetermined vertical rotation angle relative to the image sensor; and
a signal processing unit for generating a refocused image on a virtual image plane at any given focal position using the light information,
wherein the predetermined vertical rotation angle is determined by searching for a local minimum of a cost function that evaluates a distance between a position of a ray center and a position of a pixel that constructs the refocused image, where the position of the ray center is a point at which a light ray projected from the image sensor onto the virtual image plane in a direction of the light ray intersects the virtual image plane.

2. The image capture device according to claim 1,

wherein the predetermined vertical rotation angle is greater than or equal to about 1.0 degrees.

3. The image capture device according to claim 1,

wherein the any given focal position is in a range from −5 to +5 in a ratio relative to a distance between the microlens array and the image sensor.
Patent History
Publication number: 20160309074
Type: Application
Filed: Jun 28, 2016
Publication Date: Oct 20, 2016
Inventor: Tomohide ISHIGAMI (Osaka)
Application Number: 15/194,694
Classifications
International Classification: H04N 5/232 (20060101); H04N 5/225 (20060101); G02B 27/10 (20060101); H04N 5/369 (20060101);