ASYMMETRIC ANGULAR RESPONSE PIXELS FOR SINGLE SENSOR STEREO
Depth sensing imaging pixels include pairs of left and right pixels forming an asymmetrical angular response to incident light. A single microlens is positioned above each pair of left and right pixels. Each microlens spans across each of the pairs of pixels in a horizontal direction. Each microlens has a length that is substantially twice the length of either the left or right pixel in the horizontal direction; and each microlens has a width that is substantially the same as a width of either the left or right pixel in a vertical direction. The horizontal and vertical directions are horizontal and vertical directions of a planar image array. A light pipe in each pixel is used to improve light concentration and reduce cross talk.
This application is a continuation of patent application Ser. No. 13/404,319, filed Feb. 24, 2012, which claims the benefit of provisional patent application No. 61/522,876, filed Aug. 12, 2011, both of which are hereby incorporated by reference herein in their entireties. This application claims the benefit of, and priority to, patent application Ser. No. 13/404,319 and provisional patent application No. 61/522,876.
FIELD OF THE INVENTION

The present invention relates, in general, to imaging systems. More specifically, the present invention relates to imaging systems with depth sensing capabilities and stereo perception that use only a single sensor with a single lens.
BACKGROUND OF THE INVENTION

Modern electronic devices such as cellular telephones, cameras, and computers often use digital image sensors. Imagers (i.e., image sensors) may be formed from a two-dimensional array of image sensing pixels. Each pixel receives incident photons (light) and converts the photons into electrical signals.
Some applications, such as three-dimensional (3D) imaging, may require electronic devices to have depth sensing capabilities. For example, to properly generate a 3D image for a given scene, an electronic device may need to identify the distances between the electronic device and objects in the scene. To identify distances, conventional electronic devices use complex arrangements. Some arrangements require multiple cameras with multiple image sensors and lenses that capture images from various viewpoints, which increases the cost and complexity of obtaining good stereo imaging performance. Other arrangements require the addition of lenticular arrays that focus incident light on sub-regions of a two-dimensional pixel array. Because of the added components, such as complex lens arrays, these arrangements lead to reduced spatial resolution and increased cost and complexity.
The present invention, as will be explained, provides an improved imager that obtains stereo performance using a single sensor with a single lens. Such an imager reduces complexity and cost while improving stereo imaging performance.
The invention may be best understood from the following detailed description when read in connection with the accompanying figures:
An electronic device with a digital camera module is shown in
Still and video image data from camera sensor 14 may be provided to image processing and data formatting circuitry 16 via path 26. Image processing and data formatting circuitry 16 may be used to perform image processing functions such as data formatting, adjusting white balance and exposure, implementing video image stabilization, face detection, etc. Image processing and data formatting circuitry 16 may also be used to compress raw camera image files, if desired (e.g., to Joint Photographic Experts Group, or JPEG format). In a typical arrangement, which is sometimes referred to as a system-on-chip, or SOC arrangement, camera sensor 14 and image processing and data formatting circuitry 16 are implemented on a common integrated circuit. The use of a single integrated circuit to implement camera sensor 14 and image processing and data formatting circuitry 16 may help to minimize costs.
Camera module 12 (e.g., image processing and data formatting circuitry 16) conveys acquired image data to host subsystem 20 over path 18. Electronic device 10 typically provides a user with numerous high-level functions. In a computer or advanced cellular telephone, for example, a user may be provided with the ability to run user applications. To implement these functions, host subsystem 20 of electronic device 10 may have input-output devices 22, such as keypads, input-output ports, joysticks, displays, and storage and processing circuitry 24. Storage and processing circuitry 24 may include volatile and nonvolatile memory (e.g., random-access memory, flash memory, hard drives, solid state drives, etc.). Storage and processing circuitry 24 may also include microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, etc.
It may be desirable to form image sensors with depth sensing capabilities (e.g., for use in 3D imaging applications, such as machine vision applications and other three dimensional imaging applications). To provide depth sensing capabilities, camera sensor 14 may include pixels such as pixels 100A, and 100B, shown in
Microlens 102 may direct incident light towards a substrate area between pixel separators 112. Color filter 104 may filter the incident light by only allowing predetermined wavelengths to pass through color filter 104 (e.g., color filter 104 may only be transparent to wavelengths corresponding to a green color). Photo-sensitive areas 110A and 110B may serve to absorb incident light focused by microlens 102 and produce image signals that correspond to the amount of incident light absorbed.
A pair of pixels 100A and 100B may be covered by one microlens 102. Thus, the pair of pixels may be provided with an asymmetric angular response (e.g., pixels 100A and 100B may produce different image signals based on the angle at which incident light reaches pixels 100A and 100B). The angle at which incident light reaches pixels 100A and 100B may be referred to herein as an incident angle, or angle of incidence.
In the example of
In the example of
Due to the special formation of the microlens, pixels 100A and 100B may have an asymmetric angular response (e.g., pixels 100A and 100B may produce different signal outputs for incident light of a given intensity, depending on the angle of incidence). In the diagram of
Incident light 113 that reaches pair of pixels 100A and 100B may have an angle of incidence that is approximately equal for both pixels. In the arrangement of
The respective output image signals for pixel pair 200 (e.g., pixels 100A and 100B) are shown in
Line 164 of
Pixel pairs 200 may be used to form imagers with depth sensing capabilities.
In the arrangement of
In the arrangement of
In the arrangement of
The arrangements of
The output image signals of each pixel pair 200 of image sensor 14 may depend on the distance from camera lens 202 to object 204. The angle at which incident light reaches pixel pairs 200 of image sensor 14 depends on the distance between lens 202 and objects in a given scene (e.g., the distance between objects such as object 204 and device 10).
An image depth signal may be calculated from the difference between the two output image signals of each pixel pair 200. The diagram of
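The subtraction described above can be sketched in a few lines. This is an illustrative sketch only, not code from the patent; the names `signal_100a` and `signal_100b` are assumptions standing in for the two output image signals of a pixel pair 200.

```python
# Illustrative sketch: an image depth signal computed as the difference
# between the two output image signals of a depth sensing pixel pair.
# Argument names are assumed for illustration; they echo pixels 100A/100B.
def image_depth_signal(signal_100a: float, signal_100b: float) -> float:
    """Difference of the pair's outputs; it varies with the incident angle."""
    return signal_100a - signal_100b
```

Because the two pixels of a pair respond asymmetrically to the angle of incidence, the sign and magnitude of this difference carry the angular (and hence depth) information.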
For distances greater than D4 or less than D3, the image depth signal may remain constant. Pixels 100A and 100B may be unable to resolve incident angles with magnitudes larger than those provided by objects at distances greater than D4 or less than D3. In other words, a depth sensing imager may be unable to accurately measure depth information for objects at distances greater than D4 or less than D3. The depth sensing imager may be unable to distinguish whether an object is at a distance D4 or a distance D5 (as an example). If desired, the depth sensing imager may assume that all objects that produce an image depth signal equivalent to distance D3 or D4 are at a distance of D3 or D4, respectively.
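The saturation behavior above can be modeled as a calibration lookup with clamping at the endpoints. The following is a minimal sketch under stated assumptions: the calibration table, its values, and the linear interpolation between calibration points are illustrative choices, not details given by the patent.

```python
# Illustrative sketch: mapping an image depth signal to an object distance
# via a calibration table. Signals outside the table's span clamp to the
# nearest endpoint, mirroring the imager's limited resolvable range
# (objects nearer than D3 or farther than D4 all read the same).
def estimate_distance(depth_signal: float,
                      lut: list[tuple[float, float]]) -> float:
    """lut: (depth_signal, distance) pairs, sorted by depth_signal."""
    if depth_signal <= lut[0][0]:
        return lut[0][1]          # clamp: cannot resolve nearer objects
    if depth_signal >= lut[-1][0]:
        return lut[-1][1]         # clamp: cannot resolve farther objects
    # linear interpolation between the bracketing calibration points
    for (s0, d0), (s1, d1) in zip(lut, lut[1:]):
        if s0 <= depth_signal <= s1:
            t = (depth_signal - s0) / (s1 - s0)
            return d0 + t * (d1 - d0)
```

Clamping rather than extrapolating reflects the assumption, stated above, that all out-of-range objects are reported at the nearest resolvable distance.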
To provide an imager 14 with depth sensing capabilities, two dimensional pixel arrays 201 may be formed from various combinations of depth sensing pixel pairs 200 and regular pixels (e.g., pixels without asymmetric angular responses). For a more comprehensive description of two dimensional pixel arrays 201, with depth sensing capabilities and with regular pixels (e.g., pixels without asymmetric angular responses), reference is made to application Ser. No. 13/188,389, filed on Jul. 21, 2011, titled Imagers with Depth Sensing Capabilities, having common inventors. That application is incorporated herein by reference in its entirety.
It should be understood that the depth sensing pixels may be formed with any desirable types of color filters. Depth sensing pixels may be formed with red color filters, blue color filters, green color filters, or color filters that pass other desirable wavelengths of light, such as infrared and ultraviolet light wavelengths. If desired, depth sensing pixels may be formed with color filters that pass multiple wavelengths of light. For example, to increase the amount of light absorbed by a depth sensing pixel, the depth sensing pixel may be formed with a color filter that passes many wavelengths of light. As another example, the depth sensing pixel may be formed without a color filter (sometimes referred to as a clear pixel).
Referring now to
Several pixel pairs 302 are shown in
Referring now to
Disposed between each color filter 312 and each pixel of pixel pair 316A and 316B are two light pipes (LPs), one per pixel. Each LP improves the concentration of light impinging upon its respective pixel. The LPs not only improve light concentration but also reduce cross-talk, ensuring good three dimensional performance even at very small pixel pitches, such as 1.4 microns or less.
As shown on the left side of
It will now be understood that an asymmetric angular response stereo sensor is provided by the present invention. By having a 2×1 CFA pattern, as shown in
For example, the first pixel pair provides a green color; when the pair is separated into left and right images, the present invention provides a single green pixel for the left image and a single green pixel for the right image. Similarly, when the two right pixels providing red colors are separated into left and right images, the present invention forms a left image with a red color and a right image with a red color. Thus, a 2×1 CFA pattern enables the present invention to form a normal Bayer color process for two separate images (left and right Bayer images), as shown in
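The separation of the 2×1 CFA mosaic into left and right Bayer images can be sketched as follows. This is an illustrative sketch, not the patent's method: the convention that even columns hold left pixels and odd columns hold right pixels is an assumption made for the example.

```python
# Illustrative sketch: splitting a 2x1-CFA sensor readout into two
# Bayer mosaics (left and right images). The column convention
# (even = left pixel of each pair, odd = right pixel) is assumed.
def split_stereo_mosaic(mosaic: list[list[int]]
                        ) -> tuple[list[list[int]], list[list[int]]]:
    left = [row[0::2] for row in mosaic]   # left pixel of every pair
    right = [row[1::2] for row in mosaic]  # right pixel of every pair
    return left, right
```

Each output mosaic then carries one sample per color per pair, so a normal Bayer demosaicking process can be run independently on the left and right images.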
Referring next to
In arrangement 1 shown in
In arrangement 2, shown in
Referring again to
Although the invention is illustrated and described herein with reference to specific embodiments, the invention is not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the invention.
Claims
1. An image sensor comprising:
- a pixel pair that includes a first pixel and a second pixel, wherein the first pixel and the second pixel have asymmetrical angular responses to incident light and wherein the first and second pixels are covered by color filter element material of a single color;
- a microlens that spans the pixel pair, wherein the color filter element material is interposed between the microlens and the first and second pixels; and
- image processing circuitry configured to: obtain an image depth signal by using subtraction to determine a difference between an output signal from the first pixel of the pixel pair and an output signal from the second pixel of the pixel pair; and determine a distance to an imaged object based on the image depth signal.
2. The image sensor defined in claim 1, further comprising:
- additional pixels having symmetrical angular responses to the incident light.
3. The image sensor defined in claim 1, wherein the first and second pixels are positioned in the same row of pixels.
4. The image sensor defined in claim 1, wherein the microlens has a width and a length that is longer than the width.
5. The image sensor defined in claim 1, wherein the microlens has a width and a length that is substantially twice as long as the width.
6. The image sensor defined in claim 1, wherein the microlens covers only the first and second pixels.
7. The image sensor defined in claim 1, further comprising:
- a first light pipe formed over the first pixel; and
- a second light pipe formed over the second pixel.
8. An image sensor comprising:
- first and second adjacent photosensitive areas having asymmetrical angular responses to incident light, wherein the first and second photosensitive areas are covered by color filter element material of a single color;
- a microlens that covers the first and second photosensitive areas, wherein the microlens has a width and a length that is longer than the width; and
- image processing circuitry configured to obtain an image depth signal by using subtraction to determine a difference between an output signal from the first photosensitive area and an output signal from the second photosensitive area.
9. The image sensor defined in claim 8, further comprising:
- additional photosensitive areas having symmetrical angular responses to the incident light.
10. The image sensor defined in claim 8, wherein the first and second photosensitive areas are positioned in the same row of photosensitive areas.
11. The image sensor defined in claim 8, wherein the length is substantially twice as long as the width.
12. The image sensor defined in claim 8, wherein the microlens covers only the first and second photosensitive areas.
13. The image sensor defined in claim 8, wherein the image processing circuitry is further configured to:
- determine a distance to an imaged object based on the image depth signal.
14. The image sensor defined in claim 8, further comprising:
- a first light pipe formed over the first photosensitive area; and
- a second light pipe formed over the second photosensitive area.
15. An image sensor comprising:
- a pixel pair that includes a first pixel and a second pixel, wherein the first pixel and the second pixel have asymmetrical angular responses to incident light;
- at least one color filter element of a single color, wherein the at least one color filter element covers both the first and second pixels in the pixel pair;
- a microlens that spans the pixel pair, wherein the microlens is formed over the at least one color filter element; and
- image processing circuitry configured to: obtain an image depth signal by using subtraction to determine a difference between an output signal from the first pixel of the pixel pair and an output signal from the second pixel of the pixel pair; and determine a distance to an imaged object based on the image depth signal.
16. The image sensor defined in claim 15, further comprising:
- additional pixels having symmetrical angular responses to the incident light.
17. The image sensor defined in claim 15, wherein the first and second pixels are positioned in the same row of pixels.
18. The image sensor defined in claim 15, wherein the microlens has a width and a length that is longer than the width.
19. The image sensor defined in claim 15, wherein the microlens has a width and a length that is substantially twice as long as the width.
20. The image sensor defined in claim 15, wherein the microlens covers only the first and second pixels.
Type: Application
Filed: Jun 5, 2018
Publication Date: Oct 4, 2018
Applicant: SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC (Phoenix, AZ)
Inventors: Gennadiy AGRANOV (San Jose, CA), Dongqing CAO (San Jose, CA), Avi YARON (Palo Alto, CA)
Application Number: 15/997,851