Method, apparatus and system providing an integrated hyperspectral imager
Methods, apparatuses, and systems are disclosed which provide a plurality of pixel arrays on a common substrate, each associated with a hyperspectral filter. Images from each of the arrays may be separately analyzed or combined into a composite image.
This application is a continuation-in-part of U.S. application Ser. No. 11/367,580 filed Mar. 6, 2006, the disclosure of which is incorporated by reference herein.
FIELD OF THE INVENTION
Embodiments of the invention relate generally to the field of semiconductor devices and more particularly to multi-array image sensors.
BACKGROUND OF THE INVENTION
Image sensors, such as multispectral image sensors, generally produce images with a few relatively broad wavelength bands from a wavelength of about 400 nm to about 700 nm. These bands typically correlate to the red, green and blue color filters (RGB) of a Bayer patterned color filter array (described below) used in the image sensor. Hyperspectral image sensors, on the other hand, simultaneously collect image data in dozens or hundreds of narrow, adjacent hyperspectral bands. Hyperspectral sensors create a larger number of images from contiguous, rather than disjoint, regions of the spectrum, typically with much finer resolution than can be obtained with a multispectral image sensor. Hyperspectral imaging involves acquisition of image data in many contiguous narrow hyperspectral bands, the goal being to produce laboratory quality reflectance spectra for each pixel in an image.
The semiconductor industry currently produces different types of semiconductor-based image sensors, such as charge coupled devices (CCDs), CMOS active pixel sensors (APS), photodiode arrays, charge injection devices and hybrid focal plane arrays, among others. These image sensors use imaging lenses to focus electromagnetic radiation onto photo-conversion devices, e.g., photodiodes. Also, these image sensors can use color filters to pass particular wavelengths of electromagnetic radiation to the photo-conversion devices, such that the photo-conversion devices typically are associated with a particular color.
Between the color filter array 30 and the pixel cells 10 is a passivation layer 6 which typically covers the gate structure of transistors of the pixels and an overlying interlayer dielectric (ILD) region 3. The ILD region 3 typically includes multiple layers of interlayer dielectrics and conductors that form connections between devices of the pixel cells 10 and from the pixel cells 10 to circuitry 150 peripheral to the array 100. A dielectric layer 5 may also be provided between the color filter array 30 and imaging lenses 20.
As discussed above, a hyperspectral image sensor or camera system relies on many narrow hyperspectral bandpass filters to capture the hyperspectral image content of a scene. The hyperspectral bandpass filters may be applied to an imaging system of the type illustrated in
Therefore, it would be advantageous to have a compact, integrated hyperspectral image sensor that better captures the hyperspectral image content of a scene.
In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to make and use them, and it is to be understood that structural, logical or procedural changes may be made to the specific embodiments disclosed herein.
The embodiments described herein relate to methods, apparatuses, and systems for integrating a plurality of pixel arrays onto a single substrate, each having an associated hyperspectral bandpass filter and imaging lens. Each imaging array and its associated lens and bandpass filter form an image of objects in a scene. If the image sensor arrays and lenses are in close proximity, on the order of less than 2 mm, e.g., about 0.5 mm to about 1 mm, then object parallax will be quite small and each array will image the same scene for objects that are all farther than 1 m from the arrays. A hyperspectral image may be constructed from individual images respectively acquired by the pixel arrays such that each pixel in the image contains a hyperspectral representation of reflectance of that point in the scene.
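As a minimal sketch of this construction, assuming NumPy and already-registered per-band images (the function name and image dimensions are illustrative, not from the disclosure), the individual array images can be stacked into a cube whose last axis is the spectral dimension:

```python
import numpy as np

def build_hyperspectral_cube(band_images):
    """Stack per-band images from the individual pixel arrays into an
    H x W x n cube, so cube[y, x, :] is the reflectance spectrum of
    scene point (y, x).  Assumes the images are already registered
    (object distance large enough that parallax is negligible)."""
    shapes = {img.shape for img in band_images}
    if len(shapes) != 1:
        raise ValueError("all band images must have the same shape")
    return np.stack(band_images, axis=-1)

# Illustrative 16-band example, matching a 4x4 layout of arrays.
bands = [np.random.rand(600, 800) for _ in range(16)]
cube = build_hyperspectral_cube(bands)
spectrum = cube[100, 200, :]   # 16-sample spectrum for one scene point
```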
Referring to
As noted, the pixel arrays 203 preferably are integrated on a single silicon die substrate with common circuitry. Including a multi-array color image sensor on a single die provides for a reduction of color crosstalk artifacts, especially for compact camera modules with pixel sizes less than 6 microns by 6 microns. Moreover, an imaging lens with a short focal length can minimize parallax effects and allow a camera module employing image sensor 200 to be more compact.
Although
Because the image sensor 200 employs hyperspectral imaging, the image data is simultaneously collected in dozens or hundreds of narrow, adjacent hyperspectral bands, as illustrated in
Each image I1 . . . In from a respective array represents a hyperspectral image from a respective pixel array 203 (
As an object to be imaged moves closer to the array of imaging lenses 204, the individual arrays 203 will exhibit an increase in parallax distance between them. The magnitude of the parallax distance between two adjacent arrays is approximated by the following formula:
Px=(N/x)*dx (1)
where N is the pixel array dimension (in pixels), dx is the distance between the outer focal points of lenses 204 and where x is approximated by the following formula:
x=2*O*Tan(α/2) (2)
where O is the object distance from the camera (or distance from the object to be imaged and the imaging lenses 204) and α is the field of view (FOV) angle. Once the object distance O has been measured or approximated, the parallax distance calculation can be performed. An example for calculating parallax is discussed below. This example also shows that small pixel arrays (on the order of about 1 mm and spaced less than 2 mm apart, e.g., from about 0.5 mm to about 1 mm) will not produce excessive parallax.
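Formulas (1) and (2) can be sketched in Python as follows; the numeric values in the example are illustrative assumptions, not figures from the disclosure:

```python
from math import tan, radians

def parallax_px(N, dx, O, fov_deg):
    """Parallax distance in pixels between two adjacent arrays,
    per formulas (1) and (2): x = 2*O*tan(alpha/2), Px = (N/x)*dx.
    N  -- pixel array dimension (pixels)
    dx -- distance between outer focal points of adjacent lenses
    O  -- object distance (same length unit as dx)
    """
    x = 2.0 * O * tan(radians(fov_deg) / 2.0)   # formula (2)
    return (N / x) * dx                          # formula (1)

# Assumed illustrative values: 800-pixel-wide arrays spaced 1 mm apart,
# object 2 m away, 60-degree field of view.
px = parallax_px(N=800, dx=1.0, O=2000.0, fov_deg=60.0)  # ~0.35 pixels
```

As expected, moving the object closer (smaller O) shrinks x and increases the parallax.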
For an object distance O of about 2 m from a camera and a field of view angle of 60°, the parallax distance Px at the object plane between adjacent pixel arrays 203 would equal 0.78 pixels using formulas (1) and (2) above. Thus, for an image sensor with a 4×4 arrangement of imaging arrays (i.e., a 4×4 image sensor), the maximum parallax (at 60° FOV) between images of objects at a distance of 2 m or more from the camera would equal 2.35 pixels. Such a 4×4 image sensor would have 16 hyperspectral bands, a pixel element size of 2.2 μm×2.2 μm and a target hyperspectral image size of 800×600 pixels.
Various image sensor configurations allow tradeoffs in performance. For example, a 3×3 image sensor will have less maximum parallax than a 4×4 image sensor, but will have only 9 hyperspectral bands. A larger FOV will have less parallax but the user must get closer to objects they want to isolate in a scene. Further, smaller pixels may be used to reduce maximum parallax with a tradeoff of a reduction in camera sensitivity.
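To make the tradeoff concrete: the worst-case parallax in an M×M layout is between the two outermost arrays, whose lenses sit (M-1)·dx apart, so the adjacent-array parallax of formulas (1) and (2) scales by (M-1). A small sketch, with illustrative parameter values assumed rather than taken from the disclosure:

```python
from math import tan, radians

def max_parallax(M, N, dx, O, fov_deg):
    """Worst-case parallax (in pixels) between the outermost arrays of
    an M x M layout: the adjacent-array parallax of formulas (1)-(2),
    scaled by (M-1) because the outer lenses are (M-1)*dx apart."""
    x = 2.0 * O * tan(radians(fov_deg) / 2.0)
    return (N / x) * dx * (M - 1)

# Assumed illustrative parameters: 800-pixel arrays, 1 mm lens spacing,
# 2 m object distance, 60-degree FOV.
for M in (3, 4):
    mp = max_parallax(M, N=800, dx=1.0, O=2000.0, fov_deg=60.0)
    print(f"{M}x{M} sensor: {M * M} bands, max parallax {mp:.2f} px")
```

With these values the 3×3 layout trades fewer bands (9 versus 16) for a smaller worst-case parallax, matching the tradeoff described above.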
The CMOS image sensor 500 is operated by a control circuit 530, which controls address decoders 515, 525 for selecting the appropriate row and column select lines for pixel readout. Control circuit 530 also controls the row and column driver circuitry 510, 520 so that they apply driving voltages to the drive transistors of the selected row and column select lines. The pixel output signals typically include a pixel reset signal Vrst, read out of the storage region after it is reset by the reset transistor, and a pixel image signal Vsig, read out of the storage region after photo-generated charges are transferred to the region. The Vrst and Vsig signals are sampled by a sample and hold circuit 535 and subtracted by a differential amplifier 540 to produce a differential signal Vrst-Vsig for each readout pixel, which represents the amount of light impinging on the pixel. This difference signal is digitized by an analog-to-digital converter 545. In some arrangements, the differential signal Vrst-Vsig can be amplified as a differential signal and directly digitized by a differential analog-to-digital converter. The analog-to-digital converter 545 supplies the digitized pixel signals to an image processor 550, which performs appropriate image processing, which can include combining the outputs of multiple arrays and performing a parallax adjustment calculation if needed or desired, before providing digital signals defining an image output. The digitizing and image processing can be located on or off the imaging device chip.
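The correlated double sampling and digitization steps described above can be sketched as follows; the voltage levels, ADC reference, and bit depth are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def correlated_double_sample(v_rst, v_sig, adc_bits=10, v_ref=1.0):
    """Sketch of the readout chain described above: the differential
    signal Vrst - Vsig (proportional to light on the pixel) is formed,
    then quantized by an ADC of the given bit depth and reference."""
    diff = v_rst - v_sig                      # differential amplifier 540
    norm = np.clip(diff / v_ref, 0.0, 1.0)    # normalize to ADC input range
    return np.round(norm * (2**adc_bits - 1)).astype(int)  # ADC 545

# Assumed example: reset level 0.9 V; photo-generated charge pulls Vsig
# lower, so the difference (and the digital code) grows with illumination.
v_rst = np.full(4, 0.9)
v_sig = np.array([0.9, 0.7, 0.5, 0.1])
codes = correlated_double_sample(v_rst, v_sig)
```

Subtracting the reset level per pixel in this way cancels the pixel-to-pixel reset offset, which is the point of sampling Vrst and Vsig separately.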
It should be noted that
The processor system 900, for example a digital camera system, generally comprises a central processing unit (CPU) 995, such as a microprocessor for common operational control, that communicates with an input/output (I/O) device 991 over a bus 993. Image sensor 500 also communicates with the CPU 995 over the bus 993. The processor-based system 900 also includes random access memory (RAM) 992, and can include removable memory 994, such as flash memory, which also communicate with CPU 995 over the bus 993. Image sensor 500 may be combined with a processor, such as a CPU, digital signal processor, or microprocessor, with or without memory storage on a single integrated circuit or on a different chip than the processor. A parallax adjustment calculation may be performed by the image processor 550 in image sensor 500, or by the CPU 995.
While the embodiments have been described in detail in connection with the embodiments known at the time, it should be readily understood that the claimed invention is not limited to the disclosed embodiments. Rather, they can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described. For example, while embodiments are described in connection with a CMOS pixel image sensor, they can be practiced with any other type of pixel image sensor (e.g., CCD, etc.). Furthermore, the various embodiments could be used in automotive applications and other applications, such as machine vision or industrial imaging, where the object plane is at a constant distance; in such cases, parallax is easily accounted for by using a simple linear shift of pixel data from each camera to properly register all images. In addition, various embodiments may be used in low-cost, solid-state hyperspectral scanners, in which multiple hyperspectral cameras in one scanning system may produce high-resolution images very quickly.
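For a constant object distance, the linear-shift registration mentioned above might be sketched as follows; NumPy is assumed, np.roll stands in for a proper shift-and-crop, and the shift values are illustrative, not derived from the disclosure:

```python
import numpy as np

def register_by_shift(images, shifts):
    """Register each array's image with a simple linear shift, as is
    possible when the object plane is at a fixed, known distance.
    `shifts` gives each image's (row, col) parallax offset in pixels,
    assumed known from the camera geometry.  A real system would crop
    to the common overlapping region; np.roll is used here for brevity."""
    return [np.roll(img, (-dy, -dx), axis=(0, 1))
            for img, (dy, dx) in zip(images, shifts)]

# Assumed illustrative case: two bands of the same scene, the second
# offset by 3 columns of parallax.
scene = np.arange(100.0).reshape(10, 10)
shifted = np.roll(scene, (0, 3), axis=(0, 1))
a, b = register_by_shift([scene, shifted], [(0, 0), (0, 3)])
# After registration, the two band images line up again.
```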
Claims
1. An imager apparatus comprising:
- a plurality of pixel arrays on a single die; and
- a plurality of imaging lenses for focusing an image on the pixel arrays, each of the pixel arrays being associated with a respective hyperspectral bandpass filter.
2. The imager apparatus of claim 1, wherein each of the hyperspectral filters has a spectral bandpass of less than 100 nm.
3. The imager apparatus of claim 2, wherein each of the hyperspectral filters has a spectral bandpass of between 50 nm and 100 nm.
4. The imager apparatus of claim 2, wherein each of the hyperspectral filters has a spectral bandpass of less than 10 nm.
5. The imager apparatus of claim 4, wherein each of the hyperspectral filters has a spectral bandpass of between 5 nm and 10 nm.
6. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.
7. The imager apparatus of claim 6, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.
8. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.
9. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.
10. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter has a unique bandpass range.
11. The imager apparatus of claim 1, wherein the hyperspectral bandpass filters collectively cover a spectrum of between about 700 nm and about 1000 nm.
12. The imager apparatus of claim 1, wherein the hyperspectral bandpass filters collectively cover a spectrum of between about 200 nm and about 400 nm.
13. The imager apparatus of claim 1, wherein the plurality of pixel arrays comprise M×N pixel arrays arranged in an M×N pattern of arrays on the die.
14. The imager apparatus of claim 13, wherein M equals N.
15. The imager apparatus of claim 13, wherein M does not equal N.
16. The imager apparatus of claim 1, wherein the pixel arrays are spaced less than 2 mm apart.
17. The imager apparatus of claim 16, wherein the pixel arrays are spaced about 0.5 mm to about 1 mm apart.
18. The imager apparatus of claim 16, wherein the imaging lenses and associated hyperspectral bandpass filters are configured to capture a full image of a scene.
19. The imager apparatus of claim 1, further comprising an optical barrier between adjacent pixel arrays.
20. The imager apparatus of claim 1, further comprising an optical barrier between adjacent imaging lenses.
21. An imager apparatus comprising:
- a plurality of pixel arrays formed on a single die, wherein the plurality of arrays are spaced apart by less than 2 millimeters;
- a plurality of hyperspectral bandpass filters respectively associated with the pixel arrays, each of the hyperspectral filters having a spectral bandpass of less than 100 nm; and
- a plurality of imaging lenses respectively associated with the pixel arrays.
22. The imager apparatus of claim 21, wherein each of the hyperspectral filters has a spectral bandpass of between 50 nm and 100 nm.
23. The imager apparatus of claim 21, wherein each of the hyperspectral filters has a spectral bandpass of less than 10 nm.
24. The imager apparatus of claim 23, wherein each of the hyperspectral filters has a spectral bandpass of between 5 nm and 10 nm.
25. The imager apparatus of claim 21, wherein each imaging lens has a hyperspectral bandpass filter associated with its respective imaging lens.
26. The imager apparatus of claim 25, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.
27. The imager apparatus of claim 26, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.
28. The imager apparatus of claim 25, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.
29. The imager apparatus of claim 21, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.
30. The imager apparatus of claim 21, wherein each hyperspectral bandpass filter has a unique bandpass range.
31. The imager apparatus of claim 21, wherein the plurality of pixel arrays comprise M×N pixel arrays arranged in an M×N pattern of arrays on the die.
32. The imager apparatus of claim 31, wherein M equals N.
33. The imager apparatus of claim 31, wherein M does not equal N.
34. The imager apparatus of claim 21, wherein the pixel arrays and the imaging lenses are spaced about 0.5 mm to about 1 mm apart.
35. The imager apparatus of claim 34, wherein the imaging lenses and associated hyperspectral bandpass filters are configured to capture a full image of a scene.
36. An imager apparatus comprising:
- an image sensor comprising:
- a plurality of pixel arrays;
- a plurality of imaging lenses respectively arranged above the pixel arrays; and
- a hyperspectral bandpass filter associated with each imaging lens, wherein each hyperspectral bandpass filter is unique for each imaging lens; and
- a readout circuit for reading out respective image signals from each of the arrays.
37. The imager apparatus of claim 36, further comprising:
- an image combining device for combining the respective image signals from each of the pixel arrays.
38. The imager apparatus of claim 37, wherein the image combining device performs a parallax adjustment calculation.
39. The imager apparatus of claim 37, wherein a common readout circuit is provided for all of the pixel arrays.
40. The imager apparatus of claim 37, wherein separate readout circuits are provided for each of the pixel arrays.
41. The imager apparatus of claim 36, wherein the plurality of pixel arrays are formed on a single die.
42. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.
43. The imager apparatus of claim 42, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.
44. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.
45. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.
46. The imager apparatus of claim 36, wherein the pixel arrays are spaced less than 2 mm apart.
47. The imager apparatus of claim 46, wherein the pixel arrays are spaced about 0.5 mm to about 1 mm apart.
48. The imager apparatus of claim 36, wherein the imager apparatus is a component of a digital camera.
Type: Application
Filed: Dec 21, 2006
Publication Date: Sep 6, 2007
Applicant:
Inventor: Scott Smith (Los Angeles, CA)
Application Number: 11/642,867