Selective Region-Based Focus With Focal Adjustment Bracketing Via Lens / Image Sensor Position Manipulation
An image capturing apparatus constructs a composite image from image regions of images captured at differing focal distances between an image plane of the apparatus's photo-detector image sensor and a subject of the image, providing selective focus, background de-focus, and/or lens blur modes of the image capturing apparatus. Construction of the composite image occurs subsequent to aligning the images captured at the different focal distances, the aligning based in part on registration of one or more visual features common to one or more images of the plurality of images utilized in providing an image with the desired focus, responsive to a selection via a user interface of the image capturing apparatus. Focus bracketing refers to collecting multiple images of the same scene or object while adjusting the image capturing apparatus's focal parameters between captures so as to focus at distances both nearer to and more distant than a desired focus.
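The per-region compositing described above can be illustrated with a minimal sketch: for each tile of the frame, copy pixels from whichever bracketed exposure scores highest on a simple focus measure. The function names (`sharpness`, `compose_sharpest`), the gradient-energy metric, and the plain 2-D-list image representation are illustrative assumptions, not terminology or algorithms from the application itself.

```python
def sharpness(tile):
    """Sum of squared horizontal and vertical pixel differences: a crude
    focus measure that is larger for in-focus (high-contrast) tiles."""
    s = 0
    for r in range(len(tile)):
        for c in range(len(tile[0])):
            if c + 1 < len(tile[0]):
                s += (tile[r][c + 1] - tile[r][c]) ** 2
            if r + 1 < len(tile):
                s += (tile[r + 1][c] - tile[r][c]) ** 2
    return s

def compose_sharpest(stack, tile_size):
    """For each tile position, copy the tile from whichever bracketed
    frame scores highest on the focus measure."""
    rows, cols = len(stack[0]), len(stack[0][0])
    out = [[0] * cols for _ in range(rows)]
    for r0 in range(0, rows, tile_size):
        for c0 in range(0, cols, tile_size):
            def tile_of(img):
                return [row[c0:c0 + tile_size] for row in img[r0:r0 + tile_size]]
            best = max(stack, key=lambda img: sharpness(tile_of(img)))
            for r in range(r0, min(r0 + tile_size, rows)):
                for c in range(c0, min(c0 + tile_size, cols)):
                    out[r][c] = best[r][c]
    return out
```

A real implementation would work on sensor data and a bandpass or Laplacian focus measure; the tile-wise "sharpest source wins" selection is the essential idea.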
If an Application Data Sheet (ADS) has been filed on the filing date of this application, it is incorporated by reference herein. Any applications claimed on the ADS for priority under 35 U.S.C. §§119, 120, 121 or 365(c), and any and all parent, grandparent, great-grandparent, etc. applications of such applications, are also incorporated by reference, including any priority claims made in those applications and any material incorporated by reference, to the extent such subject matter is not inconsistent herewith.
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is related to and/or claims the benefit of the earliest available effective filing date(s) from the following listed application(s) (the “Priority Applications”), if any, listed below (e.g., claims earliest available priority dates for other than provisional patent applications or claims benefits under 35 USC §119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Priority Application(s)). In addition, the present application is related to the “Related Application(s),” if any, listed below.
PRIORITY APPLICATIONS
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 14/108,003, entitled IMAGE CORRECTION USING INDIVIDUAL MANIPULATION OF MICROLENSES IN A MICROLENS ARRAY, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 16 Dec. 2013, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 12/925,848, entitled IMAGE CORRECTION USING INDIVIDUAL MANIPULATION OF MICROLENSES IN A MICROLENS ARRAY, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 28 Oct. 2010, now issued as U.S. Pat. No. 8,643,955 on 4 Feb. 2014, which is currently co-pending or is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 12/072,497, entitled IMAGE CORRECTION USING INDIVIDUAL MANIPULATION OF MICROLENSES IN A MICROLENS ARRAY, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 25 Feb. 2008, now issued as U.S. Pat. No. 7,826,139 on 2 Nov. 2010, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 11/811,356, entitled IMAGE CORRECTION USING A MICROLENS ARRAY AS A UNIT, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 7 Jun. 2007, now issued as U.S. Pat. No. 7,742,233 on 22 Jun. 2010, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 11/804,314, entitled LENS DEFECT CORRECTION, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 15 May 2007, which is abandoned, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 11/498,427, entitled IMAGE CORRECTION USING A MICROLENS ARRAY AS A UNIT, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 2 Aug. 2006, now issued as U.S. Pat. No. 7,259,917 on 21 Aug. 2007, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 11/221,350, entitled IMAGE CORRECTION USING INDIVIDUAL MANIPULATION OF MICROLENSES IN A MICROLENS ARRAY, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 7 Sep. 2005, now issued as U.S. Pat. No. 7,417,797 on 26 Aug. 2008, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 10/764,431, entitled IMAGE CORRECTION USING INDIVIDUAL MANIPULATION OF MICROLENSES IN A MICROLENS ARRAY, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 21 Jan. 2004, now issued as U.S. Pat. No. 6,967,780 on 22 Nov. 2005, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date;
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 10/764,340, entitled IMAGE CORRECTION USING A MICROLENS ARRAY AS A UNIT, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 21 Jan. 2004, now issued as U.S. Pat. No. 7,251,078 on 31 Jul. 2007, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date; and
For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation of U.S. patent application Ser. No. 10/738,626, entitled LENS DEFECT CORRECTION, naming W. Daniel Hillis, Nathan P. Myhrvold, and Lowell L. Wood Jr. as inventors, filed 16 Dec. 2003, now issued as U.S. Pat. No. 7,231,097 on 12 Jun. 2007, and which is an application of which a currently co-pending application is entitled to the benefit of the filing date.
RELATED APPLICATIONS
None.
The United States Patent Office (USPTO) has published a notice to the effect that the USPTO's computer programs require that patent applicants both reference a serial number and indicate whether an application is a continuation, continuation-in-part, or divisional of a parent application. Stephen G. Kunin, Benefit of Prior Filed Application, USPTO Official Gazette Mar. 18, 2003. The USPTO further has provided forms for the Application Data Sheet which allow automatic loading of bibliographic data but which require identification of each application as a continuation, continuation-in-part, or divisional of a parent application. The present Applicant Entity (hereinafter “Applicant”) has provided above a specific reference to the application(s) from which priority is being claimed as recited by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as “continuation” or “continuation-in-part,” for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data entry requirements, and hence Applicant has provided designation(s) of a relationship between the present application and its parent application(s) as set forth above and in any ADS filed in this application, but expressly points out that such designation(s) are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).
If the listings of applications provided above are inconsistent with the listings provided via an ADS, it is the intent of the Applicant to claim priority to each application that appears in the Priority Applications section of the ADS and to each application that appears in the Priority Applications section of this application.
All subject matter of the Priority Applications and the Related Applications and of any and all parent, grandparent, great-grandparent, etc. applications of the Priority Applications and the Related Applications, including any priority claims, is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.
TECHNICAL FIELD
The present application relates, in general, to imaging.
SUMMARY
In one aspect, a method includes but is not limited to: capturing a primary image with a microlens array at a primary position, the microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; determining at least one out-of-focus region of the primary image; capturing another image with at least one microlens of the microlens array at another position; determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present application.
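The five summarized steps (capture primary, find out-of-focus regions, capture at another position, compare region focus, composite) can be sketched as a small pipeline. This is a hedged illustration only: the `capture` callback, the `OUT_OF_FOCUS_THRESHOLD` cutoff, and the rectangular-region representation are all assumed stand-ins, not elements defined by the application.

```python
OUT_OF_FOCUS_THRESHOLD = 100  # assumed focus-measure cutoff, for illustration

def region_sharpness(img, region):
    """Horizontal gradient energy inside a (r0, c0, r1, c1) rectangle."""
    r0, c0, r1, c1 = region
    s = 0
    for r in range(r0, r1):
        for c in range(c0, c1 - 1):
            s += (img[r][c + 1] - img[r][c]) ** 2
    return s

def composite_from_bracketing(capture, regions, positions):
    """capture(position) -> 2-D list of gray levels; regions are tiles."""
    primary = capture(positions[0])                    # primary position
    out = [row[:] for row in primary]
    # step 2: regions of the primary whose focus measure is too low
    soft = [g for g in regions
            if region_sharpness(primary, g) < OUT_OF_FOCUS_THRESHOLD]
    for pos in positions[1:]:                          # other positions
        other = capture(pos)
        for g in list(soft):
            # steps 4-5: replace a soft region when the other image is sharper
            if region_sharpness(other, g) > region_sharpness(primary, g):
                r0, c0, r1, c1 = g
                for r in range(r0, r1):
                    out[r][c0:c1] = other[r][c0:c1]
                soft.remove(g)
    return out
```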
In one or more various aspects, related systems include but are not limited to machinery and/or circuitry and/or programming for effecting the herein referenced method aspects; the machinery and/or circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware configured to effect the foregoing referenced method aspects depending upon the design choices of the system designer.
In one aspect, a system includes but is not limited to: a photo-detector array; a microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; a controller configured to position at least one microlens of the microlens array at a primary and another position relative to the photo-detector array and to cause an image capture signal at the primary and the other position; and an image construction unit configured to construct at least one out-of-focus region of a first image captured at the primary position with a more in-focus region of another image captured at the other position. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present application.
In one aspect, a system includes but is not limited to: a microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; an electro-mechanical system configurable to capture a primary image with at least one microlens of the microlens array at a primary position said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to capture another image with the at least one microlens of the microlens array at another position said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to determine at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose 
computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to determine a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; and an electro-mechanical system configurable to construct a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical 
circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present application.
In one aspect, a method includes but is not limited to: capturing a primary image with a microlens array at a primary position, said capturing effected with a photo-detector array having an imaging surface deviation that exceeds a first tolerance from a target surface position; determining at least one out-of-focus region of the primary image; capturing another image with at least one microlens of the microlens array at another position; determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present application.
In one embodiment, a method includes but is not limited to: capturing a primary image with a lens at a primary position, the lens having at least one deviation that exceeds a first tolerance from a target optical property; capturing another image with the lens at another position; determining at least one out-of-focus region of the primary image; determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, various other method embodiments are set forth and described in the text (e.g., claims and/or detailed description) and/or drawings of the present application.
In one or more various embodiments, related systems include but are not limited to electro-mechanical systems (e.g., motors, actuators, circuitry, and/or programming) for effecting the herein referenced method embodiments; the electrical circuitry can be virtually any combination of hardware, software, and/or firmware configured to effect the foregoing referenced method embodiments depending upon the design choices of the system designer.
In one embodiment, a system includes but is not limited to: a photo-detector array; a lens having at least one deviation that exceeds a first tolerance from a target optical property; a controller configured to position said lens at a primary and another position relative to said photo-detector array and to cause an image capture signal at the primary and the other position; and an image construction unit configured to construct at least one out-of-focus region of a first image captured at the primary position with a more in-focus region of another image captured at the other position.
In one aspect, a method includes but is not limited to: capturing a primary image with a microlens array at a primary position, the microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; determining at least one out-of-focus region of the primary image; capturing another image with the microlens array at another position; determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present application.
In one or more various aspects, related systems include but are not limited to machinery and/or circuitry and/or programming for effecting the herein referenced method aspects; the machinery and/or circuitry and/or programming can be virtually any combination of hardware, software, and/or firmware configured to effect the foregoing referenced method aspects depending upon the design choices of the system designer.
In one aspect, a system includes but is not limited to: a microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; means for capturing a primary image with a lens at a primary position; means for determining at least one out-of-focus region of the primary image; means for capturing another image with the lens at another position; means for determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and means for constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present application.
In one aspect, a system includes but is not limited to: a microlens array having at least one microlens deviation that exceeds a first tolerance from a target optical property; an electro-mechanical system configurable to capture a primary image with the microlens array at a primary position said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to capture another image with the microlens array at another position said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to determine at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, 
electrical circuitry having a memory device, and electrical circuitry having a communications device; an electro-mechanical system configurable to determine a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device; and an electro-mechanical system configurable to construct a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image said electro-mechanical system including at least one of electrical circuitry operably coupled with a transducer, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one 
integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry having a general purpose computing device configured by a computer program, electrical circuitry having a memory device, and electrical circuitry having a communications device. In addition to the foregoing, other system aspects are described in the claims, drawings, and text forming a part of the present application.
In one aspect, a method includes but is not limited to: capturing a primary image with a microlens array at a primary position, said capturing effected with a photo-detector array having an imaging surface deviation that exceeds a first tolerance from a target surface position; determining at least one out-of-focus region of the primary image; capturing another image with the microlens array at another position; determining a focus of at least one region of the other image relative to a focus of the at least one out-of-focus region of the primary image; and constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image. In addition to the foregoing, other method aspects are described in the claims, drawings, and text forming a part of the present application.
In addition to the foregoing, various other method and/or system aspects are set forth and described in the text (e.g., claims and/or detailed description) and/or drawings of the present application.
The foregoing is a summary and thus contains, by necessity, simplifications, generalizations, and omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is NOT intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices and/or processes described herein, as defined solely by the claims, will become apparent in the non-limiting detailed description set forth herein.
The use of the same symbols in different drawings typically indicates similar or identical items.
DETAILED DESCRIPTION
With reference to the figures, and with reference now to
Referring now to
Continuing to refer to
With reference now to
Referring again to
With reference again to
Referring again to
With reference again to
Referring again to
With reference again to
Referring again to
With reference again to
Referring now to
With reference again to
Referring again to
With reference again to
In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-steps of correlating a feature of the primary image with a feature of the other image; detecting at least one of size, color, and displacement distortion of at least one of the primary image and the other image; correcting the detected at least one of size, color, and displacement distortion of the at least one of the primary image and the other image; and assembling the composite image using the corrected distortion. In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-step of correcting for motion between the primary and the other image.
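The correlate/detect/correct sub-steps above can be illustrated with a one-dimensional sketch, where a feature profile (e.g., a column-sum of the image) from the other image is correlated against the primary to detect displacement, which is then undone before assembly. The helper names `estimate_shift` and `correct_shift` are hypothetical, and a real implementation would also handle size and color distortion as the implementation describes.

```python
def estimate_shift(primary, other, max_shift):
    """Correlate a feature profile of the other image against the primary
    and return the displacement with the highest agreement."""
    best_shift, best_score = 0, None
    for d in range(-max_shift, max_shift + 1):
        score = 0
        for i in range(len(primary)):
            j = i + d
            if 0 <= j < len(other):
                score += primary[i] * other[j]
        if best_score is None or score > best_score:
            best_shift, best_score = d, score
    return best_shift

def correct_shift(other, d, fill=0):
    """Undo a detected displacement so the two images can be assembled."""
    return [other[i + d] if 0 <= i + d < len(other) else fill
            for i in range(len(other))]
```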
Referring again to
With reference again to
Referring again to
With reference again to
Method step 316 shows the end of the process.
Returning to method step 312, shown is that in the event that the upper limit on microlens array tolerance of at least one lens of the microlens array has not been met or exceeded, the process proceeds to method step 306 and continues as described herein.
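The branch described here (loop back to the capture step while the tolerance limit on lens movement has not been met or exceeded) amounts to a simple control loop. The sketch below is an assumed rendering of that flow; the callback names and the step counter are placeholders, not reference designators from the figures.

```python
def bracketing_loop(move_lens, capture, still_soft, max_steps):
    """Capture at the current position; if out-of-focus regions remain and
    the movement limit has not been met or exceeded, reposition the lens
    and proceed back to the capture step (mirroring the 312 -> 306 branch).
    """
    frames, step = [], 0
    while True:
        frames.append(capture())
        if not still_soft(frames):
            break                      # composite fully in focus: end of process
        step += 1
        if step >= max_steps:          # tolerance limit met or exceeded
            break
        move_lens(step)                # reposition and loop to the capture step
    return frames
```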
Referring now to
With reference now to
With reference to the figures, and with reference now to
Referring now to
Continuing to refer to
Specifically, controller 208A is depicted as controlling the position of lens 204A of lens system 200A (e.g., via use of a feedback control subsystem). Image capture unit 206A is illustrated as receiving image data from photo-detector 102A and receiving control signals from controller 208A. Image capture unit 206A is shown as transmitting captured image information to focus detection unit 210A. Focus detection unit 210A is depicted as transmitting focus data to image construction unit 212A. Image construction unit 212A is illustrated as transmitting a composite image to image store/display unit 214A.
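The data flow just described (controller positioning the lens and triggering captures, with image data flowing from capture unit to focus detection unit to image construction unit to an image store/display unit) can be sketched structurally. The class and method names below are illustrative stand-ins, not the reference designators 206A-214A themselves, and the trivial focus and compose logic is assumed for the sketch.

```python
class FocusDetectionUnit:
    def focus_data(self, frames):
        # crude per-frame focus measure: total absolute pixel difference
        return [sum(abs(b - a) for a, b in zip(f, f[1:])) for f in frames]

class ImageConstructionUnit:
    def compose(self, frames, focus):
        # forward the frame the focus data ranks sharpest
        return frames[focus.index(max(focus))]

class Controller:
    """Positions the lens, triggers captures, and drives the pipeline."""
    def __init__(self, capture, detect, construct, store):
        self.capture, self.detect = capture, detect
        self.construct, self.store = construct, store

    def run(self, positions):
        frames = [self.capture(p) for p in positions]   # image capture unit
        focus = self.detect.focus_data(frames)          # focus detection unit
        composite = self.construct.compose(frames, focus)
        self.store.append(composite)                    # image store/display unit
        return composite
```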
With reference now to
Referring again to
With reference again to
Referring again to
With reference again to
Referring now to
With reference again to
Referring again to
With reference again to
In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-steps of correlating a feature of the primary image with a feature of the other image; detecting at least one of size, color, and displacement distortion of at least one of the primary image and the other image; correcting the detected at least one of size, color, and displacement distortion of the at least one of the primary image and the other image; and assembling the composite image using the corrected distortion. In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-step of correcting for motion between the primary and the other image.
Referring again to
With reference again to
Referring again to
With reference again to
Method step 316A shows the end of the process.
Returning to method step 312A, shown is that in the event that the upper limit on lens tolerance has not been met or exceeded, the process proceeds to method step 306A and continues as described herein.
Referring now to
With reference now to
With reference to the figures, and with reference now to
Referring now to
Continuing to refer to
With reference now to
Referring again to
With reference again to
Referring again to
With reference again to
Referring now to
With reference again to
Referring again to
With reference again to
In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-steps of correlating a feature of the primary image with a feature of the other image; detecting at least one of size, color, and displacement distortion of at least one of the primary image and the other image; correcting the detected at least one of size, color, and displacement distortion of the at least one of the primary image and the other image; and assembling the composite image using the corrected distortion. In yet another implementation, the step of constructing a composite image in response to the at least one region of the other image having a sharper focus relative to the focus of the at least one out-of-focus region of the primary image includes the sub-step of correcting for motion between the primary and the other image.
Referring again to
With reference again to
Referring again to
With reference again to
Method step 316B shows the end of the process.
Returning to method step 312B, shown is that in the event that the upper limit on microlens array tolerance has not been met or exceeded, the process proceeds to method step 306B and continues as described herein.
Referring now to
With reference now to
Those skilled in the art will appreciate that the foregoing specific exemplary processes and/or machines and/or technologies are representative of more general processes and/or machines and/or technologies taught elsewhere herein, such as in the claims filed herewith and/or elsewhere in the present application.
Those having skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware, software, and/or firmware implementations of aspects of systems; the use of hardware, software, and/or firmware is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically-oriented hardware, software, and/or firmware.
In some implementations described herein, logic and similar implementations may include software or other control structures. Electronic circuitry, for example, may have one or more paths of electrical current constructed and arranged to implement various functions as described herein. In some implementations, one or more media may be configured to bear a device-detectable implementation when such media hold or transmit device-detectable instructions operable to perform as described herein. In some variants, for example, implementations may include an update or modification of existing software or firmware, or of gate arrays or programmable hardware, such as by performing a reception of or a transmission of one or more instructions in relation to one or more operations described herein. Alternatively or additionally, in some variants, an implementation may include special-purpose hardware, software, firmware components, and/or general-purpose components executing or otherwise invoking special-purpose components. Specifications or other implementations may be transmitted by one or more instances of tangible transmission media as described herein, optionally by packet transmission or otherwise by passing through distributed media at various times.
Alternatively or additionally, implementations may include executing a special-purpose instruction sequence or invoking circuitry for enabling, triggering, coordinating, requesting, or otherwise causing one or more occurrences of virtually any functional operations described herein. In some variants, operational or other logical descriptions herein may be expressed as source code and compiled or otherwise invoked as an executable instruction sequence. In some contexts, for example, implementations may be provided, in whole or in part, by source code, such as C++, or other code sequences. In other implementations, source or other code implementation, using commercially available tools and/or techniques in the art, may be compiled/implemented/translated/converted into a high-level descriptor language (e.g., initially implementing described technologies in C or C++ programming language and thereafter converting the programming language implementation into a logic-synthesizable language implementation, a hardware description language implementation, a hardware design simulation implementation, and/or other such similar mode(s) of expression). For example, some or all of a logical expression (e.g., computer programming language implementation) may be manifested as a Verilog-type hardware description (e.g., via Hardware Description Language (HDL) and/or Very High Speed Integrated Circuit Hardware Description Language (VHDL)) or other circuitry model which may then be used to create a physical implementation having hardware (e.g., an Application Specific Integrated Circuit). Those skilled in the art will recognize how to obtain, configure, and optimize suitable transmission or computational elements, material supplies, actuators, or other structures in light of these teachings.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution.
Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link (e.g., transmitter, receiver, transmission logic, reception logic, etc.), etc.).
In a general sense, those skilled in the art will recognize that the various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, and/or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
Modules, logic, circuitry, hardware and software combinations, firmware, or so forth may be realized or implemented as one or more general-purpose processors, one or more processing cores, one or more special-purpose processors, one or more microprocessors, at least one Application-Specific Integrated Circuit (ASIC), at least one Field Programmable Gate Array (FPGA), at least one digital signal processor (DSP), some combination thereof, or so forth that is executing or is configured to execute instructions, a special-purpose program, an application, software, code, some combination thereof, or so forth as at least one special-purpose computing apparatus or specific computing component. One or more modules, logic, or circuitry, etc. may, by way of example but not limitation, be implemented using one processor or multiple processors that are configured to execute instructions (e.g., sequentially, in parallel, at least partially overlapping in a time-multiplexed fashion, at least partially overlapping across multiple cores, or a combination thereof, etc.) to perform a method or realize a particular computing machine. For example, a first module may be embodied by a given processor executing a first set of instructions at or during a first time, and a second module may be embodied by the same given processor executing a second set of instructions at or during a second time. Moreover, the first and second times may be at least partially interleaved or overlapping, such as in a multi-threading, pipelined, or predictive processing environment. As an alternative example, a first module may be embodied by a first processor executing a first set of instructions, and a second module may be embodied by a second processor executing a second set of instructions. 
As another alternative example, a particular module may be embodied partially by a first processor executing at least a portion of a particular set of instructions and embodied partially by a second processor executing at least a portion of the particular set of instructions. Other combinations of instructions, a program, an application, software, or code, etc. in conjunction with at least one processor or other execution machinery may be utilized to realize one or more modules, logic, or circuitry, etc. to implement any of the processing algorithms described herein.
Those skilled in the art will recognize that at least a portion of the devices and/or processes described herein can be integrated into a data processing system. Those having skill in the art will recognize that a data processing system generally includes one or more of a system unit housing, a video display device, memory such as volatile or non-volatile memory, processors such as microprocessors or digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices (e.g., a touch pad, a touch screen, an antenna, etc.), and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A data processing system may be implemented utilizing suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
For the purposes of this application, “cloud” computing may be understood as described in the cloud computing literature. For example, cloud computing may be methods and/or systems for the delivery of computational capacity and/or storage capacity as a service. The “cloud” may refer to one or more hardware and/or software components that deliver or assist in the delivery of computational and/or storage capacity, including, but not limited to, one or more of a client, an application, a platform, an infrastructure, and/or a server. The cloud may refer to any of the hardware and/or software associated with a client, an application, a platform, an infrastructure, and/or a server. For example, cloud and cloud computing may refer to one or more of a computer, a processor, a storage medium, a router, a switch, a modem, a virtual machine (e.g., a virtual server), a data center, an operating system, a middleware, a firmware, a hardware back-end, a software back-end, and/or a software application. A cloud may refer to a private cloud, a public cloud, a hybrid cloud, and/or a community cloud. A cloud may be a shared pool of configurable computing resources, which may be public, private, semi-private, distributable, scalable, flexible, temporary, virtual, and/or physical. A cloud or cloud service may be delivered over one or more types of network, e.g., a mobile communication network, and the Internet.
As used in this application, a cloud or a cloud service may include one or more of infrastructure-as-a-service (“IaaS”), platform-as-a-service (“PaaS”), software-as-a-service (“SaaS”), and/or desktop-as-a-service (“DaaS”). As a non-exclusive example, IaaS may include, e.g., one or more virtual server instantiations that may start, stop, access, and/or configure virtual servers and/or storage centers (e.g., providing one or more processors, storage space, and/or network resources on-demand, e.g., EMC and Rackspace). PaaS may include, e.g., one or more software and/or development tools hosted on an infrastructure (e.g., a computing platform and/or a solution stack from which the client can create software interfaces and applications, e.g., Microsoft Azure). SaaS may include, e.g., software hosted by a service provider and accessible over a network (e.g., the software for the application and/or the data associated with that software application may be kept on the network, e.g., Google Apps, SalesForce). DaaS may include, e.g., providing desktop, applications, data, and/or services for the user over a network (e.g., providing a multi-application framework, the applications in the framework, the data associated with the applications, and/or services related to the applications and/or the data over the network, e.g., Citrix). The foregoing is intended to be exemplary of the types of systems and/or methods referred to in this application as “cloud” or “cloud computing” and should not be considered complete or exhaustive.
Those skilled in the art will recognize that it is common within the art to implement devices and/or processes and/or systems, and thereafter use engineering and/or other practices to integrate such implemented devices and/or processes and/or systems into more comprehensive devices and/or processes and/or systems. That is, at least a portion of the devices and/or processes and/or systems described herein can be integrated into other devices and/or processes and/or systems via a reasonable amount of experimentation. Those having skill in the art will recognize that examples of such other devices and/or processes and/or systems might include—as appropriate to context and application—all or part of devices and/or processes and/or systems of (a) an air conveyance (e.g., an airplane, rocket, helicopter, etc.), (b) a ground conveyance (e.g., a car, truck, locomotive, tank, armored personnel carrier, etc.), (c) a building (e.g., a home, warehouse, office, etc.), (d) an appliance (e.g., a refrigerator, a washing machine, a dryer, etc.), (e) a communications system (e.g., a networked system, a telephone system, a Voice over IP system, etc.), (f) a business entity (e.g., an Internet Service Provider (ISP) entity such as Comcast Cable, Qwest, Southwestern Bell, etc.), or (g) a wired/wireless services entity (e.g., Sprint, Cingular, Nextel, etc.), etc.
In certain cases, use of a system or method may occur in a territory even if components are located outside the territory. For example, in a distributed computing context, use of a distributed computing system may occur in a territory even though parts of the system may be located outside of the territory (e.g., relay, server, processor, signal-bearing medium, transmitting computer, receiving computer, etc. located outside the territory). A sale of a system or method may likewise occur in a territory even if components of the system or method are located and/or used outside the territory. Further, implementation of at least part of a system for performing a method in one territory does not preclude use of the system in another territory.
One skilled in the art will recognize that the herein described components (e.g., operations), devices, objects, and the discussion accompanying them are used as examples for the sake of conceptual clarity and that various configuration modifications are contemplated. Consequently, as used herein, the specific exemplars set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken as limiting.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations are not expressly set forth herein for sake of clarity.
The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components, and/or wirelessly interactable, and/or wirelessly interacting components, and/or logically interacting, and/or logically interactable components.
In some instances, one or more components may be referred to herein as “configured to,” “configured by,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that such terms (e.g. “configured to”) can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
This application may make reference to one or more trademarks, e.g., a word, letter, symbol, or device adopted by one manufacturer or merchant and used to identify and distinguish his or her product from those of others. Trademark names used herein are set forth in such language that makes clear their identity, that distinguishes them from common descriptive nouns, that have fixed and definite meanings, and, in many if not all cases, are accompanied by other specific identification using terms not covered by trademark. In addition, trademark names used herein have meanings that are well-known and defined in the literature, and do not refer to products or compounds protected by trade secrets in order to divine their meaning. All trademarks referenced in this application are the property of their respective owners, and the appearance of one or more trademarks in this application does not diminish or otherwise adversely affect the validity of the one or more trademarks. All trademarks, registered or unregistered, that appear in this application are assumed to include a proper trademark symbol, e.g., the circle R or [trade], even when such trademark symbol does not explicitly appear next to the trademark. To the extent a trademark is used in a descriptive manner to refer to a product or process, that trademark should be interpreted to represent the corresponding product or process as of the date of the filing of this patent application.
While particular aspects of the present subject matter described herein have been shown and described, it will be apparent to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from the subject matter described herein and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of the subject matter described herein. It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. 
In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”
With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flows are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past-tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Claims
1.-326. (canceled)
327. An image capturing apparatus capable of providing region-based focus in part using focus bracketing, comprising:
- circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor;
- circuitry configured for aligning the at least one first image and the at least one second image at least partially based on correlating one or more common features present at substantially similar pixel locations of the at least one first image and the at least one second image; and
- circuitry configured for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length.
328. The image capturing apparatus of claim 327, wherein circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus comprises:
- circuitry configured for capturing a primary image with a lens at a primary position; and
- circuitry configured for capturing another image with the at least one lens at another position.
329. The image capturing apparatus of claim 328, wherein circuitry configured for capturing another image with the at least one lens at another position comprises:
- circuitry configured for moving the at least one lens to the another position while the at least one image sensor remains stationary; and
- circuitry configured for capturing the another image with the at least one lens at the another position.
330. The image capturing apparatus of claim 328, wherein circuitry configured for capturing another image with the at least one lens at another position comprises:
- circuitry configured for capturing one or more additional images with the at least one lens at one or more focal positions.
331. The image capturing apparatus of claim 328, wherein circuitry configured for capturing another image with the at least one lens at another position comprises:
- circuitry configured for capturing another image with the at least one lens at another position relative to the at least one image sensor.
332. The image capturing apparatus of claim 331, wherein circuitry configured for capturing another image with the at least one lens at another position relative to the at least one image sensor comprises:
- circuitry configured for moving the at least one image sensor while the at least one lens remains stationary; and
- circuitry configured for capturing the another image with the at least one lens at the another position relative to the at least one image sensor subsequent to moving the at least one image sensor while the at least one lens remains stationary.
333. The image capturing apparatus of claim 327, wherein circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus comprises:
- circuitry configured for capturing a primary image with a lens at a primary position; and
- circuitry configured for capturing another image subsequent to moving the at least one lens to at least another position.
334. The image capturing apparatus of claim 327, wherein circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus comprises:
- circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one photo-detector of the image capturing apparatus.
335. The image capturing apparatus of claim 327, wherein circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus comprises:
- circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via moving the at least one lens to at least one other position while the at least one image sensor remains stationary.
336. The image capturing apparatus of claim 327, wherein circuitry configured for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus comprises:
- circuitry configured for capturing at least one first image at a first focal position relative to an image plane of the at least one image sensor and at least one second image at a second focal position relative to an image plane of the at least one image sensor.
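Claims 333–336 recite focus bracketing by varying the lens-to-sensor separation. As an illustrative sketch of the underlying optics (the claims themselves do not specify a lens model), the thin-lens relation 1/f = 1/s_o + 1/s_i shows how small lens-to-sensor displacements correspond to large changes in the in-focus subject distance; the 50 mm focal length and the bracketed subject distances below are hypothetical values chosen for the example:

```python
def image_distance(focal_mm, subject_mm):
    # Thin-lens equation: 1/f = 1/s_o + 1/s_i  =>  s_i = f*s_o / (s_o - f).
    # s_i is the lens-to-sensor separation that brings the subject into focus.
    return focal_mm * subject_mm / (subject_mm - focal_mm)

# Hypothetical 50 mm lens, bracketing focus at 1 m, 2 m, and 4 m.
for subject in (1000.0, 2000.0, 4000.0):
    print(f"subject {subject / 1000:.0f} m -> lens-to-sensor "
          f"{image_distance(50.0, subject):.2f} mm")
```

Moving the lens (or sensor) by roughly 2 mm in this example sweeps the plane of best focus from 1 m out to 4 m, which is the mechanism claims 333 and 335 exploit while the other element remains stationary.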
337. The image capturing apparatus of claim 327, wherein circuitry configured for aligning the at least one first image and the at least one second image at least partially based on correlating one or more common features present at substantially similar pixel locations of the at least one first image and the at least one second image comprises:
- circuitry configured for mapping at least one region of the at least one first image with at least one region of the at least one second image.
338. The image capturing apparatus of claim 337, wherein circuitry configured for mapping at least one region of the at least one first image with at least one region of the at least one second image comprises:
- circuitry configured for mapping at least one out-of-focus region of the at least one first image to at least one corresponding region of the at least one second image.
339. The image capturing apparatus of claim 327, wherein circuitry configured for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length comprises:
- circuitry configured for constructing the at least one composite image in response to the at least one region of the at least one first image having a sharper focus relative to the focus of the at least one region of the at least one second image.
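Claim 339 keys the blend on one region "having a sharper focus relative to" the corresponding region of the other bracket. One common proxy for regional sharpness — an illustrative choice, not one mandated by the claims — is the variance of a discrete Laplacian, since in-focus regions retain more high-frequency content:

```python
import numpy as np

def laplacian_variance(region):
    # 4-neighbour discrete Laplacian over the region interior; the variance
    # of the response serves as a focus score (larger = sharper).
    r = region.astype(float)
    lap = (-4.0 * r[1:-1, 1:-1]
           + r[:-2, 1:-1] + r[2:, 1:-1]
           + r[1:-1, :-2] + r[1:-1, 2:])
    return lap.var()

def sharper_region(region_a, region_b):
    # Select whichever bracketed region scores sharper, as in claim 339.
    if laplacian_variance(region_a) >= laplacian_variance(region_b):
        return region_a
    return region_b
```

A production implementation would typically smooth the focus map and feather the region boundaries before blending; the hard selection here is the minimal form of the comparison the claim describes.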
340. The image capturing apparatus of claim 327, wherein circuitry configured for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length comprises:
- circuitry configured for constructing the at least one composite image to provide at least one desired focus.
341. The image capturing apparatus of claim 340, wherein circuitry configured for constructing the at least one composite image to provide at least one desired focus comprises:
- circuitry configured for providing one or more interaction devices for receiving at least one desired focus for constructing at least one composite image with the at least one desired focus.
342. The image capturing apparatus of claim 327, wherein circuitry configured for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length comprises:
- circuitry configured for selecting at least one of the at least one first image or the at least one second image to provide at least one image with at least one desired focus at least partially based on at least one interaction via at least one touch screen of the image capturing apparatus.
343. The image capturing apparatus of claim 327, wherein circuitry configured for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length comprises:
- circuitry configured for constructing at least one composite image including at least one region with at least one desired focus selected at least partially based on at least one interaction via at least one touch screen of the image capturing apparatus.
344. The image capturing apparatus of claim 343, wherein circuitry configured for constructing at least one composite image including at least one region with at least one desired focus selected at least partially based on at least one interaction via at least one touch screen of the image capturing apparatus comprises:
- circuitry configured for constructing at least one composite image including at least one desired in-focus region and at least one desired out-of-focus region, the at least one composite image constructed at least partially based on at least one interaction via at least one touch screen of the image capturing apparatus.
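Claims 342–344 derive the desired in-focus region from a touch-screen interaction. A minimal sketch, assuming a hypothetical tap coordinate and a fixed square region radius (the claims leave the region geometry open), builds the mask of claim 344: the tapped region is filled from the in-focus bracket and the remainder from the defocused bracket:

```python
import numpy as np

def touch_region_mask(shape, tap_yx, radius=2):
    # True marks the desired in-focus region around the tapped pixel
    # (claim 344); False marks the desired out-of-focus remainder.
    ys, xs = np.indices(shape)
    ty, tx = tap_yx
    return (np.abs(ys - ty) <= radius) & (np.abs(xs - tx) <= radius)

def composite_for_tap(in_focus_img, defocused_img, tap_yx, radius=2):
    # Fill the tapped region from the sharp bracket, everything else from
    # the defocused bracket, yielding a selective-focus composite.
    mask = touch_region_mask(in_focus_img.shape, tap_yx, radius)
    return np.where(mask, in_focus_img, defocused_img)
```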
345. A method for an image capturing apparatus capable of providing region-based focus in part using focus bracketing, comprising:
- capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus;
- aligning the at least one first image and the at least one second image at least partially based on correlating one or more common features present at substantially similar pixel locations of the at least one first image and the at least one second image; and
- constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length.
346. An image capturing apparatus capable of providing region-based focus in part using focus bracketing, comprising:
- means for capturing at least one first image at a first focal length and at least one second image at a second focal length at least partially via adjustment of a focal length associated with at least one lens and at least one image sensor of the image capturing apparatus;
- means for aligning the at least one first image and the at least one second image at least partially based on correlating one or more common features present at substantially similar pixel locations of the at least one first image and the at least one second image; and
- means for constructing at least one composite image at least partially via blending at least one region of the at least one first image at the first focal length with at least one region of the at least one second image at the second focal length.
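Claims 345 and 346 recite the full capture/align/blend pipeline. The sketch below, assuming pre-captured grayscale brackets and a pure-translation misalignment (a real registration would match features and fit a homography), aligns the second bracket to the first by correlation at candidate pixel offsets, then composites each pixel from whichever bracket is locally sharper:

```python
import numpy as np

def align_by_translation(ref, img, max_shift=4):
    # Exhaustive integer-shift search, scoring each candidate offset by
    # correlation of features at substantially similar pixel locations
    # (the aligning step of claim 345).
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            score = float((ref * shifted).sum())
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    dy, dx = best_shift
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def blend_by_sharpness(a, b):
    # Per-pixel gradient magnitude as a local focus score; the composite
    # takes each pixel from the sharper bracket (the constructing step).
    def grad_mag(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gy, gx)
    return np.where(grad_mag(a) >= grad_mag(b), a, b)
```

The two-image case shown generalizes directly to longer focus-bracket stacks by folding each additional bracket into the running composite.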
Type: Application
Filed: May 19, 2016
Publication Date: Nov 24, 2016
Inventors: W. Daniel Hillis (Encino, CA), Nathan P. Myhrvold (Medina, WA), Lowell L. Wood (Bellevue, WA)
Application Number: 15/159,517