Camera to capture multiple images at multiple focus positions
An example of a camera includes an optical element moving between first and second focus positions upon application of an electrical signal, a sensor capturing first and second images transmitted by the optical element in the respective first and second focus positions, a first storage location recording first images and a second storage location recording second images, and a screen simultaneously displaying a most recent first image from the first storage location and a most recent second image from the second storage location. Another example of a camera includes the optical element and sensor, as well as a processor to: control timing of application of an electrical signal to the optical element to regulate changing of the optical element between first and second focus positions, synchronize capture by the sensor of first and second images, and control streaming of most recent first and second images for simultaneous viewing.
Consumers appreciate ease of use and value in their electronic devices. They also appreciate new and useful features in their electronic devices. Designers and manufacturers of electronic devices may, therefore, endeavor to create and provide such electronic devices to these consumers.
The following detailed description references the drawings, wherein:
Video cameras require a user of the device to choose a particular view of an area or subject (e.g., a wide-angle view of an area or a narrower view of a single subject). An example is a swim meet, where a user may desire a video of the entire pool area (from a viewpoint within the stands) but, at some points, may want to use the zoom feature of the camera to obtain a close perspective of an individual swimmer. The ability to capture and display both images with a camera, rather than having to choose between one of them, would be a welcome and useful feature.
Commercial video producers (such as TV stations at sporting events or concerts) can use two physically separate cameras controlled by two different operators to achieve this effect. These two feeds are edited either in real-time (with producers and operators) or after the fact. This use of two cameras and subsequent post processing adds expense and complexity to such capture and display of these two different images.
One way in which multiple images may be captured by a single camera for simultaneous display involves the use of two different optical elements that capture and transmit different views of an image (e.g., wide angle and telephoto) to two separate sensors. The use of such two different optical elements and sensors, however, adds to the cost and complexity of such a camera.
Another technique or design may utilize a single optical element with a beam splitter and/or prism element to transmit a first image to a first sensor and a second image to a separate second sensor. However, the use of such beam splitters and/or prism elements along with separate first and second sensors also adds to the cost and complexity associated with the design of such cameras.
An example of a block diagram of a camera 10 that is directed to addressing these challenges is illustrated in
As used herein, the term “processor” is defined as including, but not necessarily limited to, an instruction execution system such as a computer/processor based system, an Application Specific Integrated Circuit (ASIC), a computing device, or a hardware and/or software system that can fetch or obtain the logic from a non-volatile storage medium and execute the instructions contained therein. “Processor” can also include any controller, state-machine, microprocessor, cloud-based utility, service or feature, or any other analogue, digital and/or mechanical implementation thereof.
As used herein, the term “camera” is defined as including, but not necessarily limited to, a device that records images in digital, analogue or other format for either subsequent static and/or continuous display, or real-time streaming to, for example, a screen. The images can be visible or invisible (e.g., infrared or ultraviolet). Examples of a “camera” include, but are not necessarily limited to, video recorders, digital cameras, video cameras, projectors, or film-based cameras. The camera can be a separate stand-alone unit or integrated into another device such as a mobile phone, computer, personal digital assistant (PDA) or tablet.
As used herein, the term “optical element” is defined as including, but not necessarily limited to, one or more lens, mirrors, or a combination thereof. The optical element may be made, for example, from glass, liquid, ceramic, semiconductor material, or a combination thereof.
As used herein, the term “sensor” is defined as including, but not necessarily limited to, a Charge Coupled Device (CCD), Complementary Metal-Oxide Semiconductor (CMOS) device, film, light sensitive material, or similar device that captures images transmitted by an optical element. The images may be of any wavelength including, without limitation, both visible and invisible (e.g., infrared or ultraviolet).
As used herein, the term “screen” is defined as including, but not necessarily limited to, a device for displaying multi-dimensional images. Examples include, but are not necessarily limited to, a monitor, a Liquid Crystal Display (LCD), a television screen, a plasma display, a projection surface, a Light Emitting Diode (LED) display, a holographic display, or a Cathode Ray Tube (CRT) display.
As used herein, the term “storage location” is defined as including, but not necessarily limited to, a non-volatile storage medium. As used herein, the term “oscillator” is defined as including, but not necessarily limited to, a device and/or circuit that produces a repetitive electrical signal. The electrical signal may be an electrical impulse, such as a varying voltage or a varying current. The electrical signal may also have a variety of shapes including, without limitation, sinusoidal, square, or triangular.
Referring again to
Also in this example of camera 10, the electrical signal that is applied to optical element 12, which causes it to move from the first focus position to the second focus position and from the second focus position to the first focus position, is the same. However, in other examples, the electrical signal that is applied to optical element 12 which causes it to move from the first focus position to the second focus position is different than the electrical signal that is applied which causes optical element 12 to move from the second focus position to the first focus position.
As can be seen in
As can also be seen in
In this example, the first and second images are recorded and stored in respective first storage location 20 and second storage location 24 on a first-in, first-out (FIFO) basis and may be retrieved at any time for either real-time or subsequent display on screen 28. They may be recorded and stored in first storage location 20 and/or second storage location 24 differently in other examples. Additionally, one or more of the first and second images may be erased, deleted, or otherwise removed from respective first storage location 20 and second storage location 24 at any time by an end-user of camera 10, either before or after display on screen 28.
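The FIFO recording described above can be sketched in code. The following is an illustrative sketch only, not part of the claimed design; the class and method names (`DualFifoStore`, `record`, `most_recent`) are hypothetical and chosen solely to show one way two bounded FIFO storage locations could hold the two image sequences while keeping the most recent frame of each available for display.

```python
from collections import deque

# Illustrative sketch (names are hypothetical, not from the patent): two FIFO
# buffers, one per focus position, with the most recent frame of each
# retrievable at any time for simultaneous display.
class DualFifoStore:
    def __init__(self, capacity=8):
        # deque with maxlen drops the oldest frame once capacity is reached,
        # giving first-in, first-out behavior for each storage location
        self.first = deque(maxlen=capacity)   # e.g. wide-angle frames
        self.second = deque(maxlen=capacity)  # e.g. zoomed frames

    def record(self, frame, focus_position):
        # route each captured frame to the storage location matching the
        # focus position the optical element held when it was captured
        (self.first if focus_position == 1 else self.second).append(frame)

    def most_recent(self):
        # latest frame from each storage location, or None if empty
        latest1 = self.first[-1] if self.first else None
        latest2 = self.second[-1] if self.second else None
        return latest1, latest2
```

A bounded deque per stream mirrors the FIFO behavior described above: older frames age out while the newest first and second images remain available for simultaneous streaming to the screen.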
In the example diagram of camera 10 illustrated in
This design and configuration of camera 10 saves cost and complexity over using two separate cameras to provide the two different sequences of the first and second images or using two separate optical elements and/or sensors in a single camera to provide the two different sequences of the first and second images. It also saves cost and complexity over other designs by not having to utilize a beam splitter and/or prism to transmit either of the first or second images from optical element 12 to sensor 16.
In other examples of camera 10, the rate of application of the electrical signal to move optical element 12 between the first and second focus positions can be greater than sixty (60) times per second (e.g., one hundred twenty (120) times per second) or less than sixty (60) times per second (e.g., forty eight (48) times per second). Additionally or alternatively, in yet other examples of camera 10, more than one image may be captured by sensor 16 each time optical element 12 is in the first and/or second focus position.
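Because the electrical signal alternates the optical element between the two focus positions with each application, successive captured frames belong to the two image sequences in turn. The following sketch is illustrative only (the function names are hypothetical, not from the patent) and assumes exactly one image is captured per focus position; it shows how an interleaved frame sequence could be separated into the first and second image streams.

```python
# Hypothetical sketch: with the focus alternating on each applied signal,
# even-indexed frames correspond to the first focus position and odd-indexed
# frames to the second (assuming one capture per position).
def focus_position_for_frame(frame_index):
    return 1 if frame_index % 2 == 0 else 2

def demux(frames):
    # separate one interleaved capture sequence into the two image streams
    first, second = [], []
    for i, frame in enumerate(frames):
        (first if focus_position_for_frame(i) == 1 else second).append(frame)
    return first, second
```

At sixty applications of the signal per second, this interleaving yields roughly thirty frames per second for each of the two streams; the higher and lower rates mentioned above scale each stream proportionally.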
As can further be seen in
As generally indicated by dashed arrow 38 in
Another example of a block diagram of a camera 40 is shown in
Also in this example of camera 40, the electrical signal that is applied to optical element 42, which causes it to move from the first focus position to the second focus position and from the second focus position to the first focus position, is the same. However, in other examples, the electrical signal that is applied to optical element 42 which causes it to move from the first focus position to the second focus position is different than the electrical signal that is applied which causes optical element 42 to move from the second focus position to the first focus position.
As can be seen in
As can also be seen in
In the example diagram of camera 40 illustrated in
This design and configuration of camera 40 saves cost and complexity over using two separate cameras to provide the two different sequences of the first and second images or using two separate optical elements and/or sensors in a single camera to provide the two different sequences of the first and second images. It also saves cost and complexity over other designs by not having to utilize a beam splitter and/or prism to transmit either of the first or second images from optical element 42 to sensor 46.
In other examples of camera 40, the rate of application of the electrical signal to move optical element 42 between the first and second focus positions can be greater than sixty (60) times per second (e.g., one hundred twenty (120) times per second) or less than sixty (60) times per second (e.g., forty eight (48) times per second). Additionally or alternatively, in yet other examples of camera 40, more than one image may be captured by sensor 46 each time optical element 42 is in the first and/or second focus position.
As can further be seen in
An example of simultaneous display of a most recent first image 60 and a most recent second image 62 on a screen 64 is shown in
As can be seen in
Another example of simultaneous display of a most recent first image 66 and a most recent second image 68 on a screen 70 is shown in
As can be seen in
An additional example of simultaneous display of a most recent first image 72 and a most recent second image 74 on a screen 76 is shown in
As can be seen in
An example of simultaneous display of a most recent first image 82 and a most recent third image 84 on a screen 86 is shown in
As can be seen in
An example of a non-volatile storage medium 88 is shown in
An example of additional potential elements of non-volatile storage medium 88 is shown in
The most recent first image may occupy a first position on the screen and the most recent second image may occupy a second position on the screen within the first position. In such cases, non-volatile storage medium 88 may include additional instructions that, when executed by the processor, cause the processor to allow an end user to select a location of the second position on the screen within the first position, as generally indicated by block 98 in
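The end-user selection of where the second position sits within the first position can be sketched as a simple clamping computation. This is an illustrative sketch only; the function name `place_inset` and its parameters are hypothetical and merely show one way a processor could constrain a user-selected inset location so the second image remains fully within the first image's position on the screen.

```python
# Hypothetical sketch: clamp a user-selected top-left corner for the inset
# (second image) so it stays fully inside the first image's position, taken
# here to be the full screen.
def place_inset(screen_w, screen_h, inset_w, inset_h, x, y):
    x = max(0, min(x, screen_w - inset_w))
    y = max(0, min(y, screen_h - inset_h))
    return x, y
```

For example, a selection that would push the inset past the right edge is pulled back so the inset's right border coincides with the screen's right border.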
Although several examples have been described and illustrated in detail, it is to be clearly understood that the same are intended by way of illustration and example only. These examples are not intended to be exhaustive or to limit the invention to the precise form or to the exemplary embodiments disclosed. Modifications and variations may well be apparent to those of ordinary skill in the art. For example, the second focus position of the optical element, the second images transmitted thereby to the sensor, and streamed by the processor and/or displayed on the screen may correspond to a digital zoom of the first image transmitted by the optical element in the first focus position. The spirit and scope of the present invention are to be limited only by the terms of the following claims.
Additionally, reference to an element in the singular is not intended to mean one and only one, unless explicitly so stated, but rather means one or more. Moreover, no element or component is intended to be dedicated to the public regardless of whether the element or component is explicitly recited in the following claims.
Claims
1. A camera, comprising:
- an optical element;
- a sensor both to capture first images transmitted by the optical element in a first focus position and to capture second images transmitted by the optical element in a second focus position;
- a first storage location to record first images captured by the sensor;
- a second storage location to record second images captured by the sensor; and
- a processor to: control a timing of an application of an electrical signal to the optical element to alternately change a focus of the optical element between the first focus position and the second focus position; synchronize capture by the sensor of first images transmitted by the optical element in the first focus position; synchronize capture by the sensor of second images transmitted by the optical element in the second focus position; control a streaming to a screen of a most recent first image transmitted by the optical element and captured by the sensor; and control a streaming to the screen of a most recent second image transmitted by the optical element and captured by the sensor so that the most recent first and second images are simultaneously displayed.
2. The camera of claim 1, wherein a location of one of the most recent first image and a location of the most recent second image displayed by the screen is selectable by an end user.
3. The camera of claim 1, wherein the optical element assumes a third focus position upon application of a different electrical signal, the sensor captures third images transmitted by the optical element in the third focus position, the processor controls timing of an application of one of the different electrical signal to the optical element to regulate a rate of change of the optical element between the first focus position and the third focus position rather than the first focus position and the second focus position, the processor synchronizes capture by the sensor of the third images transmitted by the optical element in the third focus position, and the screen simultaneously displays the most recent first image streamed by the processor and the most recent third image streamed by the processor in place of the most recent second image.
4. The camera of claim 1, wherein the most recent first image occupies a first position on the screen and the most recent second image occupies a second position on the screen within the first position.
5. The camera of claim 4, wherein a location of the second position on the screen within the first position is selectable.
6. The camera of claim 1, wherein the first and second images are transmitted by the optical element to the sensor independent of use of a beam splitter and further wherein the first and second images are transmitted by the optical element to the sensor independent of use of a prism.
7. The camera of claim 1, wherein the optical element assumes a third focus position upon application of a different electrical signal.
8. A non-transitory, non-volatile storage medium including instructions that, when executed by a processor, cause the processor to:
- control a timing of an application of an electrical signal to an optical element to alternately change a focus of the optical element between a first focus position and a second focus position;
- control a streaming to a screen of a most recent first image transmitted by the optical element and captured by a sensor; and
- control a streaming to the screen of a most recent second image transmitted by the optical element and captured by the sensor so that the most recent first and second images are simultaneously displayed.
9. The non-transitory, non-volatile storage medium of claim 8, further comprising additional instructions that, when executed by the processor, cause the processor to allow an end user to selectably control a location of one of the most recent first image and a location of the most recent second image displayed by the screen.
10. The non-transitory, non-volatile storage medium of claim 8, wherein the most recent first image occupies a first position on the screen and the most recent second image occupies a second position on the screen within the first position.
11. The non-transitory, non-volatile storage medium of claim 10, further comprising additional instructions that, when executed by the processor, cause the processor to allow an end user to select a location of the second position on the screen within the first position.
6738073 | May 18, 2004 | Park et al. |
7443447 | October 28, 2008 | Shirakawa |
7453517 | November 18, 2008 | Fujimoto et al. |
7477307 | January 13, 2009 | Suzuki et al. |
7684687 | March 23, 2010 | Furuya |
7725016 | May 25, 2010 | Lee et al. |
8072486 | December 6, 2011 | Namba et al. |
8294808 | October 23, 2012 | Caron et al. |
8633998 | January 21, 2014 | Hayashi et al. |
9204788 | December 8, 2015 | Yu |
9485407 | November 1, 2016 | Tsai |
20050206773 | September 22, 2005 | Kim et al. |
20090066833 | March 12, 2009 | Helwegen et al. |
20090251482 | October 8, 2009 | Kondo et al. |
20100302403 | December 2, 2010 | Anderson |
20110050963 | March 3, 2011 | Watabe |
20110234881 | September 29, 2011 | Wakabayashi et al. |
20110242369 | October 6, 2011 | Misawa et al. |
20130162759 | June 27, 2013 | Alakarhu et al. |
2007259155 | October 2007 | JP |
2008298685 | December 2008 | JP |
2010141671 | June 2010 | JP |
WO 2008107713 | September 2008 | WO |
WO-2011050496 | May 2011 | WO |
- Yong, Xiao. Single Lens Multi-Ocular Stereovision Using Prism. University of Singapore. 2005.
- Lee, et al. Dual camera based wide-view imaging system and real-time image registration algorithm. 11th International Conference on Control, Automation and Systems. Oct. 26-29.
- Zhang, et al. Fluidic Zoom-Lens-on-a-Chip with Wide Field-of-View Tuning Range. IEEE Photonics Technology Letters, vol. 16, no. 10, Oct. 2004.
Type: Grant
Filed: Dec 5, 2012
Date of Patent: May 30, 2017
Patent Publication Number: 20140152870
Assignee: Hewlett-Packard Development Company, L.P. (Houston, TX)
Inventors: John L. Ortman (Westerville, OH), Jonathan David Gibson (Austin, TX), Ronald Monson (Locust Grove, GA), Matthew Stuempfle (Raleigh, NC)
Primary Examiner: John Villecco
Application Number: 13/705,563
International Classification: H04N 5/232 (20060101); H04N 5/262 (20060101); H04N 5/225 (20060101);