Event Multiplexer for Managing the Capture of Images
An image capture system for capturing images of an object, such as the Earth. The image capture system includes a moving platform, at least two image capture devices, a position system, an event multiplexer and a computer system. The image capture devices are mounted to the moving platform. Each image capture device has a sensor for capturing an image and an event channel providing an event signal indicating the capturing of an image by the sensor. The position system records data indicative of a position as a function of time related to the moving platform. The event multiplexer has at least two image capture inputs and at least one output port. Each image capture input receives event signals from the event channel of one of the image capture devices. The event multiplexer outputs information indicative of an order of events indicated by the event signals, and identification of image capture devices providing the event signals. The computer system receives and stores the information indicative of the order of events indicated by the event signals, and identification of image capture devices providing the event signals.
The present patent application claims priority to the provisional patent application identified by U.S. Ser. No. 60/901,444, which was filed on Feb. 15, 2007, the entire content of which is hereby incorporated herein by reference.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH AND DEVELOPMENT

Not Applicable.
BACKGROUND OF THE INVENTION

As background, in the remote sensing/aerial imaging industry, imagery is used to capture views of a geographic area, to measure objects and structures within the images, and to determine geographic locations of points within the images. Such images are generally referred to as “geo-referenced images” and come in two basic categories:
Captured Imagery—these images retain the appearance they had when they were captured by the camera or sensor employed.
Projected Imagery—these images have been processed and converted such that they conform to a mathematical projection.
All imagery starts as captured imagery, but as most software cannot geo-reference captured imagery, that imagery is then reprocessed to create the projected imagery. The most common form of projected imagery is the ortho-rectified image. This process aligns the image to an orthogonal or rectilinear grid (composed of rectangles). The input image used to create an ortho-rectified image is a nadir image—that is, an image captured with the camera pointing straight down. It is often quite desirable to combine multiple images into a larger composite image such that the image covers a larger geographic area on the ground. The most common form of this composite image is the “ortho-mosaic image” which is an image created from a series of overlapping or adjacent nadir images that are mathematically combined into a single ortho-rectified image.
When creating an ortho-mosaic, this same ortho-rectification process is used, however, instead of using only a single input nadir image, a collection of overlapping or adjacent nadir images are used and they are combined to form a single composite ortho-rectified image known as an ortho-mosaic. In general, the ortho-mosaic process entails the following steps:
A rectilinear grid is created, which results in an ortho-mosaic image where every grid pixel covers the same amount of area on the ground.
The location of each grid pixel is determined from the mathematical definition of the grid. Generally, this means the grid is given an X and Y starting or origin location and an X and Y size for the grid pixels. Thus, the location of any pixel is simply the origin location plus the number of pixels times the size of each pixel. In mathematical terms: X_pixel = X_origin + X_size × Column_pixel and Y_pixel = Y_origin + Y_size × Row_pixel.
The available nadir images are checked to see if they cover the same point on the ground as the grid pixel being filled. If so, a mathematical formula is used to determine where that point on the ground projects up onto the camera's pixel image map and that resulting pixel value is then transferred to the grid pixel.
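As a rough illustration of the two steps above, the following Python sketch computes the ground location of a grid pixel from the grid origin and pixel size, and then asks a camera model whether that ground point falls within one of the available nadir images. The project_to_camera function and its return convention are hypothetical stand-ins for the actual projection mathematics and are not part of this disclosure.

```python
# Illustrative sketch only: fill one ortho-mosaic grid pixel.
# project_to_camera is a hypothetical camera model returning the (column, row)
# of the ground point in the source image, or None if the image does not cover it.

def grid_pixel_ground_location(x_origin, y_origin, x_size, y_size, column, row):
    """Ground (X, Y) of a grid pixel: origin plus pixel count times pixel size."""
    return (x_origin + x_size * column, y_origin + y_size * row)

def fill_grid_pixel(mosaic, column, row, grid, nadir_images, project_to_camera):
    """Copy the source pixel covering this grid pixel's ground point, if any image covers it."""
    ground_xy = grid_pixel_ground_location(grid["x_origin"], grid["y_origin"],
                                           grid["x_size"], grid["y_size"],
                                           column, row)
    for image in nadir_images:
        hit = project_to_camera(image, ground_xy)
        if hit is not None:
            src_column, src_row = hit
            mosaic[row][column] = image["pixels"][src_row][src_column]
            return
```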
Because the rectilinear grids used for the ortho-mosaic are generally the same grids used for creating maps, the ortho-mosaic images bear a striking similarity to maps and as such, are generally very easy to use from a direction and orientation standpoint.
In producing the geo-referenced aerial images, hardware and software systems designed for georeferencing airborne sensor data exist and are identified herein as a “POS”, i.e., a position and orientation system. For example, a system produced by Applanix Corporation of Richmond Hill, Ontario, Canada and sold under the trademark “POS AV” provides a hardware and software system for directly georeferencing sensor data. Direct Georeferencing is the direct measurement of sensor position and orientation (also known as the exterior orientation parameters), without the need for additional ground information over the project area. These parameters allow data from the airborne sensor to be georeferenced to the Earth or local mapping frame. Examples of airborne sensors include: aerial cameras (digital or film-based), multi-spectral or hyper-spectral scanners, SAR, or LIDAR.
The POS system, such as the POS AV system, was mounted on a moving platform, such as an airplane, such that the airborne sensor was pointed toward the Earth. The positioning system received position signals from a satellite constellation and also received time signals from an accurate clock. The sensor was controlled by a computer running flight management software to take images. Signals indicative of the taking of an image were sent from the sensor to the positioning system to record the time and position where the image was taken.
However, the prior POS systems included only one or two ports for recording the time of sensor capture and the position of one or two independent sensors. The industry standard method for working within this limitation is to slave one or more cameras to another camera and then to actuate all of the cameras simultaneously. This, however, does not permit independent actuation of each of the cameras.
If more than two independent sensors are to be used with the prior positioning system, then a separate system for recording the time and position of sensor capture must be developed. It is to such a system for recording the time of sensor capture that the present invention is directed.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction, experiments, exemplary data, and/or the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for purposes of description and should not be regarded as limiting.
Referring to the drawings, and in particular to
The images can be oblique images, orthogonal images, or nadir images, or combinations thereof.
As shown in
In certain embodiments depicted in
The image capture devices 14 are mounted to the moving platform 21, and once mounted are typically calibrated so that the exact position and orientation of the image capture devices 14 are known with respect to at least a portion of the moving platform 21. For example, as shown in
Each of the image capture devices 14 has a sensor (not shown) for capturing sensor data, such as an image. Each of the image capture devices 14 is also provided with an event channel 26 providing an event signal indicating the capturing of an image by the sensor. The event channel 26 can be any device that provides a signal coincident with the capturing of the image, such as a flash output. The sensor can capture the image in an analog manner, digital manner, or on film. Further, it should be understood that the image can be stored electronically, optically, or provided on a film-based medium.
The event multiplexer system 18 has at least one image capture input 28 and at least one output port 30. In a preferred embodiment the event multiplexer system 18 has at least two image capture inputs 28. Each image capture input 28 receives signals from the event channel 26 of one of the image capture devices 14. The event multiplexer system 18 outputs event signals indicative of an order of events indicated by the signals provided by the image capture devices 14, and an identification (CID) of image capture devices 14 providing the event signals.
The monitoring system 16 records data indicative of the capturing of the images. For example, the monitoring system 16 can record position data as a function of time, time data and/or orientation data. In the embodiments depicted in
In the embodiments depicted in
The computer system 20 receives and stores (preferably in the database 38) the information indicative of the order of events indicated by the event signals, and identification of image capture devices 14 providing the event signals. The computer system 20 optionally also receives and stores the images (preferably in the database 38) generated by the image capture devices 14. The monitoring system 16 records the data indicative of the capturing of images by storing it internally, outputting it to the computer system 20, or outputting such data in any other suitable manner, such as storing such data on an external magnetic or optical storage system. The position related to the moving platform 21 can be provided in any suitable coordinate system, such as an X, Y, Z coordinate system, or a WGS1984 latitude/longitude coordinate system.
Further, the image capture system 10 can be provided with an orientation system, such as an inertial measurement unit 40, for capturing other types of information with respect to the moving platform 21, such as the orientation of the moving platform 21. The inertial measurement unit 40 can be provided with a variety of sensors, such as accelerometers (not shown), for determining the roll, pitch and yaw related to the moving platform 21. Further, it should be understood that the position and/or orientation information does not necessarily have to be a position and/or orientation of the moving platform 21 itself. The position and orientation information is simply related to the moving platform 21, i.e., it need only be possible to determine the position and/or orientation of the moving platform 21 from the information recorded by the monitoring system 16. For example, the position and orientation information can be provided for a device connected to the moving platform 21. Then, the position and orientation for each image capture device can be determined based upon its known location relative to the moving platform 21.
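As an illustration of the last point, the following Python sketch recovers an image capture device's position from a recorded platform position and orientation by rotating a known mounting offset (lever arm) into the local frame and adding it to the platform position. The rotation convention (yaw-pitch-roll) and the function names are assumptions made for this sketch only, not part of the disclosure.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Body-to-local rotation from roll, pitch and yaw in radians (ZYX convention assumed)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,               cp * cr],
    ]

def camera_position(platform_xyz, roll, pitch, yaw, lever_arm_xyz):
    """Platform position plus the known mounting offset rotated into the local frame."""
    r = rotation_matrix(roll, pitch, yaw)
    offset = [sum(r[i][j] * lever_arm_xyz[j] for j in range(3)) for i in range(3)]
    return tuple(p + o for p, o in zip(platform_xyz, offset))
```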
Shown in
The memory device 54 can be formed by a first-in-first-out register or any other type of device capable of optionally recording the identity of the image capture device 14 providing the signal, and a sequential order of events based upon the order in which the signals are received.
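A minimal software model of such a memory device, assuming a simple Python representation: each arriving event is tagged with the identity of the image capture device and a monotonically increasing sequence number, and entries are read out in the order in which they arrived. The class and method names are illustrative only and are not part of the disclosure.

```python
from collections import deque

class EventFifo:
    """Illustrative first-in-first-out store of (sequence, camera_id) event records."""

    def __init__(self):
        self._queue = deque()
        self._sequence = 0

    def record_event(self, camera_id):
        """Called when an event signal arrives on one of the image capture inputs."""
        self._sequence += 1
        self._queue.append((self._sequence, camera_id))

    def read_all(self):
        """Drain the register in arrival order, e.g. when the computer system polls it."""
        events = list(self._queue)
        self._queue.clear()
        return events
```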
The computer interface 48 is connected to the memory device 54 to permit the computer system 20 to read the data in the memory device 54. The computer interface 48 can be a device capable of reading the information from the memory device 54 and providing same to the computer system 20. For example, the computer interface 48 can be a serial port, parallel port or a USB port.
For example,
Shown in
Each of the event multiplexers 46a and 46b provides event signals indicative of an order of events based upon the respective signals received from the image capture devices C1-C5. The event signals provided by the event multiplexers 46a and 46b can be provided to different ports of the monitoring system 16, or to different monitoring systems, for causing the monitoring system(s) 16 to read and/or record the data indicative of the position as a function of time related to the moving platform 21.
The computer interface 48a communicates with the event multiplexers 46a and 46b to permit the computer system 20 to read the data from the memory devices and store it on a computer readable medium accessible by the computer system 20. The computer interface 48a can be a device capable of reading the information from the memory device(s) and providing same to the computer system 20. For example, the computer interface 48a can be a serial port, parallel port or a USB port.
Shown in
The computer interface 48a and the event multiplexers 46a and 46b can be constructed in a similar manner as the computer interface 48 and event multiplexer 46 described above. Each of the event multiplexers 46a and 46b provides event signals indicative of an order of events based upon the respective signals received from the image capture devices C1-C5 via the multiplexer director 70. The event signals provided by the event multiplexers 46a and 46b can be provided to different ports of the monitoring system 16, or to different monitoring systems 16, for causing the monitoring system(s) 16 to read and/or record the data indicative of the position as a function of time related to the moving platform 21.
The computer interface 48a communicates with the event multiplexers 46a and 46b to permit the computer system 20 to read the data indicative of the order of events E(t) from the respective memory devices and store such data E(t) on a computer readable medium accessible by the computer system 20.
Shown in
Thus, the event multiplexer systems 18, 18a, and 18b record the order of events and optionally record camera identification information. The event multiplexer systems 18, 18a and 18b utilize an external timing device (i.e., the monitoring system 16 or the computer system 20) to correlate each event with time and thus location. The event multiplexer system 18c receives and records absolute time in addition to the order of events and the identification of image capture devices 14. Thus, “order of events” could include relative or absolute time.
The image capture system 10a is provided with one or more image capture devices 14a, one or more monitoring systems 16a, one or more event multiplexer systems 18c, one or more computer systems 20a, and one or more inertial measurement units 40a.
The image capture devices 14a, the monitoring system 16a and the inertial measurement unit 40a are similar in construction and function to the image capture devices 14, monitoring system 16 and inertial measurement unit 40 described above.
The event multiplexer system 18c is similar in construction and function to the event multiplexer systems 18, 18a and 18b described above, with the exception that the event multiplexer system 18c receives absolute time signals, such as a pulse per second (PPS) signal and a time sync signal from the monitoring system 16a (or a separate time receiver), and records the absolute time of each instance in which a signal is received from one of the image capture devices 14a. The time signal is preferably an accurate signal received from a satellite constellation or an atomic clock. In addition, the absolute time of each event (T, E(t)), and an identification of the image capture device 14a for each event (CID), are passed to the computer system 20a.
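One way to picture the absolute-time recording described above, offered only as a hedged sketch: the time sync message supplies the absolute time of the most recent PPS edge, and an event's absolute time is that reference plus the fraction of a second elapsed since the edge, measured here with a hypothetical free-running counter. The counter interface and field names are assumptions for illustration, not the actual hardware design.

```python
class AbsoluteTimeStamper:
    """Illustrative model: absolute event time = last time-sync reference + offset past the PPS edge."""

    def __init__(self, counter_hz):
        self.counter_hz = counter_hz  # ticks per second of a hypothetical free-running counter
        self.pps_tick = None          # counter value latched at the last PPS edge
        self.pps_time = None          # absolute time (seconds) reported by the time sync message

    def on_pps(self, counter_tick, sync_time_seconds):
        """Latch the counter at the PPS edge and pair it with the time sync value."""
        self.pps_tick = counter_tick
        self.pps_time = sync_time_seconds

    def timestamp_event(self, counter_tick, camera_id):
        """Return (absolute_time, camera_id) for an event signal, or None before the first PPS."""
        if self.pps_tick is None:
            return None
        offset = (counter_tick - self.pps_tick) / self.counter_hz
        return (self.pps_time + offset, camera_id)
```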
The monitoring system 16a passes signals to the computer system 20a indicative of the position of the moving platform 21 as a function of time, and the image capture devices 14a pass the captured images to the computer system 20a. Thus, the computer system 20a is provided with and stores the information for correlating the data indicative of position as a function of time with particular captured images.
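A minimal sketch of that correlation, assuming the position data is a time-sorted list of (time, position) samples: the position at each recorded event time can be estimated by linear interpolation between the two bracketing samples and then associated with the image identified by the event's camera ID. The linear interpolation and the data layout are assumptions made for illustration only.

```python
import bisect

def position_at(position_log, t):
    """Linearly interpolate a position from a time-sorted list of (time, (x, y, z)) samples."""
    times = [sample[0] for sample in position_log]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return position_log[0][1]
    if i == len(position_log):
        return position_log[-1][1]
    (t0, p0), (t1, p1) = position_log[i - 1], position_log[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

def correlate(events, position_log):
    """Associate each (event_time, camera_id) record with an interpolated platform position."""
    return [(camera_id, event_time, position_at(position_log, event_time))
            for event_time, camera_id in events]
```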
The event multiplexer system 18c can also optionally provide event signals e(t) to the monitoring system 16a in a similar manner as the event multiplexer systems 18, 18a and 18b described above, for purposes of redundancy, and/or verification of the data. The event signals e(t) can also optionally be provided to the computer system 20a from the monitoring system 16a and in this instance the computer system 20a optionally compares e(t) with E(t) for calibration, diagnostic or feedback purposes to make sure that the time sync signal is correct.
In using the systems depicted in
In using the system depicted in
It will be understood from the foregoing description that various modifications and changes may be made in the preferred and alternative embodiments of the present invention without departing from its true spirit.
This description is intended for purposes of illustration only and should not be construed in a limiting sense. The scope of this invention should be determined only by the language of the claims that follow. The term “comprising” within the claims is intended to mean “including at least” such that the recited listing of elements in a claim is an open group. “A,” “an” and other singular terms are intended to include the plural forms thereof unless specifically excluded.
Claims
1. An image capture system for capturing images of an object, the image capture system comprising:
- a moving platform;
- at least two image capture devices mounted to the moving platform, each image capture device having a sensor for capturing an image and an event channel providing an event signal indicating the capturing of an image by the sensor;
- a monitoring system recording data indicative of a position and orientation as a function of time related to the moving platform;
- an event multiplexer having at least two image capture inputs and at least one output port, each image capture input receiving event signals from the event channel of one of the image capture devices, the event multiplexer outputting information correlating the data indicative of position and orientation as a function of time with particular captured images; and
- a computer system receiving and storing the information.
2. The image capture system of claim 1, wherein the information stored by the computer system includes a sequential order of events based upon the order in which the event signals are received by the event multiplexer.
3. The image capture system of claim 2, wherein the event multiplexer comprises a first-in-first-out register storing the event signals.
4. The image capture system of claim 1, wherein the event multiplexer comprises a quasi-Xor gate having inputs forming the image capture inputs, and an output forming one of the at least one output port.
5. The image capture system of claim 1, wherein the monitoring system includes an input port receiving event signals from the output port of the event multiplexer, and wherein the monitoring system records data indicative of the time at which the event signal was received.
6. The image capture system of claim 5, wherein the computer system stores the data indicative of the time at which the event signal was received by the monitoring system and a camera ID identifying the image capture device from which the event signal was received.
7. The image capture system of claim 1, wherein the information output by the event multiplexer includes absolute time data.
8. The image capture system of claim 1, wherein the information output by the event multiplexer includes relative time data.
9. The image capture system of claim 2, wherein the computer system correlates the information indicative of the order of events indicated by the event signals, and identification of image capture devices providing the event signals to identify particular captured images.
10. The image capture system of claim 1, wherein the at least two image capture devices include at least three image capture devices mounted to a common substrate.
11. The image capture system of claim 1, wherein the at least two image capture devices include at least five image capture devices mounted to point fore, aft, port, starboard and straight down.
12. An image capture system for capturing images of an object, the image capture system comprising:
- at least two image capture devices, each image capture device having a sensor for capturing an image and an output port providing an event signal indicating the capturing of an image by the sensor;
- a monitoring system recording data based upon the capturing of images;
- an event multiplexer having at least two image capture inputs and at least one output port, each image capture input receiving event signals from the output port of one of the image capture devices, the event multiplexer outputting information correlating the data recorded by the monitoring system with particular captured images; and
- a data storage unit storing the information.
13. The image capture system of claim 12, wherein the data storage unit is integrated within one of the image capture devices.
14. The image capture system of claim 12, wherein the at least two image capture devices include at least three image capture devices mounted to a common substrate.
15. The image capture system of claim 12, wherein the at least two image capture devices include at least five image capture devices mounted to point fore, aft, port, starboard and straight down.
Type: Application
Filed: Feb 14, 2008
Publication Date: Aug 28, 2008
Patent Grant number: 8520079
Inventors: Stephen Schultz (West Henrietta, NY), Frank Giuffrida (Honeoye Falls, NY), Matt Kusak (Rochester, NY)
Application Number: 12/031,576
International Classification: H04N 5/228 (20060101);