Spatial referenced photographic system with navigation arrangement
An imaging system which captures, along with the images, information defining the position and orientation of the camera and the distance to the subject. A video camera is attached to three accelerometers, two gyroscopes, and a rangefinder. Data gathered from these devices, defining the pitch, yaw, and roll of the camera, the camera's acceleration, and the distance to the subject, is captured and recorded along with the video images. The video images are later stored within a computer database along with data defining the position and orientation of the camera and the distance to the subject for each image, this latter data being computed from the captured data. The images may then be presented to the user in a three-dimensional display in which the user can navigate through the images using a joystick device, with the images located in positions corresponding to the positions in space of the objects that were imaged.
This application is a continuation reissue application of reissue application Ser. No. 12/126,664, filed May 23, 2008, to be issued as U.S. Pat. No. Re. 42,289, which is a reissue application of application Ser. No. 09/723,767, filed Nov. 28, 2000, issued on May 23, 2006 as U.S. Pat. No. 7,050,102, which is a continuation application of application Ser. No. 08/894,206, filed Jul. 30, 1997, now U.S. Pat. No. 6,195,122, which is a 35 U.S.C. § 371 national stage U.S. application corresponding to Patent Cooperation Treaty application PCT/US96/01434, filed on Jan. 31, 1996, which is a continuation-in-part of U.S. patent application Ser. No. 08/383,471, filed on Jan. 31, 1995, which is hereby incorporated by reference. In particular, the '471 application contains a more detailed description of the tracking data acquisition unit control circuit 470 (in '471 application FIGS. 16 to 23 and the accompanying text) and exemplary program listings (presented in the Appendices of the '471 application) which may be of interest to those seeking a more detailed understanding of the present invention.
FIELD OF THE INVENTION
This invention relates to referencing, sorting, and displaying images in a three-dimensional system. More particularly, it relates to a system having an image capturing device that captures images of objects together with spatial reference information defining the absolute position of the image capturing device and the position of the object relative to that device. It also relates to image retrieval and display, where the spatial reference information associated with each image facilitates browsing through the images in an organized manner.
BACKGROUND OF THE INVENTION
The editing of films and video images, i.e., to rearrange action sequences, is well known. However, the movie and video cameras used to capture the images that are later edited do not store with those images any machine-understandable record of image and camera position. Accordingly, the edited films and videos permit one to view the images in only one predetermined order, determined by the editor. If some other ordering of the image presentation is desired, it must be achieved through a difficult manual editing process.
A computerized, interactive editing process is described in a doctoral thesis “Cognitive Space in the Interactive Movie Map: An Investigation of Spatial Learning in Virtual Environments”, by Robert Mohl, 1981, submitted at MIT. In a demonstration carried out using images recorded at Aspen, Colo., the viewer is permitted to select film clips taken by a camera that is arranged to simulate driving down a street. At each intersection, the viewer chooses to turn left, turn right, or to proceed straight ahead. The viewer thereby simulates driving around streets in Aspen, Colo.
In other fields, it is known to gather, along with images, information concerning the position of the camera. Governmental and private agencies use satellites and airplanes to record images of positionally referenced data, such as land features or clouds. Each image frame contains positional references to the image tilt or plane of the camera. Present methods commonly either constrain the orientation of the camera to a fixed position, i.e. up and down, or use features captured in the image frames to derive relative positions and orientations of successive images when combining the image frames to form a map or the like.
Devices are known which combine images by matching features common to each of two or more images, i.e. superimposing.
One aspect of the present invention is recording positional data along with images. A number of methods are known whereby one may locate an object and describe the position of an object relative to a positional reference. For example, a magnetic device is known which can determine its position and orientation within a known magnetic field. Satellite systems and radio signal triangulation can also be used to determine position precisely. Inertial position determination systems are also known and are widely used in inertial navigational systems.
An object of this invention is to provide an image data gathering device which encodes positional and/or spatial information by capturing both camera position and camera orientation information along with image data. This information permits images to be joined or sequenced for viewing without the distortions that can result from attempting to match the edges of adjoining images together.
A further object of this invention is to provide three-dimensional image reconstruction of objects using frames shot from different viewpoints and perspectives through the provision of a triangulation reference.
Still another object of this invention is to provide a camera path map which allows images to be selected from the map based upon the position and orientation of the camera. For example, an operator can learn the location of an object in a film clip, such as an escalator. Images of the escalator may then be quickly and automatically located by selecting other frames which point to that same escalator from different camera positions.
Another object of the invention is to provide a compact and practical image and positional data recording system which uses commonly available equipment. A system having accelerometers mounted directly upon the recorder, eliminating the need for a restrained or gimballed platform, permits greater freedom of motion for the recording device as well as reduced cost and complexity.
Briefly described, the invention resides in a video camera that is integrated with a tracking data acquisition unit containing accelerometers and gimbal-mounted gyroscopes, and optionally a rangefinder. As the operator of the video camera moves about taking a motion picture of the environment, a microprocessor and logic associated with the accelerometers and gyroscopes senses all rotational motions of the camera by means of sensors associated with the gimbals and senses all translational motions of the camera by means of sensors associated with the accelerometers. And the rangefinder provides information to the microprocessor and logic concerning the distance from the camera to the subject being photographed.
From data presented by these sensors, the microprocessor and logic compute and generate a modulated audio signal that is encoded with a continuous record of acceleration in the X, Y and Z directions as well as with a continuous record of the pitch, roll, and yaw of the camera and of the distance to the subject. This audio tracking information data signal is recorded on the audio track of the same video tape upon which the video images are being recorded by the camera. In this manner, the video tape recording captures, along with the sequence of images, the tracking data from which the precise position of the camera, its precise orientation, and the position of the subject may later be computed.
Later on, the recorded audio tracking information data and video data is played back into a computer. Images are selected from the sequence of images and are retained, in compressed form, in a database. Each image is then linked to computed positional information that defines, for each image, the location and orientation of the camera and, optionally, the distance to the subject and the subject location. This positional information is derived through computation from the tracking information retrieved from the video tape audio track, as will be explained below.
Next, special computer programs can aid an individual using the computer in navigating through the images, using the positional information to organize the images in ways that make it easy for the user to browse through the images presented on the graphics screen. Several such programs are described below, and a complete description is presented of a movie mapper program which presents the user with a plan view and elevational views of the camera path plotted as a graph alongside views of selected images, with the path marked to show the position and orientation of the camera. The user, by clicking at any point on this path with a computer mouse, may instantly retrieve and view an image captured at the chosen point. Additionally, by clicking upon diamonds and arrows and the like displayed as overlays superimposed upon an image, the user may command the program to search for and find the nearest image which gives a view rotated slightly to the left or right or which maintains the same view but advances forward in the direction of the view or backward. One may also jump forward and turn simultaneously. A wider field of view may be assembled by assembling automatically chosen images and aligning them into a panorama. The user is thus enabled to navigate through the images in the manner of navigating a boat to the extent permitted by the nature and variety of the images in the data base.
Further objects and advantages are apparent in the drawings and in the detailed description which follows.
Referring to the drawings and especially to
With reference to
Once safely stored within the personal computer 185, the tracking database 324 is reprocessed into a positional database 322 by a tracking database to positional database conversion program 310. Now the image retrieval programs 325, 330, and 335 may be called upon to search through the positional database 322, to retrieve images from the video database 323 based upon camera location, camera orientation, and even object location, and to display the images upon the face of the computer 185.
Tracing the data flow through the system components at a more detailed level will explain the functionality of the preferred embodiment of the spatially referenced camera.
First, the video camera 120 (
Second, the completed recording, stored on the video cassette, is played back to the personal computer 185 on any standard video cassette player or directly from the camera 120 (
Third, the set of three display programs 325, 330, and 335 allow a user to view and select video frames based on the relative position or absolute orientation of the camera.
Tracing the data flow through the individual parts of the spatially referenced video camera 100 at the most detailed level discloses how to build the preferred embodiment of the invention.
Referring to
Two gyroscopes, such as GYRATION Model GE9100A, a directional gyroscope 400 and a vertical gyroscope 410, are also mounted on the stable inertial platform 415, oriented orthogonally relative to each other. The directional gyroscope 400, aligned along the X or Y axis, measures yaw (rotation about the Z axis), while the vertical gyroscope 410, aligned along the Z axis, measures both roll (rotation about the Y axis) and pitch (rotation about the X axis). (The Y axis is assumed to point in the direction of the camera.) The gyroscopes are dual gimballed electronic components that generate a pair of square wave signals which are locked out of phase with respect to each other. The sequence of the rising and falling edges of the square waves relative to each other indicates the angular rotation about the gyroscope's measurement axis experienced by the spatially referenced camera 100. Quadrature decoders 450, 455, 460, such as HEWLETT PACKARD Model HCTL2020, receive the three paired square wave signal outputs of the gyroscopes 400, 410; and for each signal pair, they count the relative sequence of rising and falling edges between the two square wave signals to generate a 16-bit numerical representation of the rotation experienced about each axis. An interrupt is generated following any change in these signals. This interrupt causes a central processing unit 480 (
The inertial platform 415 (
In addition to the accelerometers and the gyroscopes there is a control circuit 470 (
The spatially referenced camera 100 can be configured and used with a laser rangefinder 485 (
The rangefinder 485 (
Referring to
The format of a complete data frame 535 (without range information) is composed of a frame identification pattern 530, which is formed from three repetitions of the 16-bit value 017F hex (at 525). This is followed by data: a 24-bit acceleration value in the X direction 500A, a 16-bit pitch value 505A, a 24-bit acceleration value in the Y direction 500B, a 16-bit roll value 505B, a 24-bit acceleration value in the Z direction 500C, and a 16-bit yaw value 505C. The tracking data frame format with range information included 540 starts with a frame identification pattern 550 composed of three repetitions of the value 037F hex (at 545), followed by a 24-bit acceleration value in the X direction 500A, an 8-bit delta (or incremental change) pitch value 505A, a 24-bit acceleration value in the Y direction 500B, an 8-bit delta roll value 505B, a 24-bit acceleration value in the Z direction 500C, an 8-bit delta yaw value 505C, and finally the range data 555 and a gray scale reflectivity value 560. The records 540 containing range information are generated whenever an interrupt from the decoders 450, 455, and 460 indicates that the camera orientation has changed.
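By way of illustration, the two frame layouts just described might be written down in C as follows. This is a sketch only: the field names, the representation of the 24-bit acceleration values as three-byte arrays, and the width of the range field are assumptions, and structure padding and byte order are ignored.

    /* Illustrative sketch of the two tracking data frame layouts (535 and 540).
       Field names and widths not stated in the text are assumptions. */
    #include <stdint.h>

    typedef struct {
        uint16_t id[3];        /* frame identification pattern 530: 0x017F repeated three times */
        uint8_t  accel_x[3];   /* 24-bit acceleration, X direction (500A) */
        int16_t  pitch;        /* 16-bit pitch value (505A) */
        uint8_t  accel_y[3];   /* 24-bit acceleration, Y direction (500B) */
        int16_t  roll;         /* 16-bit roll value (505B) */
        uint8_t  accel_z[3];   /* 24-bit acceleration, Z direction (500C) */
        int16_t  yaw;          /* 16-bit yaw value (505C) */
    } TrackingFrame;           /* format 535, without range information */

    typedef struct {
        uint16_t id[3];        /* frame identification pattern 550: 0x037F repeated three times */
        uint8_t  accel_x[3];   /* 24-bit acceleration, X direction */
        int8_t   delta_pitch;  /* 8-bit incremental pitch */
        uint8_t  accel_y[3];   /* 24-bit acceleration, Y direction */
        int8_t   delta_roll;   /* 8-bit incremental roll */
        uint8_t  accel_z[3];   /* 24-bit acceleration, Z direction */
        int8_t   delta_yaw;    /* 8-bit incremental yaw */
        uint16_t range;        /* range data 555 (width assumed) */
        uint8_t  reflectivity; /* gray scale reflectivity value 560 */
    } TrackingFrameWithRange;  /* format 540, with range information */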
Once the video tape is filled with modulated tracking and video data, it is played back. The video output is directly connected to a conventional video digitizer input 180, such as the INTEL SMART VIDEO RECORDER, that is inserted into the ISA, EISA, VESA, PCI, or other accessory port or slot of the personal computer 185. As the video cassette is played back on the video cassette player 130, the video digitizer input 180 captures the video frames of the recorded images and passes digitized video frame data on to the tracking and video data entry and storage program 305 shown in
Simultaneously, the video signal 150 from the video output 140 of the video cassette recorder 130 is captured (step 680), and video frames are selected (step 685). Frame numbers are assigned to the selected video frames (step 690), and at step 665 these frame numbers are concatenated to the lines of tracking data to form tracking data lines. Finally, at step 675 a database of numbered tracking data lines is created and is stored on the disk in a database file called the tracking database 324.
At step 695, the video frame is fed into a video compression program, and the outputted compressed video frame is concatenated or otherwise linked to the same video frame number at 695. Finally, at step 700, a database of numbered and compressed video frames is created and is stored on the disk in a file that is called the video database 323.
The tracking and video data entry and storage program 305, residing on the personal computer 185, essentially builds two related databases. The first is a tracking database 324 composed of enumerated records containing the orientation, the translational acceleration, and optionally the range data originally generated by the tracking data acquisition unit 105. The second is a video database 323 composed of enumerated records containing digitized and compressed images of video frames captured from the video tape originally generated by the video camera 120.
Once all of the recorded tracking and video data are stored, the personal computer 185 converts the tracking database 324 into a positional database 322 via a software module called the tracking database to positional database conversion program 310 (
In the preferred embodiment, an existing image capture computer program is adapted for use to capture, compress, and store selected images in the video database 323, as indicated in steps 680, 685, 690, 695, and 700 of
To implement the step 655, the computer 185 is equipped with a conventional, serial port interrupt driven program that is called upon automatically, whenever the serial input port 176 receives a serial byte of the tracking data signal 170, to retrieve the byte from the serial port UART and to store the byte in some form of circular buffer in RAM from which the bytes may be readily retrieved.
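A minimal sketch of such an interrupt-driven receive routine is given below; it assumes a power-of-two ring buffer and a hypothetical read_uart_data_register() helper standing in for whatever UART access the serial driver actually provides.

    /* Sketch of the interrupt-driven serial receive described above.
       read_uart_data_register() is a hypothetical stand-in for the actual
       UART access; the buffer size and names are illustrative. */
    #define RX_BUF_SIZE 256                    /* power of two */
    static volatile unsigned char rx_buf[RX_BUF_SIZE];
    static volatile unsigned int  rx_head, rx_tail;

    extern unsigned char read_uart_data_register(void);   /* assumed helper */

    void serial_rx_interrupt(void)
    {
        unsigned char byte = read_uart_data_register();
        unsigned int next = (rx_head + 1) & (RX_BUF_SIZE - 1);
        if (next != rx_tail) {                 /* drop the byte if the buffer is full */
            rx_buf[rx_head] = byte;
            rx_head = next;
        }
    }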
Each time the step 660 is performed (every 1/10th or 1/20th of a second or so), all the bytes currently in this circular buffer are retrieved, combined with historical data, and separated from any partial data frame that is retained as historical data. In this manner, several data frames in the format illustrated at 535 or 540 in
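The carry-over behavior of step 660 may be sketched as follows; the drain_rx_buffer() helper (for instance, one that empties the ring buffer sketched above), the process_frame() handler, and the fixed frame length are assumptions made for illustration, since the actual records 535 and 540 differ in length.

    /* Illustrative sketch of step 660: combine newly received bytes with the
       historical remainder, peel off complete frames, and keep the partial
       tail for the next pass. */
    #include <string.h>

    #define FRAME_LEN 21                        /* illustrative fixed frame length */
    static unsigned char hist[FRAME_LEN];
    static size_t hist_len;

    extern size_t drain_rx_buffer(unsigned char *dst, size_t max);  /* assumed helper */
    extern void   process_frame(const unsigned char *frame);        /* assumed handler */

    void poll_tracking_data(void)
    {
        unsigned char work[512];
        size_t n, off = 0;

        memcpy(work, hist, hist_len);           /* historical (partial) data first */
        n = hist_len + drain_rx_buffer(work + hist_len, sizeof(work) - hist_len);

        while (n - off >= FRAME_LEN) {          /* hand off complete frames */
            process_frame(work + off);
            off += FRAME_LEN;
        }
        hist_len = n - off;                     /* retain any partial frame */
        memcpy(hist, work + off, hist_len);
    }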
The details of the tracking database to positional database conversion program 310 are shown in
The program 310 begins at 1002 by initializing the variables. Then at 1004, it opens the input tracking data file 324 (
The data processing steps 1006 are described in
The program steps 1006 control the reading of the input file and enforce the formatting of the output file. In particular, these steps buffer the position records in such a manner that each record processed is complete even though the input records may have been broken up, as described above. In this manner, the program generates one and only one output position record for each frame.
The program begins at 1110 by reading in a data frame. At 1114, if there are no more frames, then at 1130 the files are closed and we are done. If a frame is found, a data record is retrieved from the frame at 1112. At 1118, if a record is found, it is processed at 1120 and written out to the positional database 322 at step 1122. Whether or not a record is found, program control continues at step 1124 where any unprocessed residual data is prepared (or added to) the next frame. Program control then returns to 1110 where the next frame is read.
The data record processing routine 1120 is described in
Next, the acceleration values are converted into a floating point form at step 1212. Gain and offset errors can be corrected in this step. This routine also computes the magnitude of the acceleration as the square root of the sum of the squares of the three components.
Step 1214 calculates the current position from the acceleration data. With reference to
In the following short program, the values of acceleration are “ac->p.f” for the Y value of acceleration and “ac->p.l” for the X value of acceleration (the Z value is not needed). This program computes P0 and R0, and it acquires the initial yaw value as W0.
Acquisition of the R0 and P0 values allows the definition of a reference frame for integration in which the transformed X and Y acceleration components have no contribution from the gravitational force. X and Y in the reference frame are perpendicular to the direction of the gravity vector, while Z in the reference frame is parallel to the gravity vector.
GetRollPitchZero also averages the P0 and R0 readings on all subsequent calls after the first call to achieve better and better estimates for these values.
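The listing itself appears in the appendices; the following sketch merely illustrates the computation described, with the asin() formulas, the running average, and the record layouts assumed for illustration.

    /* Illustrative sketch of GetRollPitchZero.  The field names follow the text
       (ac->p.f is the forward/Y acceleration, ac->p.l the lateral/X one); the
       formulas and the running average are assumptions. */
    #include <math.h>

    typedef struct { double l, f, u; } VectorRecord;   /* lateral, forward, up; assumed */
    typedef struct { VectorRecord p; } AccelRecord;    /* assumed wrapper matching "ac->p.f" */

    static double P0, R0, W0;       /* pitch-zero, roll-zero, initial yaw */
    static long   zero_samples;

    void GetRollPitchZero(const AccelRecord *ac, double yaw, double g)
    {
        double pitch = asin(ac->p.f / g);   /* tilt of the forward (Y) axis from horizontal */
        double roll  = asin(ac->p.l / g);   /* tilt of the lateral (X) axis from horizontal */

        if (zero_samples == 0) {
            P0 = pitch;  R0 = roll;  W0 = yaw;    /* first call: take the readings directly */
        } else {
            /* later calls: average in each new reading for a better estimate */
            P0 = (P0 * zero_samples + pitch) / (zero_samples + 1);
            R0 = (R0 * zero_samples + roll)  / (zero_samples + 1);
        }
        zero_samples++;
    }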
At step 1320, since the only acceleration is that of gravity, we assume that the camera is motionless, and we arbitrarily set the velocity values in all three directions to zero. This assumption holds because the camera is hand-held and is never steady while the human carrier is in motion. If the camera were mounted upon some conveyance that can move very smoothly at a uniform velocity, then this assumption would not hold, and some additional data indicating the velocity of the camera would have to be recorded. In an airplane or automobile, for example, the speedometer reading or ground speed reading could be recorded to ensure that this algorithm functions properly.
Next, at step 1318, the current pitch, yaw, and roll are transformed into coordinates that indicate these parameters relative to the newly-defined reference frame. This step 1318 is always performed regardless of whether the magnitude of the acceleration matches that of gravity.
To facilitate the following computations, the yaw, pitch, and roll values, which are presently referenced to a horizontal plane, must be converted into what is called the quaternion form. This is a four-dimensional vector with three imaginary components and one real component. This is done to facilitate the transformation of the acceleration values, which are presently referenced to the tilted camera plane, into values referenced to the horizontal reference plane (just defined), preparatory to integrating the acceleration values to produce velocity and displacement values.
This calculation is performed by the following program. In this program, the input variable is a record “AttitudeRecord” which contains yaw “w”, pitch “p”, and roll “r”. The returned quaternion values are “s”, “i”, “j”, and “k”, where “s” is the real value and the others are the imaginary values.
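That listing likewise appears in the appendices. A sketch of the standard conversion is given below; the rotation order (yaw, then pitch, then roll, in the common aerospace convention) is an assumption and may differ from the convention used in the actual listing.

    /* Sketch of the yaw/pitch/roll to quaternion conversion described above. */
    #include <math.h>

    typedef struct { double w, p, r; } AttitudeRecord;       /* yaw, pitch, roll (radians) */
    typedef struct { double s, i, j, k; } QuaternionRecord;  /* s real, i/j/k imaginary */

    void AttitudeToQuaternion(const AttitudeRecord *a, QuaternionRecord *q)
    {
        double cw = cos(a->w * 0.5), sw = sin(a->w * 0.5);    /* yaw half-angle */
        double cp = cos(a->p * 0.5), sp = sin(a->p * 0.5);    /* pitch half-angle */
        double cr = cos(a->r * 0.5), sr = sin(a->r * 0.5);    /* roll half-angle */

        q->s = cw * cp * cr + sw * sp * sr;
        q->i = cw * cp * sr - sw * sp * cr;
        q->j = cw * sp * cr + sw * cp * sr;
        q->k = sw * cp * cr - cw * sp * sr;
    }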
At step 1324, the acceleration vector is transformed from the camera body coordinate frame into the stable reference frame.
- //rotate accel vector to platform coordinates
- //using inverse quaternion
- posqInverse.q.s=posqFwd.q.s;
- posqInverse.q.i=−posqFwd.q.i;
- posqInverse.q.j=−posqFwd.q.j;
- posqInverse.q.k=−posqFwd.q.k;
- QuaternionRotate (&pos->p,&posqInverse.q,&prec);
In the above, the acceleration vector is represented by a three component vector “&pos->p”. The four element quaternion value (computed above) is “posqFwd.q.s”, “-.i”, “-.j”, and “-.k”. In the above routine, this quaternion value is first inverted, giving “posqInverse.q.s.”, etc. Next, this inverted quaternion and the acceleration vector are passed to the “QuaternionRotate” routine which returns the transformed acceleration values in a vector “&prec”.
At step 1326, the integration of the transformed acceleration values is carried out as follows:
-
- dx+=(prec.l)/G;
- dz+=(prec.u)/G;
- dy+=(prec.f)/G;
- x+=dx;
- y+=dy;
- z+=dz;
In the above routine, “dx”, “dy”, and “dz” are the velocity values in the x, y, and z directions. “x”, “y”, and “z” are the distance values. The incoming acceleration values are “prec.l” for the “x” axis acceleration, “prec.f” for the “y” axis acceleration, and “prec.u” for the “z” axis acceleration. Note that the acceleration values are normalized with respect to the value of gravity.
The quaternion coordinate transformation process is carried out by using two cross multiplications, and is illustrated below:
The incoming arguments to this function are a three-dimensional vector “v” that is to be rotated and a four-dimensional quaternion vector “q” that defines the rotation. The three-dimensional vector “v” is first transformed into a four-dimensional vector “vq” with the fourth component “vq.s” set to zero.
First, an inverse “qi” is formed of the quaternion “q”. Next, the incoming vector “vq” is quaternion multiplied by the inverse quaternion vector “qi”. The result of this multiplication “rq” is then quaternion multiplied by the quaternion vector “q”. Three components of the resulting vector, which is returned as “vq”, are transferred back into the vector “rp”, which is returned as the transformed result.
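A sketch of such a rotation routine follows. It assumes the quaternion is of unit length, so that its inverse is simply its conjugate; the vector field names and the mapping of the vector components onto the imaginary quaternion components are assumptions, and the QuaternionMultiply routine it relies upon is the one whose signature is given next.

    /* Sketch of QuaternionRotate: rotate the 3-vector v by the unit quaternion q,
       returning the result in rp.  QuaternionRecord is laid out as sketched above
       (s real, i/j/k imaginary); the l->i, f->j, u->k mapping is assumed. */
    typedef struct { double l, f, u; } VectorRecord;   /* assumed vector layout */

    void QuaternionMultiply(QuaternionRecord *q, QuaternionRecord *s, QuaternionRecord *r);

    void QuaternionRotate(VectorRecord *v, QuaternionRecord *q, VectorRecord *rp)
    {
        QuaternionRecord vq, qi, rq, out;

        vq.s = 0.0;  vq.i = v->l;  vq.j = v->f;  vq.k = v->u;     /* embed v as a pure quaternion */
        qi.s = q->s; qi.i = -q->i; qi.j = -q->j; qi.k = -q->k;    /* inverse of a unit quaternion */

        QuaternionMultiply(&qi, &vq, &rq);    /* rq  = qi * vq     */
        QuaternionMultiply(&rq, q, &out);     /* out = qi * vq * q */

        rp->l = out.i;  rp->f = out.j;  rp->u = out.k;            /* drop the (near-zero) real part */
    }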
The quaternion multiplication is defined by the following program:

    void QuaternionMultiply (QuaternionRecord *q,
                             QuaternionRecord *s,
                             QuaternionRecord *r)
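The body of that listing is in the appendices; a minimal sketch of the standard Hamilton product, assuming that the third argument receives the product of the first two, is:

    /* Sketch of the quaternion (Hamilton) product r = q * s.  The assignment of
       roles to the three arguments is an assumption; r must not alias q or s. */
    void QuaternionMultiply(QuaternionRecord *q, QuaternionRecord *s, QuaternionRecord *r)
    {
        r->s = q->s * s->s - q->i * s->i - q->j * s->j - q->k * s->k;
        r->i = q->s * s->i + q->i * s->s + q->j * s->k - q->k * s->j;
        r->j = q->s * s->j - q->i * s->k + q->j * s->s + q->k * s->i;
        r->k = q->s * s->k + q->i * s->j - q->j * s->i + q->k * s->s;
    }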
The details of the step 1122 in
In addition to writing out the records 775 (
An alternative method of position estimation using the inertial platform is now described. The accelerometer input described above as the vector “pos” in the program “acc2pos.c” is replaced by a vector of constant velocity, as shown in this program fragment:
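The fragment itself is part of the alternative “acc2pos.c” given in Appendix F; the following is only an illustrative sketch of the substitution it performs, with the numeric thresholds, the WALKING_SPEED constant, and the field names assumed.

    /* Illustrative sketch of the constant-velocity substitution.  mag is the
       acceleration magnitude computed at step 1212; DELTA_MAG's value and the
       WALKING_SPEED constant are assumptions. */
    #define DELTA_MAG     0.05     /* assumed deviation from G, determined empirically */
    #define WALKING_SPEED 1.4      /* assumed average walking speed per sample interval */

    if (fabs(mag - G) < DELTA_MAG) {
        /* only gravity is sensed: assume the camera is at rest */
        pos->p.l = 0.0;  pos->p.f = 0.0;  pos->p.u = 0.0;
    } else {
        /* otherwise assume the operator is walking in the camera (forward) direction */
        pos->p.l = 0.0;  pos->p.f = WALKING_SPEED;  pos->p.u = 0.0;
    }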
This sets the velocity vector to one of two values, depending upon the magnitude of instantaneous acceleration being experienced. The vector component “pos->p.f” is the component pointing in the direction of the camera.
If the magnitude is within the range G plus or minus DELTA_MAG (the force of gravity plus or minus a small deviation, determined by empirical measurement), the camera is assumed to be at rest. If the magnitude is outside this range, the velocity is set to be the average velocity of a person walking. The camera is pointed in the direction the operator is walking whenever the operator moves, and this velocity vector is transformed using the same quaternion as the acceleration measurement above.
The velocity vector is rotated in the following code fragment:
-
- QuaternionRotate(&pos->p,&posqInverse.q,&prec);
The position is then calculated by summing each resultant component:
-
- x-=prec.l;
- y+=prec.f;
- z+=prec.u;
The full text of the alternative “acc2pos.c” file is given in Appendix F. This file is to be substituted for the “acc2pos.c” file listed in Appendix B.
In
The velocity estimate is then transformed from platform coordinates to reference coordinates in box 3380. The resulting transformed velocity is summed component-wise to produce the position estimate in box 3390.
These boxes in
Referring now to
As shown in
In
The rangefinder 485 is shown connected to the CPU 480 by means of two serial communication lines, an outgoing serial communication line TXA1 carrying commands from the CPU 480 to the rangefinder 485, and a return serial communication line RXA1 carrying serial information from the rangefinder 485 back to the CPU 480. The rangefinder 485 returns gathered information periodically, at its own rate of speed. The CPU 480 formulates a range packet 520 (
The two gyroscopes, the directional gyroscope 400 and the vertical gyroscope 410, are designed so that when they are deprived of power, they return to rest positions with the vertical gyroscope 410 having its axis vertically disposed and with the directional gyroscope 400 having its axis horizontally disposed.
When the camera 100 is placed into operation, the central processing unit 480 causes the control logic 490 to generate a GYRON signal and to feed it to a gyroscope regulated power supply 420. In response, the gyroscope power supply 420 generates a plus 10 volt, regulated +GYRO signal which feeds power to both of the gyroscopes 400 and 410. In response, the gyroscope motors begin to spin so that their axes are stabilized and so that the gimbals associated with the gyroscopes begin to generate pairs of quadrature modulated signals indicating the rotational motions of the tracking data acquisition unit 105.
The directional gyroscope 400 generates two square wave signals in quadrature as the platform 415 is rotated about a vertical axis to the left or to the right. These quadrature signals, which may be called collectively the yaw signal, appear on the two wires D0A and D0B. These signals arise from sensors associated with the gimbals within the gyroscope 400 in response to rotation of the gimbals.
The vertical gyroscope 410 is similarly equipped with two sets of sensors associated with its gimbals to generate pitch and roll quadrature modulated signals. The pitch signal, which appears on the two wires V1A and V1B, indicates the rate at which the camera 100 is pointing more upwards towards the ceiling or more downwards towards the floor. The roll signal, which appears on the two wires V0A and V0B, indicates the rate at which the camera is tilting to one side or to the other, away from or towards the vertical.
These quadrature modulated pairs of signals require brief explanation. Assume for the moment that the camera is being rotated horizontally from left to right. This will cause a yaw signal to appear on the two wires D0A and D0B. Each wire bears a square wave signal, and the square waves are at quadrature with each other. This means that a negative going transition of the first of the square wave signals is followed by a negative going transition of the second of the square wave signals. Likewise, a positive going transition of the first signal is always followed by a positive going transition of the second signal. The speed of these transitions indicates the speed with which the camera is being rotated from left to right. If the camera motion stops, then the signals remain stationary until camera motion proceeds once again.
If the direction of rotation is reversed, then again square wave signals are generated—but this time in the opposite phase of quadrature. Thus, if a left-to-right motion causes a first signal to make its transitions ahead of the second signal, then a right-to-left motion will cause the second signal to make its transitions ahead of the first signal. This is identical to the way in which the motion signals work in a mouse pointing device of the type commonly used with digital computers.
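Although the preferred embodiment decodes these signals in the HCTL2020 hardware, the rule just described can be illustrated in software. In the sketch below, the A and B levels of a signal pair are combined into a two-bit state, and a lookup table turns each state transition into a count of plus one, minus one, or zero; the table follows the standard two-bit Gray-code rule and is not taken from the decoder data sheet.

    /* Software illustration of quadrature decoding.  prev and curr are 2-bit
       states formed as (A << 1) | B; the table maps each transition to a
       signed count.  Values follow the standard Gray-code rule. */
    static const int quad_table[4][4] = {
        /* curr:  00   01   10   11         prev */
        {          0,  +1,  -1,   0 },   /*  00  */
        {         -1,   0,   0,  +1 },   /*  01  */
        {         +1,   0,   0,  -1 },   /*  10  */
        {          0,  -1,  +1,   0 },   /*  11  */
    };

    long quad_update(long count, unsigned prev, unsigned curr)
    {
        return count + quad_table[prev & 3][curr & 3];
    }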
The pitch and roll signals, represented respectively by the V1A-V1B and by the V0A-V0B signal lines, operate in a manner identical to the yaw signal just described, with the information being conveyed by quadrature modulated square waves. The three pairs of quadrature modulated signals V1A, V1B, V0A, V0B, and D0A, D0B are fed into respective quadrature decoders 450, 455 and 460. The quadrature decoders 450, 455 and 460 are conventional models, in this case Hewlett Packard Model No. HCTL 2020. The three accelerometers 435, 440, and 445 are shown each generating an analog accelerometer output signal SIG0, SIG1, and SIG2 and also a temperature signal TEMP0, TEMP1, and TEMP2. These signals flow into the multiplexer and A-to-D converter 465.
In the manner described above, the central processing unit 480 is enabled to obtain data indicating the distance of the subject from the rangefinder 485; it is enabled to obtain pitch, roll, and yaw data values from the vertical and directional gyroscopes 400 and 410; and it is enabled to obtain data defining the instantaneous acceleration of the tracking data acquisition unit 105 from the accelerometers 435, 440 and 445 in all three coordinate directions. The CPU 480 continuously packages this information, as explained in steps 600 to 615 in
The shift register and encoder 475 converts the signal into a modulated serial tracking data signal which is presented over the audio data line 115 to the audio record input 125 of the VCR 130.
To summarize the operation of the tracking data acquisition unit 105 as shown in
It then combines all of this information into a packet, with each byte in the packet containing a “1” start bit, a nibble of data, and three trailing “0” stop bits, with a 3 byte header 530 and 550 such as those shown at 535 or 540 in
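By way of illustration, that byte format might be produced as follows; placing the start bit in the most significant position and sending the high nibble of each value first are assumptions made for the sketch.

    /* Sketch of the byte format described above: a "1" start bit, four data
       bits, and three trailing "0" stop bits.  Bit and nibble ordering are
       assumptions. */
    unsigned char pack_nibble(unsigned char nibble)
    {
        return (unsigned char)(0x80 | ((nibble & 0x0F) << 3));   /* 1 nnnn 000 */
    }

    /* A 16-bit value then occupies four such bytes, high nibble first. */
    void pack_word(unsigned short value, unsigned char out[4])
    {
        out[0] = pack_nibble((unsigned char)(value >> 12));
        out[1] = pack_nibble((unsigned char)(value >> 8));
        out[2] = pack_nibble((unsigned char)(value >> 4));
        out[3] = pack_nibble((unsigned char)(value));
    }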
Upon playback, the video signal 150 flows directly into the personal computer 185. The modulated tracking data signal 145 flows to a demodulator 155. Following demodulation, the unmodulated tracking data signal flows over a signal line 170 to the PC 185.
The program 2406 begins at 2408 by initializing the system. Initialization includes such steps as turning on the power supply 420 for the gyroscopes, programming and initializing the three quadrature decoders 450, 455, and 460, and setting up the analog to digital converter (within the block 465 in
Next, at 2050, data is gathered from the rangefinder 485. At 2052, the analog to digital converter is commanded to gather from the accelerometers 435, 445, and 440 the current acceleration of the tracking data acquisition unit 105.
At step 2054, the above data, together with previously stored values of yaw, roll, and pitch are combined into a data packet, with 4 bits of data per byte, as has been explained above (steps 600, 605, 610, and 615 in
At step 2056, the resulting data packet is placed into RAM memory as a series of bytes to await transmission to the shift register and encoder 475.
At step 2058, the interrupt driven program 2402 is placed into service to transmit the bytes of data to the shift register and encoder 475.
At step 2060, the program 2406 tests to see if all of the data bytes have been transmitted. Alternatively, the program simply suspends itself until a fixed time interval has expired. In either case, after transmission is completed or after the expiration of the time interval, program control recommences with step 2050.
The program 2406 thus continuously operates to assemble data packets and to transmit them to the shift register and encoder 475 where the data is modulated onto an audio signal suitable for recordation on the audio soundtrack of a VCR.
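In outline, the program 2406 therefore behaves like the following skeleton, in which every routine name is merely a placeholder for the operation described in the corresponding step above:

    /* Skeleton of the acquisition loop, steps 2050 through 2060.  All names
       are placeholders for the operations described in the text. */
    for (;;) {
        read_rangefinder_data();                    /* step 2050 */
        read_accelerometer_data();                  /* step 2052 */
        build_packet_with_stored_yaw_roll_pitch();  /* step 2054 */
        place_packet_bytes_in_ram();                /* step 2056 */
        start_interrupt_driven_transmission();      /* step 2058 */
        wait_for_transmission_or_timeout();         /* step 2060, then repeat */
    }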
The program 2402 is an interrupt driven program placed into operation by the interrupt signal INT1 every time the shift register and encoder 475 successfully transmits a byte of information. At 2062, this program simply sends a data byte from RAM to the shift register and encoder 475 until there are no more data bytes that remain to be transmitted. This interrupt driven routine frees up the program 2406 from the time consuming task of continuously monitoring for when to transmit the next byte of data.
The program 2404 is also an interrupt driven program. It is placed into operation every time one of the three quadrature decoders receives a signal fluctuation from one of the gyroscopes. In response to INT2, this program gathers pitch, roll, and yaw values from the two gyroscopes and stores them in RAM for later transmission at step 2064.
Three display programs are available for viewing the information stored in the positional database.
The spatial database program 335 (
The movie mapper program 325 is described in detail below.
The positional frame retrieval program 330 allows the user to identify a region in space on the camera path using the WORLD TOOLKIT. The program defines a rectangular solid in space about each imaged object. The location of each item imaged is computed from the data in each record. All the image records where the location of the item image falls outside of the rectangular solid for an item are discarded, and all remaining records are displayed as a movie view of the item from different perspectives. In essence, all available frames showing the item or the volume indicated are displayed sequentially, giving all the views available of the desired item or volume.
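A sketch of that record filtering follows. The record layout, the yaw and pitch conventions, and the projection of the range along the camera direction to locate the imaged item are assumptions made for illustration, not the data structures of the program 330.

    /* Illustrative sketch of the positional frame retrieval filter.  Yaw is
       taken here as measured about Z from the Y (forward) axis; pitch about X. */
    #include <math.h>

    typedef struct {
        double x, y, z;          /* camera position */
        double yaw, pitch;       /* camera orientation, radians */
        double range;            /* distance to the imaged subject */
        long   frame;            /* video frame number */
    } PositionRecord;

    typedef struct { double xmin, xmax, ymin, ymax, zmin, zmax; } Box;

    /* project the range along the camera direction to locate the imaged item */
    static void item_location(const PositionRecord *r, double *ix, double *iy, double *iz)
    {
        *ix = r->x + r->range * cos(r->pitch) * sin(r->yaw);
        *iy = r->y + r->range * cos(r->pitch) * cos(r->yaw);
        *iz = r->z + r->range * sin(r->pitch);
    }

    /* keep only the records whose imaged item falls inside the rectangular solid */
    size_t select_views(const PositionRecord *recs, size_t n, const Box *b, long *frames_out)
    {
        size_t kept = 0;
        for (size_t i = 0; i < n; i++) {
            double ix, iy, iz;
            item_location(&recs[i], &ix, &iy, &iz);
            if (ix >= b->xmin && ix <= b->xmax &&
                iy >= b->ymin && iy <= b->ymax &&
                iz >= b->zmin && iz <= b->zmax)
                frames_out[kept++] = recs[i].frame;   /* these frames are then played in sequence */
        }
        return kept;
    }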
The tracking data acquisition unit and modulator 105 has illustratively been shown connected to a rangefinder, receiving therefrom a serial data signal which is conveyed to the central computer for possible use in a data retrieval system. The positional frame retrieval program 330 uses this range data to determine the position of imaged objects and to retrieve all images containing a designated object, as contrasted with the movie mapper program technique of retrieving images based upon camera position and orientation (described below).
For example, in
The range data is preserved in the positional database 322 for this embodiment of the invention. Accordingly, the retrieval program 330, in response to a mouse click at the position of the object, may locate all records 775 which, from position, camera angle, and distance to subject, relate to images of that object. These may be grouped into an object database and displayed as a video movie, as described above.
In addition to a rangefinder, other devices may be attached to the serial data input of the tracking data acquisition unit and modulator 105, and data may be captured from these other devices. For example, data may be captured from a gas chromatograph or from a chemical sniffer. Sound may be recorded from a microphone. Average light intensity may be recorded from a photo cell. Infrared records of motion in the vicinity may be recorded. Data may be gathered from stationary machinery or data collection devices, such as flow meters or temperature sensors. Any type of data that has a spatial reference may be gathered in this manner.
Another embodiment of the invention is shown in
In this embodiment, the tracking data acquisition unit and modulator 105 is not attached to the camera, but is attached by a serial communication tether 3102 (which could include a radio linkage) directly to the serial port of the computer 185, such that the tracking and video data entry and storage program continuously accepts the data and stores it in the tracking database 324. The tracking data acquisition unit is thus enabled to be used as a mouse pointing device for the positional frame retrieval program 330 to guide in the retrieval and display of images—an inertial joystick.
The tracking database 324 may be on the hard disk as previously, but preferably it is a RAM circular buffer that is shared by the tracking database to positional database conversion program 310. Alternatively, the tracking data values may be sent as messages between the two programs running under Windows and subject to the standard Windows message dispatcher (not shown). Both of the programs 305 and 310 can be simplified, since neither is dealing with video or with frame numbers in this embodiment.
The tracking database to positional database conversion program operates continuously, receiving tracking data containing unnormalized acceleration and orientation data and converting it into normalized position and orientation data, and sending the resulting data directly to the positional frame retrieval program 330, thereby causing the program 330 to update the display continuously in response to manual movement of the tracking data acquisition unit and modulator 105 through time and space.
As an alternative, two computers can be utilized. A first portable computer (not shown) can be attached to the tracking data acquisition unit and modulator 105 and can contain the program elements 305, 324, 310 and 330, shown in
As a third alternative, the tracking data acquisition unit and modulator 105 in
The arrangement illustrated in
Another embodiment of the invention utilizes a movie mapper 325 (
The program 325 is written in Microsoft C++ to run under Microsoft Windows, Version 3.1. Some of the programs call upon routines contained within the Microsoft Video For Windows Development Kit, Version 1.1. All of the above may be obtained from Microsoft Corporation, 1 Microsoft Way, Redmond, Wash. 98052. The Media Control Interface “MCIWnd” 10944 (
With reference to
The program 325 opens up the selected “*.AVI” file. It also looks for and attempts to open a telemetry file, containing the positional information, which has the same name and the file name suffix “*.TLA”. The “*.TLA” file must be prepared for this program by adding to the beginning of the file a single line of text. The line is “HDR<tab><last frame number>” where <tab> is the tab character, and <last frame number> is the first number on the last line of the file. If no “*.TLA” file is found, the system issues an error message but permits one to browse through the “*.AVI” file in the normal Microsoft manner. The program also looks for and attempts to open an “*.MTA” file for its own use in defining overlay characteristics. An empty “*.MTA” file should be supplied, since the program will issue an error message and quit if none is found. The File popdown menu contains the usual list of Microsoft options, including Open AVI, Print, Print Preview, Print Setup, Exit, and a numbered list of recently opened AVI files to facilitate recalling a file recently opened.
The figures illustrate what happens when an AVI file BACK.AVI and its corresponding telemetry file BACK.TLA were successfully opened. Upon opening this file, the program 325 causes a Video child window 10110 to be opened, displaying in this case a view of the back yard of a private home. This video window is labelled at the top with “BACK.AVI—Video” to identify both the file and the fact that this is the Video child window. The child window 10110 provides controls that are standard Microsoft controls for scrolling through a sequence of video images. These include a VCR-like control 10108. Clicking on this control starts the video playback if it is stopped, and stops it if it is running. The control is marked with a square when the video is playing, and it is marked with a triangle (shown) when the video is stopped. The Video popdown menu contains selections for changing the size of the video window to the original size, half the original size, or double the original size. A mouse-actuated slider control 10103 (
A View pulldown menu permits additional child windows to be opened that relate to the position information. The View menu provides the following options, which are described below:
-
- Triple MCI
- Plan
- Elevation
- Yaw
- Pitch
- Roll
- Orientation
- Overlay Visibility
- Toolbar
- Status
Actuating the Plan command causes a plan view: XY child window 10120 to open up, displaying the path 10123 over which the camera was moved during the process of recording the sequential images. This window is labeled: “BACK.AVI—Plan View: XY” to identify the file name as well as the nature of the view. While not clearly visible in
Activating the Elevation command causes an Elevational View: XZ child window 10130 to open up, displaying the camera path of movement as seen from the side, rather than from the top. The display in all other respects is identical to the plan view display just described.
Activating the Yaw command causes a Yaw View child window 10140 to open up, displaying the various yaw directions the camera assumed during the image recording process. Yaw may be thought of as compass direction.
Likewise, activating the Pitch command causes a Pitch View child window 10150 to open up, displaying pitch directions in a fashion analogous to the Yaw View above. Pitch is inclination of the camera in the vertical image plane, that is looking up and down. In this example, the images do not vary significantly in pitch.
Activating the Roll command opens a Roll View child window 10160, displaying roll directions as above. Roll is the tilt of the image plane from side to side, or leaning.
By clicking twice upon a point on the camera path in any of the child windows 10120, 10130, 10140, 10150, or 10160, the user may signal the program to switch to displaying the image closest to the point where the mouse pointer was clicked and having the desired position, angle, or orientation. This selection process provides a rapid method for retrieving and positioning the video for playback based upon the desired position of the camera when an image was captured.
Activating the Orientation View command causes a child window 10180 to open up, displaying the orientation of the camera (pitch, roll, and yaw) graphically. Unfortunately, the absence of colors in
The actual direction of the camera, when it captured the image shown in the child window 10110, is indicated by the line 10183, which appears half in red and half in blue in the actual program display (and does not match the bold black line 10189). The dotted line 10184 is a projection of the camera direction line 10183 onto the X-Y plane. As a sequence of images is displayed in response to actuation of the play control 10108, the red-blue line 10183 swings about, indicating the orientation of the camera, and the dotted line 10184 swings about below (or above) the line 10183 like a shadow at noontime on the equator.
Roll is represented by the ratio of blue and red portions of the line 10183. If the roll value is zero, then the line is half red and half blue. Roll in a clockwise direction increases the blue, while roll in a counterclockwise direction increases the red. A positive roll gives more red.
A toolbar containing push button controls (10141, 10142, etc.) appears in the window 10101 and may be selectively displayed or hidden by actuating the Toolbar command in the View pulldown menu. The push button 10141 is an alternate way to open a file. The push buttons 10142 and 10145 respectively wind to the first and the last frames, and the push buttons 10143 and 10144 respectively move back and forward by one frame at a time.
The push button 10146 zooms in, and the push button 10147 zooms out. These two push buttons control the portion of the video path that is displayed by the Plan View and Elevational View child windows 10120 and 10130, and the orientations shown in the Yaw View, the Pitch View, and the Roll View. To zoom in, and with reference to
The user may also create a rectangle within the Elev: XZ window, if desired. When the zoom-in push button 10146 is actuated, the selected rectangle fills the XZ window, and the X axis of the XY window is zoomed in the X axis only. Points not within the XZ selection rectangle are then excluded from the elevational view.
The user may also create a rectangle in the Yaw View window 10140, the Pitch View window 10150, or the Roll View window 10160, if desired. When the zoom-in push button 10146 is actuated, the selected rectangle fills the window, and the Yaw, Pitch, or Roll values displayed within the selected rectangle expand to fill the display. Points in the Plan View: XY window 10120 and the Yaw View window 10140 which do not have yaw, pitch, and roll values within the resulting displayed Yaw View, Pitch View, and Roll View windows are excluded from the Plan View window and Elev XZ window.
Actuation of the zoom-out push button 10147 causes the plan and elevational displays to return to their original appearance (as shown in
Actuating the Overlay Visibility . . . menu selection from the View menu causes the display of an Overlay Visibility dialog window 11110 (
The Overlay items listed above will be described later.
The final command in the View popdown menu is the Status command, which hides or displays a status line at the bottom of the window 10190 and which indicates such things as the frame number and the X, Y, Z, Yaw, Pitch, and Roll coordinates of the camera corresponding to the image that is being displayed.
The Frame pop down menu displays the following commands, all of which relate to simply positioning the video as if it were a continuous video tape recording:
-
- Home
- Back
- Forward
- End
Additionally, the Frame pop down menu displays the following commands which have a special meaning:
-
- Loop Forward
- Loop Backward
The Loop Forward and Loop Backward selections cause the Video display to update with the next frame of video (Loop Forward) or previous frame (Loop Backward) in temporal sequence, continuously. Upon reaching the end of the video sequence, the sequence continues playback with the frame at the other end of the sequence, as though the video were joined in a continuous loop of film. Frames not shown in the Plan View window are skipped in the playback. Also, after each frame is shown, the Overlay Items appropriate to the frame are rendered onto the video frame display. The Overlay Items are described below.
The Loop Forward function may alternatively be started and stopped by striking the space bar. Each strike toggles the Loop Forward function between the started and stopped state.
The Action Menu has functions corresponding to the Toolbar buttons Zoom In 10146 and Zoom Out 10147. It also has functions corresponding to the Head Forward button 10148 and Head Back button 10149. Finally the Adjust Parameters function brings up a dialog box 10310 (
The Head Forward 10148 and Head Back 10149 buttons and functions allow the user to move forward or back up from the currently displayed frame. The Head Forward frame and Head Back frame are shown as green dots 10123 and 10124 on the Plan View window, and as green rectangles 10420 and 10422 (
The Head Forward frame is chosen by the program such that when it is displayed the appearance to the user is that the user has moved forward. This frame is selected in the following manner. Search all the frames in the telemetry file (*.tla) and determine which are within the front field of view angle from the perspective of the current frame.
The front field of view is defined as follows. Define a ray with its origin at the current frame's x-y location and extending in the direction of the yaw angle, called the centerline of the frame. Define a pair of rays with the same origin, called the left and right lines of the frame, which make an angle with the centerline specified by the user in the Action-Adjust Parameters dialog (
Of all the frames within the field of view (fov), determine which are within an operator specified radius (the Action-Adjust Parameters dialog Neighborhood Multiplier item 10320 (
-
- 1. the absolute angular difference between the yaw angles of the current frame and the candidate frame, and
- 2. a weighted rating based on the distance of the xy location of the candidate frame from any point on the neighborhood circle around the current frame.
The weight given this distance error is proportional to the value of the Action-Adjust Parameters dialog Selecting Side Angles scroll bar item 10340 (
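A condensed sketch of this selection is given below, using the illustrative PositionRecord layout sketched earlier; the helper names and the angle handling are assumptions, while the error formula follows the description above.

    /* Condensed sketch of the Head Forward frame selection. */
    #include <math.h>

    static double angle_diff(double a, double b)        /* absolute difference, wrapped to [0, pi] */
    {
        double d = fmod(fabs(a - b), 2.0 * M_PI);
        return d > M_PI ? 2.0 * M_PI - d : d;
    }

    long head_forward_frame(const PositionRecord *recs, size_t n, const PositionRecord *cur,
                            double half_fov, double radius, double side_weight)
    {
        long   best = -1;
        double best_err = HUGE_VAL;

        for (size_t i = 0; i < n; i++) {
            double dx = recs[i].x - cur->x, dy = recs[i].y - cur->y;
            double dist    = sqrt(dx * dx + dy * dy);
            double bearing = atan2(dx, dy);                          /* direction from the current frame */

            if (angle_diff(bearing, cur->yaw) > half_fov) continue;  /* outside the front field of view */
            if (dist > radius) continue;                             /* outside the neighborhood */

            /* error: yaw disagreement plus weighted distance from the neighborhood circle */
            double err = angle_diff(recs[i].yaw, cur->yaw)
                       + side_weight * fabs(dist - radius);
            if (err < best_err) { best_err = err; best = recs[i].frame; }
        }
        return best;                                                 /* -1 when no appropriate frame exists */
    }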
Sometimes there is not an appropriate frame. In that case, none are selected and no corresponding overlay is displayed.
The Head Back frame is calculated in a manner similar to that used for the Head Forward frame, with the exception that this frame must lie in the rear field of view. The rear field of view is a yaw angle range that can be found by reflecting the front field of view angle range to the back of the current frame. These fields of view are the same magnitude and face in opposite directions. For the case of jumping backward, the yaw angle goal is equal to the yaw angle of the current frame.
The keyboard Up-arrow-key and letter I key also execute the Head Forward function, the keyboard Down-arrow-key and letter K key also execute the Head Back function.
The View menu-Triple MCI function causes the MCIView window to expand to include three video windows (
If no frame is found that meets the criteria for selection, no frame is displayed. If a frame has been chosen for the left frame or the right frame, that frame is displayed within its own MCIwnd, located to the left or the right of the center frame.
The left and right frames are aligned with the center frames based on the fov and on the yaw and pitch values of the centerlines of the frames. The fov defines a “pixel per degree” displacement on the screen. For each degree the left or right frame's centerline disagrees with the predicted centerline (center frame's centerline plus or minus twice the fov) the frame is shifted a corresponding number of pixels. Likewise, the frame is shifted up or down by the pixel per degree amount by which the frame's pitch differs from the center frame's pitch.
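That alignment might be computed as in the following sketch, in which the names and sign conventions are assumptions made for illustration:

    /* Sketch of the pixel-per-degree alignment of a side frame against the
       center frame. */
    void side_frame_offsets(double fov_deg, int frame_width_px,
                            double center_yaw, double center_pitch,
                            double side_yaw, double side_pitch,
                            int is_right,               /* 1 for the right frame, 0 for the left */
                            int *x_shift, int *y_shift)
    {
        double px_per_deg = frame_width_px / fov_deg;
        /* predicted centerline: center frame's centerline plus or minus twice the fov */
        double predicted  = center_yaw + (is_right ? 2.0 : -2.0) * fov_deg;

        *x_shift = (int)((side_yaw   - predicted)    * px_per_deg);
        *y_shift = (int)((side_pitch - center_pitch) * px_per_deg);
    }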
The left arrow key and the J key on the keyboard select a new frame for viewing. Initially the frame selected is the displayed left frame described above. The 0 through 9 keys allow the selection of a frame (if one exists) which has a centerline less than twice the fov angle. Pressing the 1 key sets the goal centerline angle to the centerline angle of the current frame plus or minus 10 percent of twice the fov angle. The 2 key similarly sets the goal centerline angle to 20 percent of twice the fov angle. The 0 key sets the goal centerline angle to 100 percent of twice the fov angle. Note that this setting yields the frame which precisely abuts the center frame, since points on the edge of the center frame lie on a vertical plane extending from the current point of view which makes an angle of exactly “fov” degrees with the current frame centerline.
An overlay item represents a point or area of interest of the database as projected onto the current frame. Several overlay item types are defined by the movie mapper program. The overlay items which represent frames are the jump forward rectangle 10420 (
Additionally, one or more frames may be defined as Entrance frames, marked on the Video Overlay window with an Entrance item 11230 (
Three overlay item types describe areas rather than points. An Exit item marks an area of the Plan View containing frames of the current database which when viewed will cause the display of an associated Entrance frame. A Launch Pad area item 10722 (
An Exit area is defined by the operator's actions:
The operator clicks on the Plan View window, holding the left mouse button down and moving the pointer to another point within the plan view window. This action creates a rectangle drawn on the Plan View 10122 (
A Launch Pad item is defined in a similar fashion. The operator clicks and drags on the Plan View window as above, and then invokes the Objects-Create Launch Pad dialog 10600 (
Each of these overlay items is projected onto the current video frame according to the overlay calculation code described above.
The Window pop down menu provides conventional Microsoft Windows window control functions that need not be described here. The Help popdown menu also needs no special explanation.
The object program structure of the movie mapper 325 is described in overview in
The operating program is composed of a number of objects each of which is an instance of some class of objects. These objects are represented by rectangles in
When the program commences operating, it first appears as an object named CVidApp 10902 (
This CMainFrame object next launches a child frame derived from Microsoft's CMDIChildWnd that is of the class CMDITextWnd 10938. This object gives rise to a child window that is modified from the Microsoft original in that its title line (positioned above the window) may be dynamically altered while the program is in operation.
This first child object launches within itself an instance of the CMCIView 10940 object, which launches MCIWnd 10944 to display the sequence of video images.
The CMciView object 10940 also launches a CMCIWnd object 10950. The CMCIWnd object 10950 attaches itself to the MCIWnd object 10944 so that Windows operating system events come to the CMCIWnd object 10950 instead of being sent directly to the MCIWnd object 10944. In most cases the CMCIWnd object 10950 merely forwards the events to the MCIWnd object 10944. The CMCIWnd object 10950 intercepts mouse events, so that it may sense operator clicks on the overlay objects. The CMCIWnd object 10950 also intercepts MCIWnd redraw requests, forwards these requests to MCIWnd 10944 for video refresh, and then redraws the overlay objects associated with the video frame being displayed.
The CMciView and MCIWnd objects are the lower-most of the seven objects shown in
Six more windows may be opened by the user through activation of the View pop down menu, as has been described. Each of these windows corresponds to an additional pair of nested objects which appear within the CMainFrame object 10905 shown in
If the user opens the Plan View: XY window 10120 (
If the user opens the Yaw View window 10140, then the object CMDITextWnd 10918 containing the object CYawView 10910 is created within the CMainFrame object 10905, and this pair of objects together create the child window shown at 10140 in
If the user opens the Pitch View window 10195, then the object CMDITextWnd 10923 containing the object CPitchView 10925 is created within the CMainFrame object 10905, and this pair of objects together create the child window shown at 10195 in
If the user opens the Roll View window 10160, then the object CMDITextWnd 10923 containing the object CRollView 10915 is created within the CMainFrame object, and this pair of objects together create the child window shown at 10160 in
As the user closes the windows, the corresponding pairs of objects are destroyed.
The objects, once created, send messages back and forth to each other over the paths illustrated in
When the user “talks” to the program, using the keyboard or mouse, in general the user communicates with the active window (in the case of the keyboard) or the window that the mouse is in (in the case of the mouse), selecting a different active window by clicking within the window boundaries. The nine windows shown in
The central coordinating object is the document object, which is an instance of CVidDoc 10907, which is derived from the class CDocument (a Microsoft class). This object contains the key system variables that determine the state of the system. Included, as illustrated in
If any window receives a message that calls for adjustment of one of the key system variables, that message is sent immediately to the document object CVidDoc 10907. The value of the key variable is adjusted, and then the document object 10907 broadcasts an “Update All Views” message over the path 10980 to the MCIWnd child window object 10944, over path 10911 to object 10910, over path 10921 to object 10920, over path 10926 to object 10925, over path 10916 to object 10915, over path 10936 to object 10935, over path 10941 to object 10940, and over path 10946 to object 10945. Each responds accordingly. The “Update All Views” message contains a hint that says either:
- 1. Rewrite Everything, or
- 2. Just Do Current Frame.
The “rewrite everything” hint causes each window to be redrawn completely. The “just do current frame” hint causes, for example, just one point to be changed from blue to red.
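By way of illustration only, the following C++ sketch shows the general document/view update pattern described above, using the Microsoft Foundation Class calls UpdateAllViews, OnUpdate, and GetDocument. The class names, the hint values, and the member variable are hypothetical, and the usual MFC message-map and run-time-class boilerplate is omitted.

    #include <afxwin.h>

    // Hints carried by the "Update All Views" broadcast (values are illustrative).
    const LPARAM HINT_REWRITE_EVERYTHING    = 1;
    const LPARAM HINT_JUST_DO_CURRENT_FRAME = 2;

    // Document object holding one of the key system variables.
    class CVidDocSketch : public CDocument
    {
    public:
        int m_currentFrame = 0;

        void SetCurrentFrame(int frame)
        {
            m_currentFrame = frame;
            // Broadcast the change to every attached view window with a hint.
            UpdateAllViews(nullptr, HINT_JUST_DO_CURRENT_FRAME);
        }
    };

    // One of the window objects; it responds according to the hint it receives.
    class CPlanViewSketch : public CView
    {
    protected:
        virtual void OnDraw(CDC* /*pDC*/) {}   // complete redraw of the window

        virtual void OnUpdate(CView* /*pSender*/, LPARAM lHint, CObject* /*pHint*/)
        {
            // Retrieve whatever data is needed from the document object.
            CVidDocSketch* pDoc = static_cast<CVidDocSketch*>(GetDocument());
            if (lHint == HINT_REWRITE_EVERYTHING)
                Invalidate();                                  // redraw everything
            else if (lHint == HINT_JUST_DO_CURRENT_FRAME)
                RedrawCurrentPoint(pDoc->m_currentFrame);      // e.g. recolor one point
        }

        void RedrawCurrentPoint(int /*frame*/) { /* hypothetical partial redraw */ }
    };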
The individual window objects next communicate with the document object 10907 over paths 10911, 10916, 10921, 10926, 10936, 10941, and 10946 and receive information via paths 10912, 10917, 10922, 10927, 10937, 10942, and 10947 to learn of the new state of the system and to retrieve from the document object 10907 whatever data they need to update their respective windows. For example, if the user at 10902 clicks upon the “zoom out” push button 10146 (
Double clicking on the path in one of the windows 10110, 10120, 10130, 10140, 10150, 10160, and 10180 (
When the user activates the video controls within the child window 10110 (
The program provides a facility for navigating through the video database by selecting frames from the database that represent translations and rotations relative to the point of view of the current frame. The program searches the database for frames which best meet the criteria, and the operator then selects one of these frames using the keys or mouse as described above. The following paragraphs describe the method the program uses to select the best frame for each of the possible operator choices (turn left, turn right, jump forward, jump backward, jump forward and simultaneously turn right, jump forward and simultaneously turn left).
First, the program initializes variables which are used to keep track of the best entry found for each of the selections above. Cw_slope is clockwise, and ccw_slope is counterclockwise. Fov is the slope data for the current frame. LeftDiag is the slope data for the frame to the left of the current frame, its centerline close to twice the fov counterclockwise of the current frame's centerline. RightDiag is the slope data for the frame to the right of the current frame, its centerline close to twice the fov clockwise of the current frame's centerline.
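By way of illustration only, the following C++ sketch shows one simple form such a search could take: it scores every frame in the database against the position and yaw offset desired for the chosen move and returns the best match. The FrameRecord structure, the scoring, and the use of a plain yaw-and-position error in place of the cw_slope, ccw_slope, fov, LeftDiag, and RightDiag bookkeeping described above are assumptions, not the program's actual method.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Position and yaw (in radians) recorded for each frame in the data base.
    struct FrameRecord { double x, y, yaw; };

    enum class Move { TurnLeft, TurnRight, JumpForward, JumpBackward };

    // Returns the index of the frame that best meets the criteria for the
    // requested move relative to the current frame (angle wrap-around and
    // per-axis weighting are ignored for brevity).
    std::size_t SelectBestFrame(const std::vector<FrameRecord>& db,
                                std::size_t current, Move move, double fov)
    {
        const FrameRecord& cur = db[current];

        // Desired yaw and position offset for each operator choice.
        double wantYaw = cur.yaw;
        double wantDx = 0.0, wantDy = 0.0;
        switch (move)
        {
        case Move::TurnLeft:     wantYaw = cur.yaw - fov; break;  // rotate counterclockwise
        case Move::TurnRight:    wantYaw = cur.yaw + fov; break;  // rotate clockwise
        case Move::JumpForward:  wantDx =  std::cos(cur.yaw); wantDy =  std::sin(cur.yaw); break;
        case Move::JumpBackward: wantDx = -std::cos(cur.yaw); wantDy = -std::sin(cur.yaw); break;
        }

        std::size_t best = current;
        double bestScore = 1e30;
        for (std::size_t i = 0; i < db.size(); ++i)
        {
            if (i == current) continue;
            double dx   = (db[i].x - cur.x) - wantDx;
            double dy   = (db[i].y - cur.y) - wantDy;
            double dyaw =  db[i].yaw - wantYaw;
            double score = dx * dx + dy * dy + dyaw * dyaw;     // simple error measure
            if (score < bestScore) { bestScore = score; best = i; }
        }
        return best;
    }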
While the preferred embodiment of the invention has been described, it will be understood that numerous modifications and changes will occur to those skilled in the art. It is therefore intended by the appended claims to define the true scope of the invention.
Claims
1. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands;
- wherein said image presentation and navigation means includes means for displaying, along with an image, a view of the camera path and an indication of the camera position and orientation when the image was recorded;
- wherein camera position and orientation is indicated by a mark on the path oriented as the camera is oriented to point where the camera was pointing; and
- wherein the view is a plan view and wherein the mark bears an indication thereon of the yaw angle of the camera.
2. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands;
- wherein said image presentation and navigation means includes means for displaying, along with an image, a view of the camera path and an indication of the camera position and orientation when the image was recorded; and
- wherein a mark appears in said image of a location associated with another image such that the user may signal a desire to navigate forward to view said another image in a simple manner.
3. A spatially referenced photographic system in accordance with claim 2 wherein the path also bears an indication of the location of said another image.
4. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands; and
- wherein said image presentation and navigation means provides the user with navigation controls for moving backward, in response to the actuation of which controls said means selects an image captured at a generally backward camera position having an orientation similar to that of an image the user is currently viewing.
5. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands; and
- wherein said image presentation and navigation means provides the user with navigation controls for rotating left or right, in response to the actuation of which controls said means selects an image captured at a generally left-rotated or right-rotated camera position having a position fore-, aft-, and side-to-side similar to that of an image the user is currently viewing.
6. A spatially referenced photographic system in accordance with claim 5 wherein the image which the user is currently viewing bears marks indicating left and right possible rotations which thereby indicate the general location of the viewpoint of said image captured at said generally left-rotated or right-rotated camera positions.
7. A spatially referenced photographic system in accordance with claim 6 wherein the user signals a desire to move generally left or right by mouse clicking on said indicating marks.
8. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands; and
- wherein said image presentation and navigation means provides the user with navigation controls for moving forward and simultaneously rotating to the left or to the right, in response to the actuation of which controls said means selects an image captured at a generally forward camera position having an angular orientation rotated to the left or to the right of that of an image the user is currently viewing.
9. A spatially referenced photographic system comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally viewed and the orientation of the image with respect to that position;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands;
- said image presentation and navigation means providing the user with navigation controls, in response to the actuation of which controls said presentation and navigation means selects an image captured at a position generally shifted from that of an image the user is currently viewing, as indicated by the user actuation of said controls.
10. The spatially referenced photographic apparatus of claim 17 wherein the overlay item comprises a frame, wherein the computing device is configured to provide a new image corresponding to the overlay item in response to receiving signals indicating selection of the frame.
11. The spatially referenced photographic apparatus of claim 17 wherein the computing device is configured to provide a designation on at least one of the displayed image and a plan view of a position at which the displayed image was captured and a yaw orientation of the displayed image with respect to that position.
12. A spatially referenced photographic apparatus comprising:
- a data base containing images of objects and information corresponding to the images, the information defining a position at which a respective image was captured and at least a yaw orientation of the respective image with respect to the position, wherein the information is derived from a camera position and orientation information automatically recorded substantially simultaneously with recording of the respective image;
- a computing device in communication with the data base, the computing device configured to receive spatial movement commands and provide images for display in response to receiving the spatial movement commands via an interface;
- wherein the computing device is configured to provide an overlay item projected onto a displayed image based on at least one of the respective images from the data base, the overlay item comprising an indication of a point or area of interest for the displayed image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item; and
- wherein the computing device is configured to provide, along with the displayed image and the overlay item projected onto the displayed image, navigation controls for moving forward and simultaneously rotating to the left or to the right, the computing device configured to provide in response to actuation of the navigation controls a selected image captured at a generally forward camera position having an angular orientation rotated to the left or to the right of that of the displayed image.
13. A spatially referenced photographic apparatus comprising:
- a data base containing images of objects and information corresponding to the images, the information defining a position at which a respective image was captured and at least a yaw orientation of the respective image with respect to the position, wherein the information is derived from a camera position and orientation information automatically recorded substantially simultaneously with recording of the respective image;
- a computing device in communication with the data base, the computing device configured to receive spatial movement commands and provide images for display in response to receiving the spatial movement commands via an interface;
- wherein the computing device is configured to provide an overlay item projected onto a displayed image based on at least one of the respective images from the data base, the overlay item comprising an indication of a point or area of interest for the displayed image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item; and
- wherein the computing device is configured to provide, along with the displayed image and the overlay item projected onto the displayed image, navigation controls for moving backward, the computing device configured to provide in response to actuation of the navigation controls a selected image captured at a generally backward camera position having an orientation similar to that of the displayed image.
14. A spatially referenced photographic apparatus comprising:
- a data base containing images of objects and information corresponding to the images, the information defining a position at which a respective image was captured and at least a yaw orientation of the respective image with respect to the position, wherein the information is derived from a camera position and orientation information automatically recorded substantially simultaneously with recording of the respective image;
- a computing device in communication with the data base, the computing device configured to receive spatial movement commands and provide images for display in response to receiving the spatial movement commands via an interface;
- wherein the computing device is configured to provide an overlay item projected onto a displayed image based on at least one of the respective images from the data base, the overlay item comprising an indication of a point or area of interest for the displayed image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item; and
- wherein the computing device is configured to provide, along with the displayed image and the overlay item projected onto the displayed image, navigation controls for rotating left or right, the computing device configured to provide in response to actuation of the navigation controls for rotating left or right an image captured at a generally left-rotated or right-rotated camera position having a position forward, backward, and side-to-side similar to that of the displayed image.
15. A spatially referenced photographic apparatus comprising:
- a data base containing images of objects and information corresponding to the images, the information defining a position at which a respective image was captured and at least a yaw orientation of the respective image with respect to the position, wherein the information is derived from a camera position and orientation information automatically recorded substantially simultaneously with recording of the respective image;
- a computing device in communication with the data base, the computing device configured to receive spatial movement commands and provide images for display in response to receiving the spatial movement commands via an interface;
- wherein the computing device is configured to provide an overlay item projected onto a displayed image based on at least one of the respective images from the data base, the overlay item comprising an indication of a point or area of interest for the displayed image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item; and
- wherein the displayed image with the overlay item projected onto the displayed image includes designations indicating left and right possible rotations corresponding to a general location of a viewpoint of an image captured at generally left-rotated or right-rotated camera positions.
16. The spatially referenced photographic apparatus of claim 15 wherein the computing device is configured to provide the image captured at generally left-rotated or right-rotated camera positions in response to receiving signals indicating selection of one of the designations.
17. A spatially referenced photographic apparatus comprising:
- a data base containing images of objects and information corresponding to the images, the information defining a position at which a respective image was captured and at least a yaw orientation of the respective image with respect to the position, wherein the information is derived from a camera position and orientation information automatically recorded substantially simultaneously with recording of the respective image;
- a computing device in communication with the data base, the computing device configured to receive spatial movement commands and provide images for display in response to receiving the spatial movement commands via an interface;
- wherein the computing device is configured to provide an overlay item projected onto a displayed image based on at least one of the respective images from the data base, the overlay item comprising an indication of a point or area of interest for the displayed image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item; and
- wherein the computing device is configured to provide, along with the displayed image and the overlay item projected onto the displayed image, navigation controls for moving forward and simultaneously rotating to the left or to the right, the computing device configured to provide in response to the actuation of the navigation controls an image captured at a generally forward camera position having an angular orientation rotated to the left or to the right of the displayed image.
18. The spatially referenced photographic apparatus of claim 17 wherein the computing device is configured to provide multiple images aligned in a panorama to provide a wider field of view.
19. The spatially referenced photographic apparatus of claim 17 wherein the computing device is configured to provide a plan view overlay item projected as a region onto the displayed image, the plan view overlay item comprising an indication of a point or area of interest on the plan view with respect to which one or more associated images are available, wherein the computing device is configured to provide a new image corresponding to the plan view overlay item in response to receiving signals indicating selection of the region.
20. The spatially referenced photographic apparatus of claim 17 wherein the computing device is configured to provide a new image corresponding to the overlay item in response to receiving signals indicating selection of the overlay item.
21. A spatially referenced photographic apparatus comprising:
- a data base containing plural images of objects and also containing information corresponding to said images defining the position at which each image was originally captured and at least the yaw orientation of the image with respect to that position, wherein recorded information from which said position and yaw orientation information may be determined was automatically recorded substantially simultaneously with the recording of the image;
- image presentation and navigation means for displaying the images to a user and for facilitating the user in navigating among said images by receiving spatial movement commands from the user, as indicated by said spatial movement commands;
- said image presentation and navigation means providing to the user, for display in combination with the images and an overlay item, navigation controls, in response to the actuation of which controls said presentation and navigation means selects an image captured at a position generally shifted from that of an image the user is currently viewing, as indicated by the user actuation of said controls; and
- said image presentation and navigation means providing the user an overlay item projected onto the image the user is currently viewing, the overlay item comprising an indication of a point or area of interest for the image with respect to which one or more associated images are available in response to receiving signals indicating selection of the overlay item.
3363332 | January 1968 | Wilson et al. |
3580996 | May 1971 | Maxey |
4023202 | May 10, 1977 | Louie et al. |
4084184 | April 11, 1978 | Crain |
4169666 | October 2, 1979 | Slater et al. |
4179823 | December 25, 1979 | Sullivan et al. |
4283766 | August 11, 1981 | Snyder et al. |
4343037 | August 3, 1982 | Bolton |
4373169 | February 8, 1983 | Burkam |
4449198 | May 15, 1984 | Kroon et al. |
4463380 | July 31, 1984 | Hooks, Jr. |
4484192 | November 20, 1984 | Seitz et al. |
4486775 | December 4, 1984 | Catlow |
4488249 | December 11, 1984 | Baker |
4495580 | January 22, 1985 | Keearns |
4561063 | December 24, 1985 | Craig et al. |
4628453 | December 9, 1986 | Kamejima et al. |
4645459 | February 24, 1987 | Graf et al. |
4682160 | July 21, 1987 | Beckwith, Jr. et al. |
4687326 | August 18, 1987 | Corby, Jr. |
4737921 | April 12, 1988 | Goldwasser et al. |
4751507 | June 14, 1988 | Hama et al. |
4752836 | June 21, 1988 | Blanton et al. |
4766541 | August 23, 1988 | Bleich et al. |
4791572 | December 13, 1988 | Green, III et al. |
4807158 | February 21, 1989 | Blanton et al. |
4829304 | May 9, 1989 | Baird |
4855820 | August 8, 1989 | Barbour |
4855822 | August 8, 1989 | Narendra et al. |
4857902 | August 15, 1989 | Naimark et al. |
4876651 | October 24, 1989 | Dawson et al. |
4891761 | January 2, 1990 | Gray et al. |
4910674 | March 20, 1990 | Lerche |
4939662 | July 3, 1990 | Nimura |
4939663 | July 3, 1990 | Baird |
4969036 | November 6, 1990 | Bhanu et al. |
4972319 | November 20, 1990 | Delorme |
4984179 | January 8, 1991 | Waldern |
4989151 | January 29, 1991 | Nuimura |
4992866 | February 12, 1991 | Morgan |
4994971 | February 19, 1991 | Poelstra |
5016007 | May 14, 1991 | Iihoshi et al. |
5023725 | June 11, 1991 | McCutchen |
5023798 | June 11, 1991 | Neukirchner et al. |
5049988 | September 17, 1991 | Sefton et al. |
5060162 | October 22, 1991 | Ueyama et al. |
5072396 | December 10, 1991 | Fitzpatrick et al. |
5073819 | December 17, 1991 | Gates et al. |
5075861 | December 24, 1991 | Hasson et al. |
5086396 | February 4, 1992 | Waruszewski et al. |
5089816 | February 18, 1992 | Holmes, Jr. |
5115398 | May 19, 1992 | De Jong |
5123088 | June 16, 1992 | Kasahara et al. |
5124938 | June 23, 1992 | Algrain |
5128874 | July 7, 1992 | Bhanu et al. |
5133050 | July 21, 1992 | George et al. |
5146212 | September 8, 1992 | Venolia |
5155683 | October 13, 1992 | Rahim |
5166878 | November 24, 1992 | Poelstra |
5177685 | January 5, 1993 | Davis et al. |
5182641 | January 26, 1993 | Diner et al. |
5187571 | February 16, 1993 | Braun et al. |
5189402 | February 23, 1993 | Naimark et al. |
5214615 | May 25, 1993 | Bauer |
5214757 | May 25, 1993 | Mauney et al. |
5227985 | July 13, 1993 | DeMenthon |
5259037 | November 2, 1993 | Plunk |
5262867 | November 16, 1993 | Kojima |
5265025 | November 23, 1993 | Hirata |
5267042 | November 30, 1993 | Tsuchiya et al. |
5268998 | December 7, 1993 | Simpson |
5270694 | December 14, 1993 | Naimark et al. |
5274387 | December 28, 1993 | Kakihara et al. |
5299300 | March 29, 1994 | Femal et al. |
5322441 | June 21, 1994 | Lewis et al. |
5325472 | June 28, 1994 | Horiuchi et al. |
5327233 | July 5, 1994 | Choi |
5335072 | August 2, 1994 | Tanaka et al. |
5381338 | January 10, 1995 | Wysocki et al. |
5384588 | January 24, 1995 | Martin et al. |
5388990 | February 14, 1995 | Beckman |
5392225 | February 21, 1995 | Ward |
5396583 | March 7, 1995 | Chen et al. |
5490075 | February 6, 1996 | Howard et al. |
5495576 | February 27, 1996 | Ritchey |
5512941 | April 30, 1996 | Takahashi et al. |
5523783 | June 4, 1996 | Cho |
5530650 | June 25, 1996 | Biferno et al. |
5555018 | September 10, 1996 | Von Braun |
5559707 | September 24, 1996 | DeLorme et al. |
5563650 | October 8, 1996 | Poelstra |
5568152 | October 22, 1996 | Janky et al. |
5598209 | January 28, 1997 | Cortjens et al. |
5600368 | February 4, 1997 | Matthews |
5601353 | February 11, 1997 | Naimark et al. |
5602564 | February 11, 1997 | Iwamura et al. |
5633946 | May 27, 1997 | Lachinski et al. |
5636036 | June 3, 1997 | Ashbey |
5642285 | June 24, 1997 | Woo et al. |
5644694 | July 1, 1997 | Appleton |
5645077 | July 8, 1997 | Foxlin |
5682332 | October 28, 1997 | Ellenby et al. |
5684943 | November 4, 1997 | Abraham et al. |
5689611 | November 18, 1997 | Ohta et al. |
5703604 | December 30, 1997 | McCutchen |
5729471 | March 17, 1998 | Jain et al. |
5739848 | April 14, 1998 | Shimoura et al. |
5751578 | May 12, 1998 | Quinn et al. |
5764276 | June 9, 1998 | Martin et al. |
5768640 | June 16, 1998 | Takahashi et al. |
5793367 | August 11, 1998 | Taguchi |
5794216 | August 11, 1998 | Brown |
5815411 | September 29, 1998 | Ellenby et al. |
5838906 | November 17, 1998 | Doyle et al. |
5850352 | December 15, 1998 | Moezzi et al. |
5854843 | December 29, 1998 | Jacknin et al. |
5881321 | March 9, 1999 | Kivolowitz |
5897223 | April 27, 1999 | Tritchew et al. |
5913078 | June 15, 1999 | Kimura et al. |
5937096 | August 10, 1999 | Kawai |
6006126 | December 21, 1999 | Cosman |
6011585 | January 4, 2000 | Anderson |
6037936 | March 14, 2000 | Ellenby et al. |
6040824 | March 21, 2000 | Maekawa et al. |
6064355 | May 16, 2000 | Donahue |
6133947 | October 17, 2000 | Mikuni |
6141034 | October 31, 2000 | McCutchen |
6195122 | February 27, 2001 | Vincent |
6233004 | May 15, 2001 | Tanaka et al. |
6282362 | August 28, 2001 | Murphy et al. |
6292215 | September 18, 2001 | Vincent |
6449011 | September 10, 2002 | Muramatsu et al. |
6680746 | January 20, 2004 | Kawai et al. |
6768563 | July 27, 2004 | Murata et al. |
7050102 | May 23, 2006 | Vincent |
7849393 | December 7, 2010 | Henricks et al. |
RE42289 | April 12, 2011 | Vincent |
20010026318 | October 4, 2001 | Yonezawa et al. |
20020067412 | June 6, 2002 | Kawai |
20100250120 | September 30, 2010 | Waupotitsch et al. |
20100302280 | December 2, 2010 | Szeliski et al. |
20100325589 | December 23, 2010 | Ofek et al. |
2058877 | August 1992 | CA |
63164782 | July 1988 | JP |
03072309 | March 1991 | JP |
4336091 | November 1992 | JP |
5048964 | February 1993 | JP |
2004173083 | June 2004 | JP |
9016131 | December 1990 | WO |
9300647 | January 1993 | WO |
- Vu, Ngoc-Yen; PCT Search Report, dated May 15, 1996, PCT No. PCT/US96/01434; 1 page.
- Vu, N.; PCT Examination Report, dated Mar. 10, 1997; PCT No. PCT/US96/01434; 6 pages.
- Non-Final Office Action dated Apr. 14, 2010; U.S. Appl. No. 12/126,664; 20 pages.
- Radatz, J.; “gimbal”, The IEEE Standard Dictionary of Electrical and Electronic Terms; 1997, New York, NY; IEEE Standards Office, Sixth Edition, p. 454.
- Mohl, R.; “Development and Implementation of the Movie Map,” Chapters 3-6, pp. 87-227. 1981.
- Fisher, S.S. et al.; “Virtual Environment Display System”, Interactive 3D Graphics, Oct. 23-24, pp. 77-87, 1986.
- Fisher, S. S.; “Viewpoint Dependent Imaging: An Interactive Stereoscopic Display”; Department of Architecture; Oct. 8, 1981. pp. 1-77.
- Mohl, R.; “Development and Implementation of the Movie Map,” Chapters 1 and 2 (pp. 1 to 86, particularly the overview description presented on p. 9 and the system configuration description presented on pp. 56-81) of “Cognitive space in the interactive movie map: an investigation of spatial learning in virtual environments.” M.I.T. Ph. D. Thesis (Dept. of Architecture, MA Institute of Technology, Cambridge, MA, 1982). (Downloadable from MIT Libraries DSpace Citable URI: http://hdl.handle.net/1721.1/15702).
- Wilson, K.S., “The Palenque Optical Disc Prototype: Design of Multimedia Experiences for Education and Entertainment in a Nontraditional Learning Context. Technical Report No. 44”, Center for Children and Technology, Bank Street College of Education, May 1987, pp. 1-15, 19 pages.
- Ackermann, F., “On the Status and Accuracy Performance of GPS Photogrammetry”, University of Stuttgart, Institute for Photogrammetry, 1994, pp. 80-90, Stuttgart, Germany, 11 pages.
- Arons, B., “MIT's Sampler Disc of Disc Techniques”, Educational and Industrial Television, Jun. 1984, pp. 36-40, vol. 16, No. 6, 4 pages.
- Aukstakalnis, S. et. al., “Silicon Mirage: The Art and Science of Virtual Reality”, 1992, Part III, pp. 183-208, Peachpit Press, Inc., Berkeley, United States of America, 33 pages.
- Bao-Zong, Y. et. al., “Tutorial: Computer Vision-Towards a Three-Dimensional World”, Engineering Applications of Artificial Intelligence, Jun. 1989, pp. 94-108, vol. 2, 15 pages.
- Blinn, J., “Where Am I? What Am I Looking At?”, Jim Blinn's Corner, IEEE Computer Graphics and Applications, Jul. 1988, pp. 76-81, 6 pages.
- Bove, Jr., V. M., “Pictorial Applications for Range Sensing Cameras”, SPIE Image Processing, Analysis, Measurement, and Quality, 1988, pp. 10-17, vol. 901, 8 pages.
- Brooks, Jr., F.P., “Grasping Reality Through Illusion: Interactive Graphics Serving Science”, University of North Carolina, Department of Computer Science, Mar. 1988, pp. 1-13, Chapel Hill, United States of America, 15 pages.
- Brooks, T. L. et. al., “Operator Vision Aids for Telerobotic Assembly and Servicing in Space”, Proceedings of the 1992 IEEE International Conference on Robotics and Automation, May 1992, pp. 886-891, Nice, France, 6 pages.
- Stoker, C.R. et al, “Antarctic Undersea Exploration Using a Robotic Submarine with a Telepresence User Interface”, Environmental Applications of AI, IEEE Expert, Dec. 1995, pp. 14-23, 10 pages.
- Teodosio, L. A. and Mills, M., “Panoramic Overviews for Navigating Real-World Scenes”, Massachusetts Institute of Technology and Apple Computer, 1993, 6 pages.
- Dixon, D., “DVI Pilot Applications”, Manifest Technology: Making Sense of Digital Media Technology, Copyright 1999-2012, 5 pages.
- Drucker, S. M. et. al., “Intelligent Camera Control in a Virtual Environment”, Proceedings of Graphics Interface '94, 1994, pp. 190-199, 10 pages.
- Drucker, S. M., “Intelligent Camera Control for Graphical Environments”, Program in Media Arts and Sciences, School of Architecture and Planning, Massachusetts Institute of Technology, Jun. 1994, pp. 1-207, Cambridge, United States of America, 207 pages.
- El-Sheimy, N. et. al., “Kinematic Positioning in Three Dimensions Using CCD Technology”, Vehicle Navigation and Information Systems Conference, 1993, pp. 472-475, Ottawa, Canada, 4 pages.
- El-Sheimy, N., “A GPS/INS Aided Video Camera System for Rapid GIS Surveys in Urban Centers”, 7th International Technical Meeting of the Satellite Division of the Institute of Navigation, Sep. 1994, pp. 1357-1366, Salt Lake City, United States of America, 11 pages.
- Fisher, S.S., “Viewpoint Dependent Imaging: An Interactive Stereoscopic Display”, Massachusetts Institute of Technology, 1982, pp. 1-77, 77 pages.
- Fitzmaurice, G.W., et al., “Virtual Reality for Palmtop Computers”, ACM Transactions on Information Systems, Jul. 1993, pp. 197-218, vol. 11, No. 3, 22 pages.
- Fuller, B. B. et. al., “An Overview of the Magic Project”, The MITRE Corporation, Dec. 1993, pp. 1-8, 14 pages.
- Glenn, S., “Real Fun, Virtually: Virtual Experience Amusements and Products in Public Space Entertainment”, SimGraphics Engineering Corporation, 1992, pp. 77-83, South Pasadena, United States of America, 7 pages.
- Hassfeld, S. et. al., “Intraoperative Navigation in Oral and Maxillofacial Surgery”, International Journal of Oral and Maxillofacial Surgery, 1995, pp. 111-119, vol. 24, Denmark, 9 pages.
- Hine III, B.P., et. al., “The Application of Telepresence and Virtual Reality to Subsea Exploration”, The 2nd Workshop on Mobile Robots for Subsea Environments, Proc. ROV'94, May 1994, 11 pages.
- Hitchner, L. E., “The NASA Ames Virtual Planetary Exploration Testbed”, Wescon Conference Record, Nov. 1992, pp. 1-6, Anaheim, United States of America, 6 pages.
- Hitchner, L. E., “Virtual Planetary Exploration: A Very Large Virtual Environment”, Tutorial on Implementing Immersive Virtual Environments, 1992, pp. 6.1-6.16, 16 pages.
- Holloway, R. et. al., “Virtual-Worlds Research at the University of North Carolina at Chapel Hill”, Proceedings of Computer Graphics '91, Nov. 1991, pp. 181-196, London, England, 15 pages.
- Ince, I. et. al., “Virtuality and Reality: A Video/Graphics Environment for Teleoperation”, STX Robotics, 1991, pp. 1083-1089, 7 pages.
- Iwata, H., “Force Display for Virtual Worlds”, Institute of Engineering Mechanics, University of Tsukuba, 1992, pp. 111-116, Tsukuba, Japan, 6 pages.
- Laurel, B. et. al., “Interface and New Interactive Systems”, 1990, pp. 2-1 to 2-13, Transcript, 13 pages.
- Leclerc, Y. G. et. al., “TerraVision: A Terrain Visualization System”, AI Center, SRI International, Apr. 22, 1994, pp. 1-20, 20 pages.
- Lippman, A., “Movie-Maps: An Application of the Optical Videodisc to Computer Graphics”, Architecture Machine Group, Massachusetts Institute of Technology, 1980, pp. 32-42, Cambridge, United States of America, 11 pages.
- McKeown, Jr., D.M. et al., “Research in Automated Analysis of Remotely Sensed Imagery”, Image Understanding Workshop: Proceedings of a Workshop held in Washington, D.C., Apr. 1993, pp. 231-251, Morgan Kaufmann Publishers, Inc., San Mateo, United States of America, 30 pages.
- Miller, G. et. al., “The Virtual Museum: Interactive 3D Navigation of a Multimedia Database”, The Journal of Visualization and Computer Animation, 1992, pp. 183-197, vol. 3, No. 3, 16 pages.
- Mohl, R., “Cognitive Space in the Interactive Movie Map: An Investigation of Spatial Learning in Virtual Environments”, Department of Architecture, Massachusetts Institute of Technology, 1981, pp. 1-226, 232 pages.
- Naimark, M., “Elements of Realspace Imaging: A Proposed Taxonomy”, Electronic Imaging Proceeding, 1991, pp. 1-10, vol. 1457, San Jose, United States of America, 10 pages.
- Naimark, M., “Magic Windows, Magic Glasses, and Magic Doors: Experiencing Place via Media”, Artshow catalog, Kanagawa International Art and Science Exhibition, Oct. 1989, pp. 1-2, Kanagawa, Japan, 2 pages.
- Naimark, M., “Moviemap Basics”, MultiMediale 2 catalog, Zentrum fur Kunst and Medientechnologie, 1991, p. 2, Karlsruhe, Germany, 1 page.
- Naimark, M., “Spatial Correspondence in Motion Picture Display”, Optics in Entertainment II, 1984, pp. 78-81, vol. 462, 4 pages.
- Naimark, M., “The Optical Videodisc and New Media Forms”, Video 80, San Francisco International Film Festival, 1982, pp. 24-25, 2 pages.
- Naimark, M., “VBK—A Moviemap of Karlsruhe” Mediated Reality and the Consciousness of Place, Tomorrow's Realities catalog, Siggraph, 1991, p. 2, Las Vegas, United States of America, 1 page.
- Negroponte, N., “The Impact of Optical Videodiscs on Filmmaking”, Massachusetts Institute of Technology, Jun. 1979, pp. 1-29, 32 pages.
- Nitao, J. J., et. al., “Computer Modelling: A Structured Light Vision System for a Mars Rover”, SPIE—The International Society for Optical Engineering, Proceedings: Mobile Robots IV, Nov. 1989, pp. 168-177, vol. 1195, Philadelphia, United States of America, 11 pages.
- Novak, K., “Application of Digital Cameras and GPS for Aerial Photogrammetric Mapping”, Department of Geodetic Science and Surveying, Center for Mapping, The Ohio State University, 1992, pp. 5-9, 5 pages.
- Pausch, R., “Virtual Reality on Five Dollars a Day”, ACM, 1991, pp. 265-270, 6 pages.
- Ripley, G.D., “DVI—A Digital Multimedia Technology”, Journal of Computing in Higher Education, 1990, pp. 74-103, vol. I (2), 30 pages.
- Schwarz, K. P. et. al., “VIASAT—A Mobile Highway Survey System of High Accuracy”, IEE Vehicle Navigation and Information Systems Conference, 1993, pp. 476-481, Ottawa, Canada, 6 pages.
- Sequeira, V. et. al, “3D Environment Modelling Using Laser Range Sensing”, Robotics and Autonomous Systems, 1995, pp. 81-91, vol. 16, 11 pages.
- Shiffer, M. J., “A Geographically-Based Multimedia Approach to City Planning”, ACM, CHI94 Conference Companion, 1994, pp. 265-266, Boston, United States of America, 2 pages.
- Shiffer, M. J., “A Hypermedia Implementation of a Collaborative Planning System”, University of Illinois at Urbana-Champaign, 1992, pp. 1-188, Urbana, United States of America, 202 pages.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; 19 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit A; 10 pgs.
- Teodosio, L. A., “Salient Stills”, Media Arts and Science Section, School of Architecture and Planning, Massachusetts Institute of Technology, Jun. 1992, pp. 1-71, Cambridge, United States of America, 71 pages.
- Thorman, C.P., “Using Video as Textural Input to a Computer Graphic Database”, Massachusetts Institute of Technology, Dec. 1988, pp. 1-20, 25 pages.
- Vanderburgh, J.C., “Space Modeler: An Expanded, Distributed, Virtual Environment for Space Visualization”, Air Force Institute of Technology, Dec. 1994, pp. 1-79, 93 pages.
- Ware, C. et. al., “Exploration and Virtual Camera Control in Virtual Three Dimensional Environments”, ACM, 1990, pp. 175-183, 9 pages.
- Wilson, K. G., “Synthetic BattleBridge: Information Visualization and User Interface Design Applications in a Large Virtual Reality Environment”, Air Force Institute of Technology, Dec. 1993, pp. 1-85, 95 pages.
- Wilson, K. S. et. al., “The Palenque Project: A Process of Design and Development as Research in the Evolution of an Optical Disc Prototype for Children. Technical Report No. 47”, Center for Children and Technology, Bank Street College of Education, Dec. 1987, pp. 1-12, 15 pages.
- Wilson, K. S., “Palenque: An Interactive Multimedia Digital Video Interactive Prototype for Children”, Center for Children and Technology, Bank Street College of Education, 1988, pp. 275-279, 5 pages.
- Wilson, K. S., “The Palenque Optical Disc Prototype: The Design of a Multimedia Discovery Based Experience for Children”, Children's Environments Quarterly, 1988, pp. 7-13, vol. 5, No. 4, 7 pages.
- Wu, S. S. C., et. al.,“Photogrammetric Application of Viking Orbital Photography”, Planet Space Sci., 1982, pp. 45-55, vol. 30, No. 1, Pergamon Press Ltd., Great Britain, 10 pages.
- Young, D., Inventing Interactive, http://www.inventinginteractive.com/2010/03/18/a5pen-movie-map/. Printed on Oct. 25, 1922, 77 pages.
- Zhou, Q. et. al., “Development of a Multimedia Spatial Information System”, Invited and Presented Papers of the XVIIth Congress, Technical Commission II, Systems for Data Processing and Analysis, Aug. 1992, pp. 67-71, United States of America, 6 pages.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-1, 60 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-2; 66 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-3; 103 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-4; 83 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-5; 68 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-6; 68 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-7; 54 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-8; 65 pgs.
- Defendants' Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6), dated Feb. 29, 2012; Exhibit B-9; 83 pgs.
- Microsoft Corporation's Answer and Counterclaims to Transcenic, Inc.'s First Amended Complaint for Patent Infringement; filed Sep. 12, 2011; 11 pgs.
- Mapquest, Inc. and AOL Inc.'s Answer, Defenses, and Counterclaims to Transcenic, Inc.'s Complaint for Infringement; filed Sep. 12, 2011; 12 pgs.
- Complaint for Patent Infringement; filed Jul. 1, 2011; 97 pgs.
- Defendant Google Inc.'s Answer and Counterclaims to Transcenic, Inc.'s First Amended Complaint for Patent Infringement; filed Sep. 12, 2011; 13 pgs.
- First Amended Complaint for Patent Infringement; filed Aug. 24, 2011; 99 pgs.
- Notice of Service for Defendant's Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6) dated Feb. 29, 2012; 3 pgs.
- Aspen Interactive Movie Map from 222.youtube.com/watch?v=Hf6LkqgXPMUMeature=related; DEF0005847.
- Aspen Movie Map from www.youtube.com/watch?v=w18MyqszIYc; DEF0005848.
- Bank Street College of Education; Palenque Color Slides; DEF0005766-DEF0005835.
- Bank Street College; “Palenque Project—PANS”; Oct. 6, 1986; DEF0005143-DEF0005146.
- Beers, B.J.; “Frank 2: System for Photographical Registration for Analogue Picture Presentation and Map Making. Second Phase of Examination”; Jan. 1985; DEF0005851-DEF0005883.
- Beers, B.J.; “Frank—the design of a new landsurveying system using panoramic images”; 1985; DEF0005884-DEF0006042.
- Bogaerts, J.M.; Frank: System for Photographical Registration for Analogue Picture Presentation and Map Making; Mar. 1982; DEF0006296-DEF0006362.
- Brock, B.; “Computers in the Classroom, Palenque Model is an Innovative Videodisc Project, Developed by the Bank Street College in New York, has Added an Adventurous Edge to Learning”; Oct. 6, 1987; DEF0005139-DEF0005142.
- Carson, K. M.; “A Color Spatial Display Based on a Raster Framebuffer and Varifocal Mirror”; Feb. 1985; DEF0005034-DEF0005104.
- Charnley, D, et al.; “Surface Reconstruction from Outdoor Image Sequences”; 1988, DEF0005357-DEF0005362.
- Computer World, Magasin; 1987; DEF0004251-DEF0004260.
- Defendants' Fourth Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6); dated Jul. 13, 2012; 72 pgs.
- Defendants' Second Supplemental Response to Interrogatory No. 6; Exhibit A; dated Jun. 29, 2012; 20 pgs.
- Defendants' Second Supplemental Response to Interrogatory No. 6; Exhibit B-10; dated Jun. 29, 2012; 58 pgs.
- Defendants' Second Supplemental Response to Interrogatory No. 6; Exhibit B-11; dated Jun. 29, 2012; 32 pgs.
- Defendants' Second Supplemental Response to Interrogatory No. 6; Exhibit B-12; dated Jun. 29, 2012; 31 pgs.
- Defendants' Second Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6); dated Jun. 29, 2012; 36 pgs.
- Defendants' Third Supplemental Response to Interrogatory No. 6; Exhibit B-13; dated Jul. 6, 2012; 67 pgs.
- Defendants' Third Supplemental Response to Interrogatory No. 6; Exhibit B-14; dated Jul. 6, 2012; 64 pgs.
- Defendants' Third Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6); dated Jul. 6, 2012; 52 pgs.
- D'Ignazio, F.; “d'iversions, An Electronic Field Trip”; Compute!'s Gazette, d'iversion, Murder at Palenque; Sep. 1988; DEF0004261-DEF0004262.
- Discursions Architecture Machine Group—Video; DEF0005846.
- Dixon, D.F., et al.; Computer Graphics World; “DVI Video/Graphics”; Jul. 1987; DEF0004263-DEF0004266.
- Friedman, Gary L.; “The Trustworthy Digital Camera: Restoring Credibility to the Photographic Image”; Dec. 1992 to Nov. 1994, DEF0005371-DEF0005377.
- Haas; V.; “Frank-Enkele aspecten van de verwerking van waarnemingen gedaan met het uitwekingsinstrument; de meetmethodiek, instrumentele fouten en voorverweking van de waarnemingen”; 1988; DEF0006043-DEF0006181.
- Halfhill, T.R.; “Byte Magazine, See You Around”; May 1995; DEF0006823-DEF0006830.
- Hanson, et al.; “Overview of the SRI Cartographic Modeling Environment”; Jan. 1992; DEF0006888-DEF0006904.
- Harvard Gazette; “Multi-Media Computer Project Offers a Trip to Mayan Ruins”; Apr. 29, 1988; DEF0004292-DEF0004293.
- Information Zooms, pp. 57-66; DEF0005836-DEF0005845.
- Instructor, Special High-Tech Issue; 1987; DEF0004383-DEF0004419; 37 pgs.
- Interactive Media Manifestatie-International Conference Eindhoven, The Netherlands; Jun. 1987; DEF0004420-DEF0004426.
- Kunkel, P.; “Hyper Media”; Mar./Apr. 1989; DEF0004791-DEF0004793.
- LaserActive '87 Applications and Innovations; Sep. 29 to Oct. 2, 1987; DEF0004794-DEF0004826.
- Les Chemins Du Virtuel-Simulation infromatique et creation industrielle; 1989; DEF0004827-DEF0005018.
- Loveria, G., et al.; “MultiMedia: DVI Arrives”; Fall, 1990; DEF0005019-DEF0005022.
- Luther, A.C.; “IEEE Spectrum-DVI: how it works, You are there and in control”; pp. 45-50; Sep. 1988; DEF0005236-DEF0005254; 119 pgs.
- Manes, S.; “The Road to Respect”; Mar. 1989; DEF0005023-DEF0005033.
- McMillan, L., et al.; “Plenoptic Modeling: An Image-Based Rendering System, Department of Computer Science University of North Carolina at Chapel Hill”; Aug. 1995; DEF0005169-DEF0005176.
- Montgomery, H.; “Task Force Differential, In-Flight Differential, and Video Van Mapping”; Feb. 1994; DEF0006498-DEF0006502.
- Naimark, M.; “VBK—A Moviemap of Karlsruhe” Mediated Reality and the Consciousness of Place; Tomorrow's Realities catalog, Siggraph '91, Las Vegas; DEF0005105-DEF0005107.
- National Association of Home Builders; “Virtual Reality, Masco Corporation's Walk-through Software”; May 1996, DEF0005626-DEF0005631.
- National Convention Center, Boston, MA, Jun. 20-22, 1989; “National Educational Computing Conference—Final Program”; DEF0005108-DEF0005128.
- Palenque Handwritten Notes; DEF0004267-DEF0004291.
- Palenque Log Sheet—Team II; dated Apr. 7, 1986; DEF0005129-DEF0005138.
- Palenque Photographs; DEF0005759-DEF0005760.
- Palenque Photographs; DEF0005761-DEF0005762.
- Palenque Photographs; DEF0005763-DEF0005765.
- Palenque Route Numbers; DEF0005187-DEF0005198.
- Palenque: Optical Disc Prototype Video; DEF0005849.
- Perry, T.S.; Science Observer—A New World of Viewer-Controlled Video; Mar. 1989; DEF0005199-DEF0005201.
- Blaho, G.; Field Experiences with Fully Digital Mobile Stereo Image Acquisition System; May 24-26, 1995; DEF0007551-DEF0007560.
- Bossler, et al.; Accuracies Obtained by the GPSVan; Nov. 14-16, 1995; DEF0007498-DEF0007507.
- Bossler, et al.; GPS and GIS Map the Nation's Highways; Mar. 1991; DEF0007473-DEF0007484.
- Bossler, et al.; Mobile Mapping Systems: New Tools for the Fast Collection of GIS Information; Mar. 23-25, 1993; DEF0007485-DEF0007497.
- Defendants' Fifth Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6); dated Oct. 24, 2012; 90 pgs.
- Defendants' Fifth Supplemental Response to Plaintiff's First Set of Interrogatories (No. 6); dated Oct. 24, 2012; Exhibit B-15; 63 pgs.
- He, et al.; Automatic Analysis of Highway Features from Digital Stereo-Images; 1992; DEF0007508-DEF0007515.
- He, et al.; The Accuracy of Features Positioned with the GPSVan; Jun. 6-10, 1994; DEF0007516-DEF0007524.
- Proceedings 1995 Mobile Mapping Symposium, The Ohio State University Center for Mapping and Geodetic Science and Surveying; May 24-26, 1995; DEF0007525-DEF0007530.
- The Ohio State University Center for the Commercial Development of Space: Real-Time Mapping Satellite Mapping Annual Report 1991; 1991; DEF0007735-DEF0007774.
- The Ohio State University Center for the Commercial Development of Space: Real-Time Mapping Satellite Mapping—Annual Report 1992; 1992; DEF0007617-DEF0007654.
- The Ohio State University Center for the Commercial Development of Space: Real-Time Mapping Satellite Mapping—Annual Report 1993; 1993; DEF0007655-DEF0007680.
- The Ohio State University Center for the Commercial Development of Space: Real-Time Mapping Satellite Mapping—Annual Report 1994; 1994; DEF0007681-DEF0007709.
- The Ohio State University Center for the Commercial Development of Space: Real-Time Mapping Satellite Mapping—Annual Report 1995; 1995; DEF0007710-DEF0007734.
- Toth, C.; A Conceptual Approach to Imaging for Mobile Mapping; May 24-26, 1995; DEF0007540-DEF0007550.
- Toth, C.; Experiences with a Fully Digital Image Acquisition System; Feb. 27, 1995; DEF0007531-DEF0007539.
- Bove, Jr., M.V.; “Pictorial Applications For Range Sensing Cameras”; Proceedings, SPIE vol. 901; Image Processing, Analysis, Measurement, and Quality; Los Angeles, CA, Jan. 13-15, 1988; DEF0007830-DEF0007840; 11 pages.
- Discursions Architecture Machine Group; Scitex, MIT 1983; DEF0007781-DEF0007782; 2 pages.
- El-Sheimy, N.; “A GPS/INS Aided Video Camera System for Rapid GIS Surveys in Urban Centers”; The Institute of Navigation, Proceedings of ION GPS-94; 7th International Technical Meeting of the Satellite Division of the Institute of Navigation; Part 2, Salt Palace Convention Center, Salt Lake City, Utah; Sep. 20-23, 1994; DEF0006761-DEF0006771; 11 pgs.
- Friedman, G. L.; “The Trustworthy Digital Camera: Restoring Credibility to the Photographic Image,” IEEE Transactions on Consumer Electronics, Nov. 1993, vol. 39, No. 4; DEF0007861-DEF0007868; 8 pages.
- Glenn, S.; “Real Fun, Virtually: Virtual Experience Amusements and Products in Public Space Entertainment”; Beyond The Vision, The Technology, Research, and Business of Virtual Reality; Proceedings of Virtual Reality '91, The Second Annual Conference on Virtual Reality, Artificial Reality, and Cyberspace; San Francisco, Sep. 23-25, 1991; DEF0007841-DEF0007850; 10 pages.
- Goad, C.C.; “The Ohio State University Highway Mapping Project: The Positioning Component”; Department of Geodetic Science and Surveying, The Ohio State University; DEF0008553-DEF0008556; 4 pages.
- He, G., et al.; “Spatial Data Collection With The GPSVan Mobile Mapping System”; ISPRS International Society for Photogrammetry and Remote Sensing, vol. 30, Part 4; Proceedings of the Symposium Mapping and Geographic Information Systems; May 31-Jun. 3, 1994, Athens, GA; DEF0008470-DEF0008479; 10 pages.
- Ince, I, et al.; “Virtuality and Reality: A Video/Graphics Environment for Teleoperation”; Conference Proceedings, 1991 IEEE International Conference on Systems, Man, and Cybernetics; Oct. 13-16, 1991, vol. 2, School of Engineering and Applied Science; DEF0007878-DEF0007886; 9 pages.
- Loveria, G., et al.; “MultiMedia: DVI Arrives”; Byte IBM Special Edition; Seventh Annual Extra All-IBM Edition; “Guideposts for the 90s”; vol. 15, No. 11, Fall, 1990; DEF0007785-DEF0007790; 6 pages.
- McDonald, N.H., et al.; “Video Graphic Query Facility Database Design”; Department of Computer Science and Engineering; University of South Florida, Tampa, Florida 33620; 1981; DEF0008568-DEF0008577; 10 pages.
- McKeown, Jr., D. M., et al.; “Research in Automated Analysis of Remotely Sensed Imagery”; Proceedings: Image Understanding Workshop; Defense Advanced Research Projects Agency, Apr. 1993, DEF0006831-DEF0006860; 30 pages.
- Naimark, M.; “Elements of Realspace Imaging: A Proposed Taxonomy”; Proceedings SPIE—The International Society for Optical Engineering; Stereoscopic Displays and Applications II; vol. 1457; San Jose, CA; Feb. 25-27, 1991; DEF0007791-DEF0007804; 14 pp.
- Nitao, J. J., et al.; “Computer modelling: a structured light vision system for a Mars rover”; Proceedings, Mobile Robots IV; Nov. 6-7, 1989, Philadelphia, PA; SPIE vol. 1195; DEF0006877-DEF0006887; 11 pages.
- Novak, K.; “Application of Digital Cameras and GPS For Aerial Photogrammetric Mapping”; ISPRS, Washington, D.C., 1992; International Archives of Photogrammetry and Remote Sensing; vol. XXIX, Part B4; DEF0008527-DEF0008540; 14 pages.
- Schwarz, K. P., et al.; “VIASAT—A Mobile Highway Survey System of High Accuracy”; Proceedings of the IEEE-IEE Vehicle Navigation and Information Systems Conference; Ottawa, Ontario, Oct. 12-15, 1993; DEF0007869-DEF0007877; 9 pages.
- Teodosio, L. A.; “Panoramic Overviews for Navigating Real-World Scenes”; Proceedings ACM Multimedia 93, Anaheim, CA, Aug. 1-6, 1993; DEF0007822-DEF0007829; 8 pages.
- The Center for Mapping, The Ohio State University, Columbus, OH; Dec. 1, 1991; “The GPS/Imaging/GIS Project”; Application of the Global Positioning System for Transportation Planning: A Multi-State Project; DEF0008244-DEF0008460; 214 pages.
- Tomasi, C., et al.; “Shape and Motion From Image Streams: A Factorization Method”; Proceedings of the National Academy of Sciences of the United States of America; Nov. 1, 1993, vol. 90, No. 21; DEF0007851-DEF0007860; 10 pages.
- Toth, C.K.; “Imaging Component of Mobile Mapping Systems”, brochure; Center for Mapping; The Ohio State University; DEF0008494-DEF0008526; 33 pages.
- Ware, C., et al.; “Exploration and Virtual Camera Control in Virtual Three Dimensional Environments”; Proceedings 1990 Symposium on Interactive 3D Graphics; Snowbird, Utah, Mar. 25-28, 1990; DEF0007805-DEF0007816; 12 pages.
- Zhou, Q., et al.; “Development of a Multimedia Spatial Information System”; ISPRS, Washington, D.C., 1992; International Archives of Photogrammetry and Remote Sensing; vol. XXIX, Part B2; DEF0008541-DEF0008552; 12 pages.
- Poelman, C.J., et al.; “A Paraperspective Factorization Method for Shape and Motion Recovery”; Oct. 29, 1992; DEF0004181-DEF0004210.
- Poelstra, T.J.; “The Frank System: First Results”; 1993, DEF0006473-DEF0006483.
- Ressler, S.; “Approaches Using Virtual Environments with Mosaic”; Dec. 22, 1994; DEF0006484-DEF0006491.
- Robinett; W.; “Synthetic Experience: A Taxonomy, Survey of Earlier Thought, and Speculations on the Future”; DEF0006905-DEF0006934.
- San Francisco Walking Tour Video; 1994; DEF0005850.
- Schachter, B.J.; “Computer Image Generation”; 1983; DEF0006503-DEF0006750.
- Shenchang, E.; “QuickTime VR—An Image-Based Approach to Virtual Environment Navigation”; 1995; DEF0005177-DEF0005186.
- Shiffer, M.; Augmenting Geographic Information with Collaborative Multimedia Technologies; Nov. 1989; DEF0006935-DEF0006952.
- Smith, K.W.; “Verkenningsberekeningen Met Het Frank-System”; Sep. 1991; DEF0006363-DEF0006472.
- Stanton, D.; Fire Drill Couldn't Chase Us Away From NECC's Multimedia Demos; Oct. 1989; DEF0005355- DEF0005356.
- Szeliski, R.; “Image Alignment and Stitching: A Tutorial”; Dec. 10, 2006; DEF0004294-DEF0004382.
- Teaching and Research—Using Technology Tools—1988 OIT Colloquium Series; DEF0005366-DEF0005370.
- The Aspen Institute Chronicle, “Enhancing the Social Benefits of New Electronic Technologies”; vol. 2, No. 4; Winter, 1988; DEF0004233-DEF0004238.
- The Development of the Frank Surveying System and the Frank Image Viewing System; Jul. 1993; DEF0006290-DEF0006295.
- The Frank Image Retrieval System; DEF0006288-DEF0006289.
- Tomasi, C., et al.; “Shape and Motion from Image Streams: a Factorization Method—Full Report on the Orthographic Case”; Mar. 1992; DEF0005202-DEF0005235.
- Tsai, R.Y.; “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses”; Aug. 1987; DEF0004211-DEF0004232.
- Vincent, L.; “Taking Online Maps Down to Street Level”; Dec. 2007; DEF0005363-DEF0005395.
- Wilson, K., et al.; “Palenque Project Production Plan”; Mar. 24, 1986; DEF0005147-DEF0005168.
- Wilson, K.; “The Palenque Design: Children's Discovery Learning Experiences in an Interactive Multimedia Environment”; 1988; DEF0004436-DEF0004695.
- Wilson, K.S.; “Bank Street College of Education for Children and Technology, Palenque: A Multimedia Video Interactive Prototype for Children”; Jun. 1990; DEF0005632-DEF0005645.
- Wilson, K.S.; “Palenque Project Design”; Mar. 1986; DEF0005646-DEF0005715.
- Wilson, K.S.; “The Palenque Optical Disc Prototype: Design of Multimedia Experiences for Education and Entertainment in a Nontraditional Learning Context”; 1988; DEF0003611-DEF0003629.
- Wolters, K.; “Beeldmatching Tijends Het Frank-Meetproces”; 1994; DEF0006182-DEF0006287.
- Appendix to Plaintiff Transcenic, Inc.'s Claim Construction Opening Brief dated Sep. 6, 2012; 96 pages.
- Appendix to Plaintiff Transcenic, Inc.'s Claim Construction Answering Brief dated Oct. 5, 2012; 490 pages.
- Declaration of Anne Shea Gaza in Support of Defendants' Opening Claim Construction Brief dated Sep. 6, 2012; 23 pages.
- Declaration of Dr. Chandrajit L. Bajaj dated Sep. 6, 2012; 218 pages.
- Declaration of Jason J. Rawnsley in Support of Defendants' Responsive Claim Construction Brief dated Oct. 5, 2012; 21 pages.
- Declaration of Joseph L. Mundy on Behalf of Defendants dated Oct. 5, 2012; 105 pages.
- Defendants' Opening Claim Construction Brief dated Sep. 6, 2012; 39 pages.
- Defendants' Responsive Claim Construction Brief dated Oct. 5, 2012; 61 pages.
- Joint Claim Construction Chart filed Jul. 30, 2012; 254 pages.
- Plaintiff Transcenic, Inc.'s First Supplemental Responses and Objections to Defendants' Second Set of Common Interrogatories (No. 4) including exhibits, dated Jan. 20, 2013; 499 pages.
- Transcenic's Amended Claim Construction Chart dated Sep. 6, 2012; 16 pages.
- Transcenic's Claim Construction Answering Brief dated Oct. 5, 2012; 30 pages.
- Transcenic's Claim Construction Opening Brief dated Sep. 6, 2012; 37 pages.
- Plaintiff Transcenic, Inc.'s Responses and Objections to Defendants' Second Set of Common Interrogatories (No. 4); dated May 18, 2012; 1459 pgs.
- Order for Memorandum Opinion issued Sep. 17, 2013, in litigation case No. 11-cv-582 pending in the U.S. District Court for the District of Delaware; 5 pages.
- Memorandum Opinion regarding claim construction for U.S. Pat. No. RE42,289, dated Sep. 17, 2013, in litigation case No. 11-cv-582 pending in the U.S. District Court for the District of Delaware; 19 pages.
- Jane Radatz, “gimbal”, The IEEE Standard Dictionary of Electrical and Electronic Terms, 1997, New York, NY: IEEE Standards Office, Sixth Edition, p. 454.
Type: Grant
Filed: Apr 11, 2011
Date of Patent: Jun 3, 2014
Assignee: Transcenic, Inc. (Lake Charles, LA)
Inventor: Robert S. Vincent (Wakefield, MA)
Primary Examiner: Twyler Haskins
Assistant Examiner: Carramah J Quiett
Application Number: 13/084,087
International Classification: H04N 5/222 (20060101); H04N 5/232 (20060101); H04N 5/77 (20060101); H04N 5/228 (20060101);