Processor aided firing of small arms


A digital processor aiming and firing system generates a trigger signal with electronic timing exactness, resulting in shooting accuracy unobtainable by humans. To achieve this, a view down the barrel sight is captured by a digital video camera and analyzed on a frame-by-frame basis by an electronic processor equipped with image identification software. Motion detectors attached to the weapon are used to interpolate the barrel position between frames. A motion history of the barrel position relative to the target is calculated and an extrapolation of the future position is made. When the anticipated barrel direction impinges on the target, corrected for motion and ballistic effects, the processor signals the launch of the projectile.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. provisional application Ser. No. 60/502,693 filed Sep. 12, 2003, which is incorporated herein by reference and made a part hereof.

FEDERALLY SPONSORED RESEARCH

The U.S. Government has a paid-up license in this invention and the right in limited circumstances to require the patent owner to license others on reasonable terms as provided for by the terms of DMB07-03-D-B009 awarded by the Department of Defense.

BACKGROUND OF THE INVENTION

1. Field of Invention

This invention relates to firearms, and more particularly to aiming, triggering, and fire control thereof.

2. Description of Related Art

The ability to aim a firearm and hit a desired target is a challenge that has existed since the advent of modern warfare. With the aiming of conventional firearms, the shooter must align the barrel with the target, or lead the target, while simultaneously pulling the trigger. This act takes considerable practice to master and separates sharpshooters from neophytes.

In the past, numerous innovations have been made in the art of fire control and targeting. Many of the inventions pertain to a weapon that is attached to a stable or quasi-stable mounting such as a moving vehicle. In U.S. Pat. No. 4,787,291, Frohock, Jr., teaches how a system of gyros, servo motors, and a tracking sub-system can be used to pivot a gun to track and fire upon its target. In a similar fashion, U.S. Pat. No. 4,004,729 to Rawicz et al.; U.S. Pat. No. 3,840,794 to Clement et al.; U.S. Pat. No. 3,766,826 to Salomonsson; U.S. Pat. No. 5,686,690 to Lougheed et al.; and U.S. Pat. No. 3,575,085 to McAdam, Jr. all present tracking systems where the barrel is moved about on larger platforms.

In U.S. Pat. No. 5,966,859, Samuels presents a scheme for automatic firing based on an “electromagnetic signature”, e.g. a thermal detector that distinguishes human and animal targets from their surroundings. In a like fashion, U.S. Pat. No. 5,544,439, Grember, et al., teaches that an infrared detector can be integrated into a weapon and used to activate the trigger. In both of these approaches, there is a limitation in that the target must emit specific and distinctive wavelengths, which must be distinguishable from the target background.

In U.S. Pat. No. 3,659,494, Philbrick, et al. presents a targeting system where the “dancing” motion of a target observed while viewing is attenuated. A pair of gyros (not to be confused with the gyrator in the present invention) are used to drive deflection circuitry and shift the image that is focused on a photosensitive detector. The principle is extended to targeting where a signal indicating target alignment with a “home” position is exploited to activate a trigger.

In U.S. Pat. No. 5,392,688, Boutet, et al., gyro-lasers are used to determine the weapon position and initiate firing commands when the weapon points at a theoretical target center constructed by motion averaging.

It is, therefore, an object of the invention to provide an improved system and method for automated firing.

SUMMARY OF THE INVENTION

This invention combines the traditional aspects of firearm design with the capabilities of modern microelectronics. Instead of relying on the shooter to pull the trigger at exactly the correct instant to hit a target, a high-speed processor, in conjunction with a video system and motion detectors, accomplishes this function. The implications, especially in warfare, are quite profound: with a modest amount of training, any soldier could be converted into a sharpshooter, resulting in several benefits: increased lethality, increased survivability, battlefield dominance and more efficient use of ammunition.

It is an object of the invention to provide an improved system and method that increases the probability of hitting a desired target by reducing reliance on a human to correctly time pulling a trigger.

Another object of the invention is to provide a system and method that reduces reliance on a shooter's ability to hold a gun steady while simultaneously pulling the trigger, by providing an electronic processor and triggering system to launch a projectile, such as a bullet, at the proper time.

Another object of the invention is to provide a system and method for moving or gyrating a firearm barrel, for capturing a plurality of images of a target area, for processing image data associated with the plurality of images on a frame-by-frame basis, for predicting a position of the barrel and the target and for firing the firearm to hit the target.

Another object of the invention is to provide a system and method for allowing the weapon or gun to be held by the shooter and to "free float," thereby reducing or eliminating the need for a supporting platform to act as a base to move the weapon or as a reference stage for directional feedback information supplied to a firing control system.

Another object of the invention is to reduce or eliminate the need for targets to emit specific wavelengths and to permit targets to be selected from anywhere in the field of view of the imaging system.

Still another object is to reduce or eliminate the necessity to define a “home” position as is required by some systems of the past and to provide means for defining a home position using at least one motion sensor.

Still another object is to provide a system and method for enabling target selection by selecting a point or object within an image captured by a digital camera associated with the weapon.

Still another aspect of the invention is to provide a weapon comprising a firearm having a barrel and a user interface, a barrel oscillator for oscillating the barrel in a predetermined pattern, an image capture device mounted on the firearm for capturing a plurality of image frames of a target and generating image data in response thereto, at least one motion sensor mounted on the firearm for sensing a motion of the barrel and generating motion data in response thereto, and a processor coupled to the user interface, the image capture device and the at least one motion sensor, the processor enabling a user to select a target and in response thereto, causing the image capture device to capture the plurality of images and generate the image data which is used along with the motion data to determine a predicted target location and coverage point where the barrel covers the target upon which the processor may energize the firearm to fire a projectile.

Yet another aspect of the invention comprises a weapon comprising a firearm comprising a barrel, an imager mounted to the barrel for capturing an image of a target area, a user interface for displaying the image, the user interface comprising a trigger for selecting a target within the image area, and a processor coupled to the user interface and the imager for determining a future target location of the target and for automatically firing the firearm when the barrel is positioned in a firing position such that a projectile discharged from the firearm will hit the target selected by the user.

Still another aspect of the invention comprises a weapon comprising: a firearm comprising a barrel, a gyrator mounted on the barrel for gyrating the barrel in a consistent motion, an imager mounted to the firearm for capturing a plurality of images of an area, a user interface for displaying at least one of the plurality of images, the user interface comprising a trigger for selecting a target within the at least one of the plurality of images, and a processor coupled to the user interface, the imager and the gyrator, the processor receiving image data corresponding to one or more of the plurality of images captured and causing the firearm to automatically discharge a projectile from the firearm when the barrel is positioned in a firing position such that the projectile will hit the target selected by the user.

Yet another aspect of the invention comprises a method for increasing accuracy of hitting a target with a firearm, the method comprising the steps of: capturing a plurality of images of a target area including the target, processing the plurality of images to predict an optimum firing condition, and discharging the firearm when the optimum firing condition is achieved.

Still another aspect of the invention comprises a firing system that automatically launches the projectile, comprising: a barreled firearm; an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information; motion sensors that supply the electronic processor with angular rate information; an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification; a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target; an electronic processor, which receives data from the electronic digital camera, motion sensors, and computer mouse, transmits images to the electronic display, executes barrel prediction algorithms while analyzing the motion generated by human drift and the mechanically forced motion from the barrel gyrator, and finally transmits a fire signal to the trigger mechanism; a trigger mechanism, which implements the projectile launch signal generated by the electronic processor; and a barrel gyrator, which forces an orbital motion on the firearm that is analyzed by the electronic processor.

Still another aspect of the invention comprises an automatic firing system for use with a firearm, comprising: an image capture device mounted on the firearm for capturing a plurality of images of an area in front of a muzzle end of the firearm, and a processor coupled to the image capture device for processing data associated with the plurality of images and for determining an optimum firing time to discharge a bullet from the firearm in order to hit a target selected by a user.

Yet another aspect of the invention comprises a firing system that automatically launches the projectile, comprising: a barreled firearm; an electronic digital camera that supplies the electronic processor with rapid, repetitive digital frame information; an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification; a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target; an electronic processor, which receives data from the electronic digital camera and computer mouse, transmits the data to the electronic display, runs barrel prediction algorithms while analyzing the motion generated by human drift, and transmits a fire signal to the trigger mechanism; and a trigger mechanism, which implements projectile launch upon a signal from the electronic processor.

Other advantages of the present invention include:

    • 1. provides means for hitting the desired target virtually every time;
    • 2. decreases the time needed to shoot at a target;
    • 3. in warfare, saves ammunition by eliminating the need to spray bullets;
    • 4. prevents shooting a wrong, intervening target when an extraneous object occludes the line of sight to the desired target;
    • 5. provides means whereby a typical soldier would immediately have the capabilities of a sharpshooter;
    • 6. provides means whereby the lethality of a soldier would be greatly enhanced;
    • 7. provides means whereby the survivability of a soldier would be vastly increased;
    • 8. provides a psychological deterrent to the enemy; and
    • 9. enables users to select non-lethal target areas on, for example, an enemy soldier or criminal.

Other objects and advantages of the invention will be apparent from the following description and the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of one embodiment of the invention;

FIG. 2 is another view of the embodiment shown in FIG. 1 during operation;

FIG. 3 is a perspective fragmentary view of a barrel gyrator according to one embodiment of the invention;

FIG. 4 is a system processing schematic according to one embodiment of the invention;

FIG. 5 is a view of an image of an area that includes a target;

FIG. 6 is a view of a technique for determining various vectors when an embodiment is used with an angularly moving gun and a moving target;

FIG. 7 is a view of a plurality of image data pixels associated with a gyration of a barrel;

FIG. 8 is a view of a technique for determining a drift sinusoid function associated with various image data points;

FIG. 9 is a perspective view of a plurality of vectors used to determine a target and barrel position as viewed from a camera mounted on the firearm;

FIG. 10 is a schematic diagram of the system operation in accordance with one embodiment of the invention; and

FIG. 11 is a schematic diagram illustrating increasing system performance with use of a digital camera, barrel gyrator and motion sensors in various combinations.

DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT

Referring now to FIG. 1, a general schematic diagram of the weapon system 100 in accordance with one embodiment of the invention is shown. The system comprises a firearm, gun or pistol (weapon) 10; a digital camera 1 and a barrel gyrator 6 are operatively mounted to the weapon 10. In the embodiment being described, and as will be discussed later herein relative to FIG. 3, the barrel gyrator 6 induces a gyration, wobble or consistent movement of the barrel 10a in a predictable manner such that the motion of the barrel 10a of weapon 10 can be controlled and predicted in the manner described herein.

In general, the digital camera 1 provides digital image data to frame capture electronics 8, which generate a plurality of frames of data and subsequently feed the data to processor 4. In one embodiment, the weapon 10 further comprises one or a plurality of inertial motion sensors 5 coupled to a computer or processor 4 that sense the position and motion of the weapon 10, and particularly the barrel 10a, and generate motion data in response thereto. The weapon motion data is fed to processor 4 as illustrated. The use of the motion data will be described later herein.

The weapon 10 further comprises a firing trigger 9, which in the preferred embodiment is an electronic trigger. In the embodiment being described, an order to fire is delivered from processor 4, which generates the order-to-fire signal in response to the signals received from the inertial motion sensors 5, the frame capture electronics 8 and a user interface 2 that enables a user to initiate and authorize the firing process.

In the preferred embodiment, the processor 4 analyzes input data collected from the user interface 2, digital camera 1, and sensors 5 to make a firing decision. Image frame data is received by the processor 4 and undergoes a variety of processing operations to identify and track targets.

When selecting a target, such as target 50 in FIG. 2, the shooter commands the user interface 2 to display a single captured frame 52 of an image on the display screen 3 as a still image. From the user interface controls, the user can then electronically move a cursor 51 using a miniature trackball 53 (FIG. 5) over the image to precisely designate a target 50. For instance, if the target is a soda can on a log as in FIG. 5, the user can place a box-shaped cursor 51 around the soda can to designate the target. The center of the box cursor has a cross hair to represent the desired target hit location.

It should be understood that the invention may be used with a stationary target or a moving target. Also, a user may selectively operate the system 100 either with or without gyration. In the case of stationary targets, even though the target is fixed, the weapon 10 is usually moving if the user is shooting offhand. As a result, even a stationary target will appear at different locations in successive image frames due to the gun's motion. Because the camera 1 captures a different image in each video frame, tracking the target reveals the pointing direction of the weapon 10 relative to the target, and this information can be used to predict future positions of the weapon 10.

Once the cursor 51 is positioned on and designates the target in the displayed still image, algorithms are run in the processor 4 to identify unique characteristics of the target so the target can be identified and located in video frames as they are acquired by the digital camera 1. An example of an algorithm that is used in the invention is Normalized Greyscale Correlation (NGC). In the basic form of NGC, the boxed area around the target 50 is used to define a region of interest (ROI) raster. When NGC searches for the target in a new, subsequent image frame, it measures the correlation between an equal-sized region of the new image and the reference target raster 51. NGC systematically scans the new image and measures the correlation value at each index in the row, column image scan. If a correlation match above a threshold value is found between the reference and a particular sub-image of a new video frame, the target's center location in the new frame is calculated and recorded. If conventional Pyramiding NGC is used instead of the basic form of NGC, the search time is reduced through the use of a hierarchical searching scheme that is known prior art. The performance of Pyramiding NGC is such that the target location can be found in each frame in real time, even when operating at frame rates of 240 frames per second and using an Intel® Pentium® 4 class processor, available from Intel Corporation. These and other prior art image processing methods can identify and track the location of the target in each video frame and may be used by the invention. This processing can also include comparison of past and present frames as they are acquired.

Because tracking the weapon's 10 pointing direction is an important component of the invention, rate sensors 5 are used to provide high-bandwidth tracking of the gun that complements the video tracking that is performed. This allows the system 100 to reduce the processing demands needed for gun and target tracking, provides more effective tracking of moving targets and reduces the amount of forward-looking prediction needed of the gun's pointing direction. For instance, in the case of stationary targets, the rate sensors 5 can be used to provide gun pointing direction in between image frame times, thus allowing for a slower video frame rate and gun pointing direction predictions that only need to be a few milliseconds in the future. Because the target is stationary, the video tracking is only needed to update the position of the target relative to the gun every 33 to 1000 milliseconds so the random walk error of the rate sensors does not grow too large. Without rate sensors, frame rates of 120 frames per second and higher are typical, and predictions of gun pointing direction must be made further into the future to account for image acquisition and processing delays before the X,Y target location information is available for calculation.

The ability to predict the future positions of the weapon 10 is important due to ignition and bore time delays that occur after the decision to fire has been made. To hit a stationary target, the gun must be pointing at it at the time the bullet exits the muzzle or end of the weapon 10 (ignoring gravity and other environmental effects). If the gun is in motion and changing its pointing direction, it is not usually sufficient to make a firing decision based on current pointing data because the gun will have changed its pointing direction between the trigger and muzzle exit events. The time between these events can be several milliseconds. As data is being acquired and processed, the invention continuously predicts the weapon's 10 pointing direction at the moment muzzle exit would occur if a round were fired at that instant. This allows the system to accurately trigger a bullet launch once the predicted pointing direction of the weapon 10 crosses over the target, assuming user authorization to fire has been given. In the general case, if the pointing direction of the weapon 10 satisfies the calculated ballistic solution for the predicted location of the target and the computer has authorization to shoot, the processor 4 will send a digital output signal to the firing electronics to initiate the firing trigger 9.
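By way of illustration, the latency-compensated firing decision described above can be sketched as follows. This is a minimal sketch only, not the patented implementation; the delay value, tolerance, and all function and variable names are assumptions chosen for the example.

```python
import math

# Assumed constants for illustration only.
TRIGGER_TO_MUZZLE_DELAY_S = 0.005  # ignition + bore transit ("several milliseconds")
AIM_TOLERANCE_RAD = 0.0003         # roughly 1 MOA expressed in radians

def predict_pointing(az, el, az_rate, el_rate, dt):
    """Linearly extrapolate the barrel's pointing direction dt seconds
    ahead using the latest angular rates from the rate sensors."""
    return az + az_rate * dt, el + el_rate * dt

def should_fire(az, el, az_rate, el_rate, target_az, target_el):
    """Fire only if the barrel is predicted to cover the target at the
    moment the bullet exits the muzzle, not at the trigger instant."""
    pred_az, pred_el = predict_pointing(
        az, el, az_rate, el_rate, TRIGGER_TO_MUZZLE_DELAY_S)
    miss = math.hypot(pred_az - target_az, pred_el - target_el)
    return miss < AIM_TOLERANCE_RAD
```

A more elaborate predictor (the rate-change and curve-fitting techniques described later herein) would replace the linear extrapolation in predict_pointing.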

Forward looking predictions of weapon position and target location are made in a variety of ways. Two effective techniques used in the invention are a rate change prediction technique that uses the rate of change calculated from the most recent tracking data points and a curve fitting technique that utilizes several image data points from past and current frames of data tracking information. These techniques are described later herein.

Referring to FIGS. 1 and 5, the user interface 2 in the preferred embodiment includes a small LCD or heads-up display 3 to display targeting and system information. Although not shown, the user interface 2 or display 3 may be pivoted or rotated in one of a plurality of directions. The display 3 can be incorporated directly into the sight scope (not shown) for aiming the weapon 10. The user interface 2 can have any combination of control buttons and input controls (including the conventional pull trigger) to aid the user in selecting targets 25, authorizing firing 26, and configuring parameters within the system 27. In the embodiment being described, a miniature trackball 53 (FIG. 5) is used to guide cursor 51 over the target 50 for firing selection. Rate sensors 5 on the weapon 10 also make it possible for the cursor to be controlled by movement of the gun itself.

The user interface 2 button and input controls are linked to the image display 3 so commands and system 100 configuration settings can be inputted by the user. The user can authorize the system 100 to fire by pressing a button 57 on the user interface 2 or pulling the gun's modified mechanical trigger 10b (FIG. 1), which is configured to only authorize a firing event. The user interface 2 also allows threshold accuracy settings 28 (FIG. 4) to be inputted so the user can specify the Minute of Angle (MOA) accuracy desired for the shot. One MOA corresponds to approximately 1 inch of target size at a range of 100 yards. The threshold accuracy settings 28 are used by the firing algorithm described later herein to determine if the predicted pointing direction of the weapon 10 is over the target 50 at any moment in time.
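For illustration, the MOA threshold entered through the user interface can be converted to angular and linear tolerances roughly as follows; this is a sketch under the stated 1-inch-per-100-yards approximation, and the function names are illustrative.

```python
import math

def moa_to_radians(moa):
    """An MOA accuracy setting expressed as an angular tolerance
    (1 MOA = 1/60 degree)."""
    return math.radians(moa / 60.0)

def moa_to_inches(moa, range_yards):
    """Linear tolerance at the target: 1 MOA subtends about 1.047 inches
    per 100 yards (commonly rounded to 1 inch, as in the text above)."""
    return moa * 1.047 * (range_yards / 100.0)

# Example: a 2 MOA setting at 300 yards allows roughly a 6.3 inch circle.
print(moa_to_inches(2.0, 300.0))  # ~6.28
```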

Referring now to FIGS. 2 and 3, further details of the system 100 are shown. As illustrated in FIG. 3, the digital camera 1 is mounted on the stock by conventional means, such as a mechanical mount. The digital camera 1 is aligned generally on top of and over the barrel 10a as illustrated in FIG. 2 so that images of an area including a target, such as target 50 in FIG. 5, can be captured. In this regard, and during normal operation, digital camera 1 may also capture a plurality of images of the target 50 as the gun barrel 10a is gyrated by gyrator 6 as described herein. Thus, it should be appreciated that the camera 1 is used to electronically photograph the view looking down the barrel and toward the target 50. In one embodiment of the invention, a charge-coupled device (CCD) video camera may be used.

The signal from the digital camera 1 is fed to the image capture or frame capture electronics 8 mentioned earlier herein. In the embodiment being described, the frame capture electronics 8 are provided on a circuit board (not shown) on which the processor 4 may also be mounted, and the board may be stored in or mounted on the rifle or mounted in a separate container such as a processor container 28 (FIG. 2), which is worn by the user or affixed to a user's garment, such as a belt. The processor container 28 may further comprise a power source, such as a battery, for energizing or providing power to the system 100.

The image or frame capture electronics 8 record at least one or a plurality of frames of data consisting of thousands of pixels at rates of hundreds of times per second. The data is fed to the processor 4 which, in the preferred embodiment of the invention, uses a relatively powerful processor system, such as a 2.4 GHz Intel® Pentium® 4 computer with 512 MB of memory. The image capture electronics 8 can be a Matrox Meteor II frame grabber board, available from Matrox Electronic Systems Ltd. of 1055 St. Regis Blvd., Dorval, QC H9P 2T4 Canada, installed into, for example, one of the computer's PCI slots. The board has a bus-mastering mode that can perform data transfers directly into host computer memory without requiring continuous host intervention. In the embodiment being described, the processor 4 comprises suitable memory (not shown) for receiving each frame of data for subsequent processing by the processor 4.

System 100 comprises the image display 3 (FIGS. 1 and 2) that displays the image captured by the camera 1 and transmitted via the processor 4. Various types of displays can be used for the display 3; however, the leading choices are the Kaiser Electro-Optics, Inc. Proview™ SL35 monocular display; the Kopin CyberDisplay™, available from Kopin Corporation, 125 North Drive, Westboro, Mass. 01581; and the Sharp Microelectronics 3.5-inch Transflective TFT-LCD display, available from Sharp Microelectronics. The barrel gyrator 6 causes the muzzle or end of barrel 10a to orbit in an elliptical fashion. By altering the weapon's 10 center of mass, the aim of the weapon 10 is caused to sweep out a large oval at the target range. The miniaturized trackball mouse 53 (FIG. 5) is used as part of the user interface 2 to allow the shooter to select the target from a still image presented on the video display 3. Inertial motion sensors 5 are mounted to the weapon 10 and provide angular rate data to the processor 4 as the weapon 10 moves about.

Referring to FIGS. 2 and 3, the barrel gyrator 6 will now be described. In general, the purpose of the gyrator 6 is to cause the aim point of the weapon 10 to oscillate or gyrate in a predictable manner so as to sweep out more target area in a given time than when the weapon 10 is held stationary. When a gyrating motion is added to the slow sweeping motion caused by the shooter moving the gun, for example, to follow the target 50, a large swath of the target area is covered in one stroke. If a motion-inducing mechanism such as the gyrator 6 were not present, the shooter would have to manually align the barrel with the target 50. A disadvantage, as mentioned earlier herein relative to the description of related art, is that depending on the bullet launch tolerance or desired accuracy limits (referred to in block 48 in FIG. 4), this could require an indefinite length of time for the shooter to accurately align the barrel with the target 50, and alignment may never occur at all during manual operation.

Thus, one feature of the embodiment being described is that, for aiming prediction purposes, the motion of the barrel 10a is controlled and repetitive and lends itself to the mathematical analysis and application of algorithms described herein. Accordingly, a circular or elliptical motion of the barrel 10a is achieved by altering the center of mass of the weapon 10. In the embodiment being described, this is achieved by providing a gyrator 6 having a weight 19 secured thereto with a screw 21. Note that the weight 19 is secured to a cylinder 16 that is rotatably mounted on bearings 22 which are positioned on the barrel 10a. The cylinder 16 is rotatably driven by a motor 17 that is coupled to the ball bearing 22 via a drive linkage, including a pulley 20 and belt or O-ring 18. Ideally, the barrel 10a would rotate in a circular fashion, but it has been found that the barrel 10a actually orbits in an elliptical fashion because of the difference in the moments of inertia of the weapon along different axes, such as axis A illustrated in FIGS. 2 and 3. The weight 19 is mounted to the cylinder 16, which is aluminum in the embodiment being described. The cylinder 16 is mounted on the pair of low-profile ball bearings 22a at the muzzle end of the weapon barrel 10a. The cylinder 16 comprises a groove 29a that accommodates and receives the belt or O-ring 18, which is attached to the pulley 20 on the small electric motor 17. The cylinder 16 further comprises a pair of diametrically opposed tapped holes used to attach the weight 19, such as a lead weight, with the screw 21 as shown. It should be understood that the weight 19 is removable by unscrewing screw 21 so that different mass weights can be fastened to the cylinder 16, which allows the amplitude of the barrel's 10a circular motion to be adjusted.

It should be understood that the voltage applied to the motor 17 controls the angular frequency of gyration, which is independent of the angle of barrel gyration. The angle of gyration is related to the ratio of the orbital radius to an overall length of the gun or weapon 10, as well as the weapon and gyrator masses. In the embodiment being described, the gyrator operates in the 300-800 RPM range.

During operation, the shooter points the weapon 10 at the target 50 (FIG. 5), and a button, such as trigger button 57 on user interface 2, is pressed to freeze the image captured by digital camera 1 (FIGS. 1 and 2). The captured image is displayed on the image display 3. At this point, the shooter only needs to provide an approximate aim toward the target 50, such that the camera 1 can capture the image of the target 50. The shooter views the frozen image on the display 3 and positions the cross-hairs 51, which are controlled by the miniature mouse 53 (FIG. 5) on the user interface 2 or display 3, over the desired target 50. The weapon 10 is still pointed approximately at the target 50. The user then actuates an auto-fire mode by pulling a pseudo trigger 10b (FIG. 1).

The processing computer supplies digital I/O signals 32 for system controls. Inputs from the user interface are received by the processing computer 4 to perform such actions as freezing a video frame, selecting targets 25, and authorizing firing 26. To fire, an authorization signal is given to the firing trigger 9 in the form of a binary representation of the time interval between the next camera shutter, available as a digital output from the digital camera 1, and bullet launch time. Numerous digital implementations of an interval timer can be devised by those conversant with the art. The only requirement is that the clock driving the interval timer be sufficiently fast so that one single tick of the clock is negligible in comparison to the overall system timing. A clock frequency of 10 MHz is sufficient to fulfill this requirement. In the embodiment being described, the pseudo trigger 10b may comprise an electronic trigger as opposed to a traditional mechanical trigger involving a moving firing pin. For example, the electronic firing trigger 9 may comprise the same mechanism resident on the Remington® Model 700 EtronX rifle, available from Remington Arms Company, Inc., which is advertised as having an ignition time of 0.27 microseconds.
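The interval-timer arithmetic described above reduces to a simple conversion; the sketch below assumes the 10 MHz clock named in the text, and the helper name is illustrative.

```python
CLOCK_HZ = 10_000_000  # 10 MHz interval-timer clock

def fire_delay_ticks(seconds_until_launch):
    """Binary representation of the interval between the next camera
    shutter and bullet launch, as loaded into the interval timer."""
    return round(seconds_until_launch * CLOCK_HZ)

# Example: a 3.2 ms delay becomes 32,000 ticks; the 100 ns tick size is
# negligible compared to overall system timing, as the text requires.
print(fire_delay_ticks(0.0032))  # 32000
```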

In any event, the pseudo trigger 10b is coupled to the processor 4 which, in turn, generates a signal for energizing the gyrator 6 and, more particularly, motor 17. In response, the energized motor 17 rotatably drives pulley 20 and O-ring 18, which rotatably drive the cylinder 16 to wobble the barrel 10a. As mentioned earlier herein, this causes the aiming direction of the muzzle end of barrel 10a to sweep out an area encompassing the target 50 at the target distance as the shooter slowly sweeps the weapon or gun 10 over the target 50. The speed of the barrel 10a rotation is much faster than the motion caused by the shooter's natural drift, so at some point of an over-target sweep, the barrel 10a points directly or nearly directly at the target 50. Without intervention on the shooter's part, successive image frames are captured by digital camera 1 and processed in accordance with the algorithm described herein. For each frame captured, the target 50 is located, and a series of previous frames is analyzed to predict the anticipated time at which the barrel 10a will point directly at the target 50. At the exact instant when the barrel 10a aligns with the target 50, less the time delay that the firing trigger 9 needs to actuate and electronically fire the weapon 10, and with further compensation for effects such as gravity drop and environmental conditions (e.g., wind, rain, etcetera), the shot is automatically discharged by means of the processor 4 energizing firing trigger 9 (FIG. 1). The projectile or bullet is launched and hits the target 50 in response to the electronic firing of the firing mechanism 9.

A plurality of rate sensors 5 are mounted on weapon 10 (FIG. 1). Solid state rate sensor technology is used in the preferred embodiment of the invention to track the precise angular motion of the gun barrel 10a. The angular rate of change around the X, Y, and Z axes is conveyed to the processor 4, as shown in FIG. 1.

Although motion of the gun barrel 10a can also be tracked by using video imaging data 31, rate sensors 5 allow for frequent pointing direction sampling with minimal processing demands. In addition, rate sensors 5 offer a straightforward method of separating angular motion of the gun/video system from actual motion of the target 50, such as the movement of the tank shown in FIG. 6. This is particularly important for firing accuracy at moving targets, since target trajectory and ballistic "lead" need to be calculated for a correct ballistic solution.

Rate sensors 5 are mounted on weapon 10 and coupled to processor 4. The rate sensors 5 are small, low cost "gyro on a chip" devices available in a coin-size form factor that can be packaged to meet military-specification environmental requirements. Example rate sensors that can be used in the invention are the QRS11 series manufactured by the Systron Donner Inertial Division of BEI Technologies, Inc. By integrating the rate sensor's signal over time, a precise angle of the barrel 10a can be calculated. By incorporating two rate sensors into the weapon 10 (one for azimuth angle measurement and the other for elevation), a precise measure of pointing direction can be achieved. If desired, a third rate sensor (not shown) can also be used to monitor rotation about the gun barrel axis.
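The integration of rate-sensor output into pointing angles can be sketched as follows. A simple rectangular integration is assumed for illustration; the class and method names are not from the source.

```python
class PointingTracker:
    """Integrates two-axis angular-rate samples (rad/s) into azimuth and
    elevation angles, per the two-rate-sensor configuration above."""

    def __init__(self, az0=0.0, el0=0.0):
        self.az = az0
        self.el = el0

    def update(self, az_rate, el_rate, dt):
        # Rectangular integration of the rate samples over the sample
        # interval dt; trapezoidal integration would halve the local error.
        self.az += az_rate * dt
        self.el += el_rate * dt
        return self.az, self.el
```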

The rate sensors 5 have random walk noise that can cause the pointing direction measurement to drift over time. If rate sensors 5 alone are used for tracking the gun's pointing direction 39, the random walk error becomes significant and impacts the accuracy of the ballistic solution after a few seconds. To counter this effect and achieve high accuracy pointing direction information, the invention in one preferred embodiment frequently updates the pointing direction reference from the rate sensors 5 with pointing direction information determined by processing camera video frames 40 of data. The target location is found in each video frame where the FOV of the camera 1 subtends the target. This location is then used to update the angular position of the weapon 10 relative to the target. This updating process keeps the gun pointing direction error with respect to the target from growing too large over time.

Rate sensors 5 lower the video processing demands of the invention by allowing the system to operate at much lower frame rates compared to using video alone to determine muzzle pointing direction. Conventional video frame rates of 30 frames per second or lower can typically be used with rate sensors as long as the target trajectory can be resolved for the given frame rate. Faster moving targets demand higher frame rates in order to adequately sample the target trajectory. The rate sensors 5 incorporated into the invention allow the frame rate requirements to depend mostly on the motion of the target as opposed to that of the gun.

Consider the case of shooting at stationary targets with a moving gun. The invention's video tracking technology can update the target location relative to the local image coordinates on every frame time using target pattern matching algorithms 38, 40 applied to the video frames. Once the stationary target's new location is determined within the image, only new rate sensor measurements need to be used to track the gun's pointing direction change relative to the target's location in the video frame. Because only new rate sensor measurements are needed, the random walk drift time is effectively reset, thus providing a mechanism for the pointing direction to remain accurate with respect to the target for an indefinite duration while the target is within the field of view of the camera.
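A hedged sketch of this drift-reset idea follows: gyro integration carries the pointing estimate between frames, and each successful video fix of the stationary target re-anchors the estimate so only the error accumulated since the last frame remains. The fusion scheme shown is an assumption for illustration, not the patented algorithm.

```python
class DriftResetTracker:
    """Gyro-integrated pointing, relative to the target, re-anchored by
    per-frame video fixes of a stationary target."""

    def __init__(self):
        self.az = 0.0  # pointing offset from target, radians
        self.el = 0.0

    def gyro_update(self, az_rate, el_rate, dt):
        # High-bandwidth update between frames; random walk grows with dt.
        self.az += az_rate * dt
        self.el += el_rate * dt

    def video_update(self, target_az_offset, target_el_offset):
        # A video frame locates the target directly, so accumulated gyro
        # drift is discarded ("reset") at this instant.
        self.az = target_az_offset
        self.el = target_el_offset
```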

Another case is the challenging task of firing accurately with an angularly moving gun and a moving target. By using the rate sensors 5 to keep track of the gun's pointing direction, the rate sensor information can be used to efficiently isolate a target's relative motion from the pointing direction changes of the weapon 10. FIG. 6 shows a graphical example of how target motion relative to a two dimensional inertial reference frame can be calculated in a moving gun and moving target scenario.

At every frame time, the target position can be located by image processing techniques relative to the local image coordinate system. The rate sensor 5 data can be used to track the motion of this image coordinate system relative to an inertial coordinate reference. Since the random walk error of the rate sensor 5 is small in the frame times involved, rate sensor information spanning several frame times can be used for this tracking. By using the combination of the image coordinates of the target with the rate sensor's 5 coordinates for the image frame, the coordinates of the target relative to the inertial frame of reference can be calculated.

When this method is used in combination with range data to the target, it allows for a relatively simple way to calculate the velocity and acceleration vectors of the target moving relative to the inertial reference frame. In these calculations, distance information is derived from target tracking within each video frame and time information is derived from the inverse of the video frame rate. The target's motion vectors can be used to predict the target's new location in between frame times, thus allowing for targeting of high speed targets that are moving at significant displacement rates. For instance, a weapon's 10 lead can be calculated by determining the time of flight to the target, based on range and published bullet velocity, and then multiplying this time by the velocity component perpendicular to the target line of sight. A decision to fire 41 can be made when the pointing direction of the gun is within the ballistic solution necessary to hit the moving target at its predicted future position. This general technique for handling the motion of the target relative to the gun platform does not rely on processing fixed-point background references; as a result, the technique can work on both uniform and non-uniform background scenes.
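The chain of calculations just described (image fix → inertial position → velocity → lead) can be sketched as below. Small angles are assumed so image offsets and gyro angles add directly; all names and the atan2-based lead form are illustrative, not taken from the source.

```python
import math

def target_inertial_position(img_x_rad, img_y_rad, cam_az, cam_el):
    """Combine the target's image-plane offset (converted to angles) with
    the gyro-tracked camera orientation to get the target's angular
    position in the inertial reference frame (the FIG. 6 construction)."""
    return cam_az + img_x_rad, cam_el + img_y_rad

def target_angular_velocity(p0, p1, frame_rate_hz):
    """Angular velocity from two inertial-frame fixes one frame apart;
    the time base is the inverse of the video frame rate."""
    dt = 1.0 / frame_rate_hz
    return (p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt

def lead_angle(range_m, bullet_speed_mps, target_cross_speed_mps):
    """Lead = time of flight (range / published bullet velocity) times the
    target velocity component perpendicular to the line of sight,
    expressed here as an angle."""
    time_of_flight = range_m / bullet_speed_mps
    cross_displacement = target_cross_speed_mps * time_of_flight
    return math.atan2(cross_displacement, range_m)
```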

If a weapon's 10 motion is expected to have translational velocity components relative to an inertial frame of reference (e.g., a gun in a moving vehicle), a variety of methods exist (accelerometers 5, global positioning system references 33, odometer readings 33, etc.) to calculate the magnitude and direction of the translational motion. In an additional embodiment of the invention, accelerometers 5 can be incorporated into the invention to detect changes in translational motion. Translational information can be incorporated into the calculated ballistic solution to further improve the overall firing accuracy of the system.

The ability to separate gun angular motion from fast relative target motion in a processor aided firing system opens a new realm of firing accuracy when both target and gun are independently moving. This is a significant capability since military guns are typically used from dynamically moving platforms (e.g., soldiers, land vehicles, helicopters, airplanes, and boats) and targets are often moving.

Image target identification 38 and tracking 40 are core technologies within the invention and are important components in supporting processor aided firing. FIG. 9 illustrates a video frame 83 captured by camera 1. By determining the X-Y location of the target in each video frame 83, a relative angular position of the target 81 with respect to the gun's 10 pointing direction 80 at the time of frame exposure can be determined, since the location vector 82, which is a vector between the target's 50 original position 87 and the actual location 85 where the weapon 10 is pointing, is known. The gun's pointing direction 80 is known relative to the pointing direction of the digital camera 1 and is shown in FIG. 9 as the same vector that ends at the center of the field of view of the image frame. Past and present location information can be used to help predict the future location of the target 50, and this prediction is used in the ballistic solution 43, which determines whether the processor 4 can fire upon the target at a particular moment in time 41. A variety of different image processing algorithms can be incorporated into the invention for identifying and tracking targets in the video scene. This includes the use of raster, vector, and temporal based imaging algorithms in any combination.

An algorithm incorporated into one embodiment of the invention for use in identifying and locating non-changing, fixed targets or reference points within an image frame is a hierarchical search normalized greyscale correlation, also referred to as pyramiding normalized greyscale correlation. This technique is known and is taught, for example, in Matrox Imaging, ActiveMIL version 7 User Guide, Jun. 6, 2002, Matrox Electronic Systems Ltd., pg. 183, which is incorporated herein by reference and made a part hereof. This highly efficient and precise raster based algorithm is effective for locating stationary targets or reference points in situations where there is no relative rotating or scaling occurring within the video scene.

In its most simplified form, this technique scans a reference region of interest over the entire image, outputting the correlation match at each index step across the image. The index at which the highest correlation exists above a certain threshold is the location of the target in the acquired frame. If all correlation results are below the threshold set for the image, then the frame is deemed to not contain the target.

Normalized grayscale correlation (NGC) is defined by the following equation:

γ(x, y) = Σ_s Σ_t [f(s,t) − f̄(s,t)] [w(x+s, y+t) − w̄] / { Σ_s Σ_t [f(s,t) − f̄(s,t)]² · Σ_s Σ_t [w(x+s, y+t) − w̄]² }^(1/2)

where x = 0, 1, 2, . . . , M−1 and y = 0, 1, 2, . . . , N−1; w̄ is the average value of the pixels in the ROI defined by w (this only needs to be computed once); f is the region within the frame image coinciding with the current location of w, and f̄ is the average value within f. The summations over the indexes s and t run only over the coordinates that are common to both f and w. The correlation coefficient is calculated for each location (x, y) in the image.
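The equation translates directly into code. The NumPy sketch below computes γ at one offset and scans it exhaustively, matching the basic (non-pyramided) form described above; names and the threshold default are illustrative.

```python
import numpy as np

def ngc_score(frame, template, x, y):
    """Normalized greyscale correlation of template w against the
    equally sized sub-image of the frame at offset (x, y)."""
    h, tw = template.shape
    f = frame[y:y + h, x:x + tw].astype(float)
    w = template.astype(float)
    fd = f - f.mean()   # f - f_bar over the coincident region
    wd = w - w.mean()   # w - w_bar (computable once per template)
    denom = np.sqrt((fd ** 2).sum() * (wd ** 2).sum())
    return (fd * wd).sum() / denom if denom else 0.0

def ngc_search(frame, template, threshold=0.8):
    """Scan every offset; return the best-matching target center, or None
    if no correlation exceeds the threshold (target not in frame)."""
    H, W = frame.shape
    h, tw = template.shape
    best, best_xy = threshold, None
    for y in range(H - h + 1):
        for x in range(W - tw + 1):
            score = ngc_score(frame, template, x, y)
            if score > best:
                best, best_xy = score, (x + tw // 2, y + h // 2)
    return best_xy
```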

To bring the NGC calculation time into the millisecond domain, a pyramid scheme is used in which a small, lower resolution image of the frame is initially processed. This correlation information is then used to limit the pattern search to selected regions of the next higher resolution version of the frame image. This process repeats itself so that the best correlation coefficient is quickly found for the ROI model being matched in the image. If the correlation coefficient is below a threshold value, the image is marked as not having the target contained within it.

This pyramid scheme allows NGC processing to take only 3-4 milliseconds using an ROI size of 100×100 pixels and an image size of 640×480. For the image size of 640×198 that occurs when the camera is in 240 fps mode, the NGC time is about half this (˜1.5 to 2 ms).
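A minimal sketch of the coarse-to-fine idea follows, reusing ngc_score from the sketch above and assuming 2× downsampling per level; the proprietary Matrox implementation differs, so this only illustrates why the search cost drops.

```python
import numpy as np

def downsample(img):
    """Halve resolution by 2x2 averaging to build one pyramid level."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    img = img[:h, :w].astype(float)
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def search_window(frame, template, cx, cy, radius, threshold):
    """Exhaustive NGC limited to a window around top-left guess (cx, cy)."""
    h, tw = template.shape
    best, best_xy = threshold, None
    for y in range(max(0, cy - radius), min(frame.shape[0] - h, cy + radius) + 1):
        for x in range(max(0, cx - radius), min(frame.shape[1] - tw, cx + radius) + 1):
            s = ngc_score(frame, template, x, y)
            if s > best:
                best, best_xy = s, (x, y)
    return best_xy

def pyramid_search(frame, template, levels=2, threshold=0.8):
    """Full scan only at the lowest resolution, then refine the candidate
    in a narrow window at each successively finer level."""
    frames, temps = [frame], [template]
    for _ in range(levels):
        frames.append(downsample(frames[-1]))
        temps.append(downsample(temps[-1]))
    small = frames[-1]
    hit = search_window(small, temps[-1], 0, 0,
                        radius=max(small.shape), threshold=threshold)
    for lvl in range(levels - 1, -1, -1):
        if hit is None:
            return None  # correlation fell below threshold: no target
        hit = search_window(frames[lvl], temps[lvl],
                            hit[0] * 2, hit[1] * 2, radius=4,
                            threshold=threshold)
    return hit  # top-left corner of the match at full resolution
```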

NGC is a very powerful pixel-level pattern matching technique that can discern the location of virtually any static ROI model within an image frame, and it proves to be a well-performing algorithm for this purpose. In future phases of this development, NGC in combination with other image processing algorithms (e.g., geometric pattern matching) can be researched to extend the system beyond static targets to the moving target domain.

In the case of moving targets, vector based algorithms can be more flexible than raster based algorithms. To augment moving target tracking, there are several promising image processing technologies currently known in the art that can be incorporated into the preferred embodiment of the invention depending on the particular targeting demands. These technologies can involve identifying and tracking spatial and/or temporal properties of the target within the video frames.

An example of one such method is to use geometric pattern matching algorithms to match edges of a target against a database of target templates. This "model" based approach can be used to track the target (such as the tank shown in FIG. 6) as it moves across a scene. Geometric techniques tend to be more robust in handling the scaling and rotation effects of moving platforms, but generally demand more CPU processing compared to raster based correlation techniques. It should be understood that the incorporation of rate sensors into the invention reduces the overall processing demands, so geometric pattern matching and other powerful image processing techniques can be used to accurately track the target under a variety of scene conditions.

Even with the fastest electronics and processing speeds, there is an inescapable time delay between image capture and the launch of the projectile or bullet. The recent motion of the barrel 10a and a prediction 45 of its future position need to be determined. Each frame of data captured by camera 1 is analyzed to find the position where the barrel 10a points. When the barrel 10a is anticipated to point at the target, or at the lead point in the case of moving targets, within a certain tolerance, the bullet is launched 47.

The analysis is considered in two fashions: passive mode, when the gyrator 6 is off, and active mode, when the gyrator 6 is on. Each case produces a characteristic sequence of barrel pointing.

With the gyrator off, the barrel wander from human drift behaves much like Brownian motion on a long time scale of the order of seconds. The motion is, for all intents and purposes, random and therefore non-predictable. However, on a short time scale of the order of milliseconds, the location difference between successive data points is not that great and lends itself to predictive techniques.

One firing method is to wait until the barrel 10a is pointed at the target, within a certain tolerance, and then launch the bullet. This can be viewed as a non-forecasting approach and gives performance within a few MOA. However, the preferred rate prediction technique uses the rate of change of the X and Y components of motion to predict future location coordinates. The two main concepts of this approach are 1) actual X, Y location prediction based on the rate of change, and 2) the values of the first and second derivatives used as constraints when near the target. The complete calculation uses data from the 3 previous frames to make a prediction on whether to fire the rifle 1.5 to 2.5 frames in the future. The X, Y coordinate location is defined in the image plane of the target where 0, 0 is the target center.

The procedure is illustrated in FIG. 6 and is as follows. First, vectors 60 and 61 are determined. These vectors 60 and 61 correspond to the location of the target relative to the center of the image frame. The pointing direction of the gun/camera 1 relative to the rate sensor's 5 inertial frame of reference 67 is then determined to provide a calculation of vectors 62 and 63 in FIG. 6. Next, vectors 64 and 65 are calculated, indicating the position of the target relative to the inertial frame of reference 67. Processor 4 then determines a resultant displacement vector 66 of the target during time Δt relative to the inertial frame 67. The velocity vector 66 for the target can then be calculated from this information.

Thus, first derivatives are calculated for the X and Y components. The predicted X component is then simply calculated using X = (X_last + 1.5*X_rate) + (t*X_rate), where t = 0 to 1. A similar formula is used to calculate the Y component. A second derivative term can also be added to this prediction, but it requires more investigation to determine whether it is actually helpful, since it is based on slightly older frame data.
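In code, the rate prediction for one axis reads as follows; units are pixels in the target's image plane and frames of time, per the formula above. The helper name and the example data are illustrative.

```python
def rate_prediction(history, t):
    """Predict the X (or Y) coordinate 1.5 + t frames ahead (t in [0, 1])
    from the three most recent frame coordinates,
    history = [x_oldest, x_middle, x_last], target center at 0."""
    x_last = history[-1]
    x_rate = history[-1] - history[-2]  # first derivative, per frame
    x_accel = history[-1] - 2 * history[-2] + history[-3]  # second derivative
    # X = (X_last + 1.5*X_rate) + (t*X_rate), per the formula above; the
    # second derivative is returned but not yet folded into the prediction.
    x_pred = (x_last + 1.5 * x_rate) + (t * x_rate)
    return x_pred, x_rate, x_accel

# Example: coordinates drifting toward the target center at 2 units/frame.
print(rate_prediction([6.0, 4.0, 2.0], 0.5))  # (-2.0, -2.0, 0.0)
```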

As the pointing direction of the rifle 10 changes, it sweeps out a path in the image plane of the target. This path will approach the target center and then move away as it sweeps by. Because there is a variance between predicted motion and actual motion, it is best to make firing decisions using data that is representative of the motion moving toward the target and not away from it. Such a decision can be made by using constraint conditions based upon the pointing angle distance from the target and the rates of change of this distance.

The distance from the target center defined in the plane of the target is calculated from the Pythagorean Theorem, using the X and Y components. After calculating the distance from the target in each of the 3 previous frames, the first and second derivatives of the distance from the target center can be obtained for the most recent frame in the set of three. The distance from target and the first and second derivatives of this distance allow for the setup of powerful constraint conditions on when to fire the rifle.
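A sketch of such constraint logic follows; the tolerance parameter and the specific conditions are assumptions for illustration.

```python
import math

def firing_constraints(points, tolerance):
    """points = (x, y) pointing coordinates of the 3 most recent frames,
    target center at the origin. Distances come from the Pythagorean
    theorem; fire only while approaching the target within tolerance."""
    d = [math.hypot(x, y) for x, y in points]
    d_rate = d[2] - d[1]              # first derivative of distance
    d_accel = d[2] - 2 * d[1] + d[0]  # second derivative of distance
    approaching = d_rate < 0          # sweeping toward, not away from, target
    return (approaching and d[2] < tolerance), d_rate, d_accel
```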

In the case where the gyrator is on, the barrel 10a motion is the combination of the Brownian-like motion arising from human drift and the elliptical motion from the gyrator and is illustrated in FIG. 7. Data points on this figure are made by taking the recent frames of image data and finding, by pattern matching or other means, the point where the previously selected target is located. Points at this stage are given in pixel units as X and Y coordinates in a Cartesian coordinate plane.

The unequal lengths of the major and minor axes arise from the asymmetrical moment of inertia of the weapon. The vertical and horizontal components each trace out a sinusoid that wanders about due to human drift.

In the preferred embodiment, the drifting sinusoid 72 of each component can be modeled as the sum of a polynomial and a sine wave, i.e.,

f(t) = (A0 + A1t + A2t² + . . . ) + B0 sin(B1t + B2)

where An and Bn are coefficients that can be determined by curve fitting techniques. The polynomial factor estimates the motion caused by human drift, and the sine function models the gyration motion. As an example, the X (horizontal) raw data from FIG. 7 is shown in FIG. 8.

To do the actual coefficient determination, a couple of non-linear fitting routines that minimize chi-square can be used: a simple Bevington grid search method along with a more sophisticated Levenberg-Marquardt program. However, the difficulty with non-linear routines is that to some extent they have to "fish" for a solution by probing in a trial-and-error fashion. Worse, there may be several solutions that minimize chi-square but are only locally valid and miss the absolute minimum solution. In addition, if the coefficients are not seeded with numbers close to their ultimate values, the routines will not converge on the true solution and sometimes "fly away".

To determine the coefficients of the motion equation, the following steps are taken. The positional data is decomposed into X (horizontal) and Y (vertical) functions of time. A cubic spline fit 71 (FIG. 8) is fit to each component. From this, the extrema (minima and maxima) and inflection points of the sinusoid are extracted by processor 4. The min/max difference is used to seed the amplitude coefficient of the sine, B0. The time difference between successive inflection points (or extrema) is then used to seed the angular frequency coefficient, B1. The last inflection point is used to seed the phase, B2, or the phase may be left to float freely. A polynomial fit is then performed on the set of inflection points; this expresses the human drift in polynomial form and determines the coefficients An. Alternately, the fit can be made to the extrema points. The human drift is then removed from the data by subtracting out the polynomial function, leaving a function representing the gyration motion. A non-linear grid search is used to determine the sinusoidal coefficients, Bn, exactly. To improve the accuracy of the prediction, the last few data points are weighted by assigning them artificially small standard deviations.

At this point, the coefficients An and Bn have been determined by processor 4, and thus the expected barrel 10a position can be evaluated from the drifting sinusoid f(t) 72 in the X coordinate. The Y coordinate is fit in a similar manner.
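The model and fitting procedure can be sketched with a standard non-linear least-squares routine (SciPy's Levenberg-Marquardt-based curve_fit is used below as a stand-in for the Bevington grid search and Levenberg-Marquardt programs named above). The seeding values and synthetic data are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

def drifting_sinusoid(t, a0, a1, a2, b0, b1, b2):
    """f(t) = (A0 + A1*t + A2*t^2) + B0*sin(B1*t + B2): polynomial human
    drift plus the gyration sine, truncated at the quadratic term."""
    return (a0 + a1 * t + a2 * t**2) + b0 * np.sin(b1 * t + b2)

def fit_axis(t, x, seed):
    """Fit one axis (X or Y) of the barrel track. 'seed' holds initial
    coefficient guesses obtained as described above (amplitude from the
    min/max spread, frequency from inflection-point spacing, and so on);
    non-linear fitting can "fly away" without reasonable seeds."""
    # Weight the most recent samples heavily by assigning artificially
    # small standard deviations, as the text suggests.
    sigma = np.ones_like(x)
    sigma[-3:] = 0.1
    coeffs, _ = curve_fit(drifting_sinusoid, t, x, p0=seed, sigma=sigma)
    return coeffs

# Example with synthetic data: slow linear drift plus a ~5 Hz gyration.
t = np.linspace(0.0, 1.0, 120)
x = 2.0 + 1.5 * t + 4.0 * np.sin(2 * np.pi * 5 * t + 0.3)
c = fit_axis(t, x, seed=[2.0, 1.0, 0.0, 4.0, 30.0, 0.0])
print(drifting_sinusoid(1.05, *c))  # expected barrel X shortly ahead
```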

The invention uses several techniques to accurately aim and fire on moving targets. Target identification algorithms operate on each frame of video data to identify and track the target in real time. Rate sensors 5 are used to separate gun motion from target motion, so the anticipated trajectory of the target can be calculated by processor 4. This processing is done by the processor or ballistic computer 4. The location of the image in the X-Y plane of the field of view can be readily obtained by this method.

Since the time of flight of a ballistic round can be many milliseconds even at short ranges, a "lead" is usually necessary to hit a fast moving target. The invention can calculate the "lead" necessary for hitting a moving target by determining the target's trajectory, using the image based tracking techniques described earlier relative to FIG. 6, and by determining the target's range.

Range determination 36 can be done using a variety of techniques and/or sensors to determine a range or distance to the target, such as target 50 in FIG. 9. For example, if the size of the target is known, range can be calculated from the image data associated with the target by measuring the pixel width of the target and using the formula below:
D=s/tan[(Tpw*FOV)/Ipw]
where D is the distance to the target, s is the physical target width, Tpw is the target width in pixels, Ipw is the image size in pixels, and FOV is the effective angular field of view of the imaging system. A range sensor 34 (FIG. 4) may also optionally be incorporated directly into the invention to determine range. Its output is fed into the ballistic computer processor 4 and used to calculate the ballistic solution. This "lead" factors into the ballistic solution 43, 44 represented in FIG. 4. The ballistic solution 43 is used by processor 4, along with other factors such as windage, gravity drop, ammunition, gun type, and environmental effects, to determine the angular position the gun needs to be in before the processor 4 energizes the firing trigger 9 to fire the weapon 10.
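For illustration, the range formula can be evaluated as below; the FOV must be in consistent angular units throughout, and the names are illustrative.

```python
import math

def range_to_target(target_width_m, target_px, image_px, fov_rad):
    """D = s / tan[(Tpw * FOV) / Ipw]: a known physical target width and
    its measured pixel width yield range through the camera's effective
    angular field of view."""
    subtended = (target_px * fov_rad) / image_px  # angle the target spans
    return target_width_m / math.tan(subtended)

# Example: a 3.5 m wide vehicle spanning 40 of 640 pixels through a
# 5-degree FOV lens is roughly 640 m away.
print(range_to_target(3.5, 40, 640, math.radians(5.0)))  # ~641.7
```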

Depending on the particular target and scene requirements, the invention can use a variety of image processing techniques to track a moving target in successive video frames. Geometric Pattern matching is one example of such a technique that can account for both scaling and rotation of the moving target.

The choice of lens and camera 1 to incorporate into the invention can depend on a variety of factors, including the particular tracking requirements of the target scene. For example, for fast-moving targets, an imaging system with larger-aperture optics, a wider field-of-view lens, a higher video frame rate, and a higher-resolution imager will often be necessary to simultaneously lead, track, and resolve the target.

The video camera 1 used in the invention has an electronic shutter capable of adjusting the exposure time of each video frame. This is particularly important for both static and moving targets, since in either case the gun and/or the target can be moving. A short exposure time is often needed to avoid blurring of the target image in the video frame. Typical shutter settings of 1/500 to 1/5000 of a second are common, depending on the motion, the lighting, and the f-number of the lens.
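
The need for short exposures can be quantified with a simple blur estimate; the angular rate, field of view, and pixel count below are illustrative assumptions, not measured values.

    def blur_pixels(omega, t_exp, fov, i_pw):
        # Image blur (pixels) from angular rate omega (rad/s) over an
        # exposure t_exp (s), with fov (rad) spread across i_pw pixels.
        ifov = fov / i_pw           # angular size of one pixel
        return (omega * t_exp) / ifov

    # At 0.5 rad/s of barrel motion, a 1/1000 s shutter on a 640-pixel,
    # 0.35 rad imager blurs roughly one pixel; 1/100 s would blur ten.
    print(blur_pixels(0.5, 1 / 1000, 0.35, 640))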

The operation will now be described from the shooter's point of view. The sequence of operations is as follows:

    • 1) The shooter points the weapon 10 at the target and a button 57 is pressed to freeze the image captured by the digital camera 1, FIG. 2, which is rendered on the image display 3, FIG. 2. At this point, only an approximate aim is necessary.
    • 2) The shooter views the frozen image on the display 3, FIG. 2, and positions crosshairs controlled by the mouse 53 (FIG. 5) in the user interface 2, FIG. 2, over the desired target. The weapon 10 is still pointed approximately at the target.
    • 3) The weapon 10 is then put in auto-fire mode by pulling a pseudo trigger 10b (FIG. 1).
    • 4) The barrel gyrator 6, FIG. 2, activates and wobbles the barrel 10a, which causes its aiming direction to sweep out a sizeable area at the target distance as the shooter slowly sweeps the weapon 10 over the target. The speed of the barrel rotation is much faster than the motion caused by the shooter's natural drift, so at some point of an over-target sweep, the barrel points directly, or nearly directly, at the target.

Without intervention on the shooter's part, successive image frames are captured by camera 1 and processed by frame capture electronics 8. For each frame of data captured, the target is located, and a series of previous frames are analyzed to predict the anticipated time at which the barrel will point directly at the target in the manner described relative to FIGS. 6-9.

    • 5) At the exact instant when the barrel 10a aligns with the target, less the time delay for the trigger mechanism, and compensated for effects such as gravity drop and environmental conditions, the shot is automatically discharged by means of the trigger mechanism 9, FIG. 1.
    • 6) Finally, the projectile is launched and hits the target.

As alluded to earlier, the system can work without the barrel gyrator, though this decreases system performance and requires the shooter to bring the barrel into alignment with the target before firing. In the case where a target is static and has a stationary background, it is possible to operate the system without motion detectors; in this scenario, the optical identification part of the system is sufficient to fire accurately.

The digital camera 1 is typically a conventional visible light imaging device, but thermal infrared and night vision imaging devices can also be used with the invention.

Automatic target identification is an additional embodiment of the device that can allow the user to select targets that are identified by computer algorithms. Once identified using predefined criteria, they are presented to the user as possible targets on the image display 3.

Alternatively, the system 100 can work without a rotating muzzle gyrator 6 in cases where there is ample time to direct the weapon at the target. The above steps 1-3 are performed: a sighting is made, an image is captured, and crosshairs are placed on the target. Then the barrel 10a is aimed at the target and slowly wandered about. When the barrel 10a points directly at the target, less corrections, the processor 4 energizes the firing trigger 9 to cause the round to be automatically fired.

This invention augments a shooter's ability to pull the trigger at the exact instant needed to hit a target by substituting an electronic means. The ramifications are profound, in that it may enable the shooter to increase the chance that he will hit his target virtually every time.

A system flow processing diagram of the preferred embodiment is shown in FIG. 10. Digital video 90 of the target scene is acquired by the digital camera 1 and transferred to the computer processor 4. Once a user sees a target of interest on the display 3, a single frame of the video can be captured and displayed on the image display screen 3 for target selection (block 92). The user selects 92 the target by moving a cursor to mark the target with a region-of-interest box or similar marking method. Once the user has selected the target, an authorization to shoot 93 is given by the user by depressing a button or similar input control on the user interface.

With the target selected, the computer processor 4 uses the target's unique image properties to find the target in subsequent video frames as they are acquired by the digital camera 1. The identification and location are done using pyramiding normalized greyscale correlation, geometric pattern matching, image differencing, and other image processing techniques currently known in the art. Once a new frame of the video is acquired 94, the target location is found in the frame 96. This location is used to determine the pointing direction of the target relative to the current position of the gun.
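
A minimal single-scale stand-in for the correlation-based tracking named above, using OpenCV's matchTemplate; the description names pyramiding normalized greyscale correlation, geometric pattern matching, and image differencing, and this sketch illustrates only the simplest of these. The function name is hypothetical.

    import cv2

    def locate_target(frame_gray, template_gray):
        # Normalized grayscale correlation of the user-selected target
        # chip against the newly acquired frame.
        result = cv2.matchTemplate(frame_gray, template_gray,
                                   cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        h, w = template_gray.shape[:2]
        x, y = max_loc                            # top-left of best match
        return (x + w // 2, y + h // 2), max_val  # target center, confidence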

The current gun pointing direction can be determined by integrating the inertial motion rate sensor signals to give the pointing direction at any instant of time relative to an inertial frame of reference 95. The inertial motion rate sensors measure gun pointing direction independently of imaging data and allow for high-frequency sampling of gun pointing direction that can exceed the frequency of target tracking based on video frame rate measurements.
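
A sketch of this dead-reckoning integration, under the assumption of two-axis rate sensors sampled at known timestamps; the class and its interface are hypothetical.

    class PointingEstimator:
        # Integrates angular-rate samples (rad/s) into azimuth/elevation
        # pointing estimates relative to an inertial reference 95.
        def __init__(self, az0=0.0, el0=0.0):
            self.az, self.el, self.t_last = az0, el0, None

        def update(self, t, rate_az, rate_el):
            if self.t_last is not None:
                dt = t - self.t_last       # simple Euler integration step
                self.az += rate_az * dt
                self.el += rate_el * dt
            self.t_last = t
            return self.az, self.el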

Prediction of gun pointing direction can be made using the inertial motion sensor data 95 and, in the case of a stationary target, by also using the target tracking data 96. However, in the general embodiment of the system 100, inertial motion sensor data is the primary data source for gun pointing direction due to the low processing demands, high bandwidth performance, and other system benefits of this method of acquisition.

In the preferred embodiment, the algorithm used to predict the future gun pointing direction depends on whether or not the barrel is gyrating (decision block 97). A rate-of-change method 98 is used when there is no barrel gyration, and a drifting sinusoid method 99 is used when barrel gyration is occurring. The prediction of gun pointing direction typically needs to reach several to many milliseconds into the future to account for the data acquisition time, processing time, trigger time, ignition time, and round bore time between when the pointing data was acquired and when muzzle exit could occur.
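
The prediction horizon can be budgeted as a simple sum of the latencies named above; the millisecond figures here are placeholders, not values from the description.

    LATENCY_S = {
        "acquisition": 0.002,   # data acquisition
        "processing":  0.003,   # tracking and ballistic computation
        "trigger":     0.001,   # trigger mechanism delay
        "ignition":    0.002,   # primer/cartridge ignition
        "bore_time":   0.001,   # round travel down the bore
    }
    horizon = sum(LATENCY_S.values())   # predict ~9 ms ahead of "now"

    def predict_pointing(gyrating, t_future, rate_model, sinusoid_model):
        # Decision block 97: rate-of-change method 98 without gyration,
        # drifting sinusoid method 99 with gyration.
        model = sinusoid_model if gyrating else rate_model
        return model(t_future)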

If the target is not moving (block 100), prediction of the target position is not needed, since its position does not change. In the non-moving case 100, the system calculates (block 103) the range to the target and proceeds to the ballistics calculation 104 to determine whether the gun is currently at the correct aim point to hit the target.

In the moving target case 100, both the range to the target 101 at the time of bullet impact and the gun lead 102 required at round muzzle exit to hit the target need to be predicted. Since the true target trajectory must be known for these calculations, gun pointing information obtained from the inertial rate sensors is used to isolate target motion in the video tracking information from motion of the weapon 10. Once the target trajectory is known, the predicted range and lead can be calculated by processor 4 in the course of calculating the general ballistic solution 104.
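
A first-order illustration of the lead calculation, assuming a constant crossing velocity and an average bullet speed; the full ballistic solution 104 would also fold in gravity drop, drag, and windage, and the values below are illustrative only.

    def lead_offset(target_speed, rng, avg_bullet_speed):
        # Lead distance (m) along the target's track for a crossing target.
        time_of_flight = rng / avg_bullet_speed
        return target_speed * time_of_flight

    # A 5 m/s crossing target at 200 m, with a 900 m/s average round
    # speed, requires roughly 1.1 m of lead.
    print(lead_offset(5.0, 200.0, 900.0))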

Once a ballistic solution 104 is determined, the system 100 evaluates whether the weapon 10 is currently aimed correctly to hit the target 107. The ballistic solution can incorporate other sensor data 105 and gun/ammunition performance data 106. If the weapon's 10 aim point is within the accuracy threshold configured in the system by the user, a countdown timer 108 is set to fire 109 at precisely the time indicated by the ballistic calculations at block 104. In the preferred embodiment, the firing is by electrical means, via firing trigger 9 (FIG. 1), to allow for minimal ignition delays.

If the weapon 10 aim point is not currently at the correct position 107, the system 100 loops back to tracking gun and target motion to collect new gun pointing data and/or image tracking data.
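
The aim-check and firing loop of blocks 104-109 can be summarized as follows; `system` and its methods are hypothetical stand-ins for the components described above, not names from the description.

    def fire_control_loop(system, threshold):
        while True:
            aim_error, t_fire = system.ballistic_solution()  # block 104
            if abs(aim_error) <= threshold:                  # block 107
                system.set_countdown(t_fire)                 # block 108
                system.fire()                                # block 109
                return
            # Block 107 "no" branch: gather fresh pointing/tracking data.
            system.collect_tracking_data()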

There are several modes in which this invention can operate, and it is not essential that all the system 100 components be functioning at the same time. The more components that are functioning, however, the greater the capability. FIG. 11 diagrams the increasing system performance as more system components are exercised.

The essential group of components is the digital camera 1 and the basic system items: user interface 2, image display 3, processor 4, and frame capture electronics 8. With this configuration, each digital image frame is analyzed, allowing a determination of the barrel direction and a prediction of when the weapon will point directly at the target. In this mode, the shooter manually sweeps the gun about until it aligns with the target. This enables the hitting of stationary targets with great (beyond human) accuracy without the need for a rigid weapon support.

With the addition of the barrel gyrator 6, the weapon is mechanically oscillated, which causes the sweeping aim point to cover much more area in a given amount of time. One way to visualize this is that instead of a precise aiming point, a "spotlight" is being swept around the target area. This allows the weapon to find its target much faster, and it can be used with stationary and moving targets.

Finally, with the addition of the motion sensors 5, the instantaneous barrel motion can be detected and integrated over time to provide a barrel pointing direction. This information can be used to determine the barrel position frequently and independently of the image data, enhancing the accuracy of the system. In addition, the information can be used to efficiently target both stationary and moving targets.

Additional Considerations

1. Night Vision and Processor Aided Firing of Small Arms

Up until now, the simulated shooting of the processor aided firing of small arms system has been performed under daylight conditions. The obvious extension is to carry out experiments with technologies that enable nighttime warfare. At this point, it is proposed that a survey of existing battlefield cameras be performed. These can be thermal sensors or light amplification types. Candidate sensors could first be checked by studying their performance data and, if practical, placed in the experimental system. Actual testing can then be performed in the field.

2. An Anti-Fratricide System and Processor Aided Firing of Small Arms

Another modification of this system is to equip every squad member with an accurate Global Positioning System (GPS) receiver, a digital compass (e.g., Honeywell HMR 3000 Digital Compass Module) on the weapon, and a WLAN radio. The position of each squad member is shared with every other member over the WLAN. With the positions of all "friendlies" known, it is immediately apparent from the bearing and inclination read from the digital compass whether the weapon is pointed at a squad member. At this point, an alarm (not shown) can alert the soldier or, when incorporated into the Processor Aided Firing of Small Arms system, inhibit the firing of the weapon.
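
A sketch of the friend-in-line-of-fire test, comparing the weapon's compass bearing with the bearing to each shared GPS position; the helper names and the 2-degree tolerance are illustrative assumptions, and a fielded system would also check inclination as described above.

    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        # Initial great-circle bearing from point 1 to point 2 (degrees).
        lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
        dlon = lon2 - lon1
        y = math.sin(dlon) * math.cos(lat2)
        x = (math.cos(lat1) * math.sin(lat2)
             - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360.0

    def pointing_at_friendly(my_lat, my_lon, weapon_bearing, friendlies,
                             tol_deg=2.0):
        # friendlies: iterable of (lat, lon) positions shared over the WLAN
        for f_lat, f_lon in friendlies:
            b = bearing_deg(my_lat, my_lon, f_lat, f_lon)
            if abs((weapon_bearing - b + 180.0) % 360.0 - 180.0) <= tol_deg:
                return True   # sound the alarm or inhibit firing
        return False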

While the methods herein described, and the forms of apparatus for carrying these methods into effect, constitute preferred embodiments of this invention, it is to be understood that the invention is not limited to these precise methods and forms of apparatus, and that changes may be made in either without departing from the scope of the invention disclosed herein.

Claims

1. A weapon comprising:

a firearm having a barrel and a user interface;
a barrel oscillator for oscillating the barrel in a predetermined pattern;
an image capture device mounted on said firearm for capturing a plurality of image frames of a target and generating image data in response thereto;
at least one motion sensor mounted on said firearm for sensing a motion of the barrel and generating motion data in response thereto; and
a processor coupled to said user interface, said image capture device and said at least one motion sensor;
said processor enabling a user to select a target and, in response thereto, causing said image capture device to capture said plurality of image frames and generate said image data, which is used along with said motion data to determine a predicted target location and a coverage point where said barrel covers said target, upon which said processor may energize said firearm to fire a projectile.

2. The weapon as recited in claim 1 wherein said user interface comprises a target selector locating a cross-hair on said target.

3. The weapon as recited in claim 2 wherein said target selector comprises a track ball for positioning said cross-hair.

4. The weapon as recited in claim 1 wherein said user interface comprises a trigger mechanism for receiving a fire signal from said processor and for implementing a projectile launch signal in response thereto.

5. The weapon as recited in claim 1 wherein said firearm comprises a gun or rifle.

6. The weapon as recited in claim 1 wherein said image capture device comprises a charge-coupled device.

7. The weapon as recited in claim 1 wherein said processor comprises at least one algorithm that determines a velocity vector for said target by determining a location of said target relative to a center of an image frame in a plurality of said plurality of image frames, calculating a plurality of position vectors in response thereto, and using said position vectors to determine a displacement vector, which in turn is used to calculate said velocity vector for said target.

8. The weapon as recited in claim 1 wherein said processor comprises a barrel position detection algorithm for detecting a position of said barrel.

9. The weapon as recited in claim 8 wherein said barrel position detection algorithm receives said image data and processes said image data by applying a cubic spline fit to provide a drifting sinusoid generally corresponding to a movement of said barrel.

10. The weapon as recited in claim 7 wherein said barrel position detection algorithm receives said image data and processes said image data by applying a cubic spline fit to provide a drifting sinusoid generally corresponding to a movement of said barrel.

11. The weapon as recited in claim 1 wherein said processor generates said predicted location of said target in response to said image data and said motion data.

12. The weapon as recited in claim 1 wherein said barrel oscillator comprises a gyrator.

13. The weapon as recited in claim 12 wherein said gyrator comprises:

a bearing mounted on said barrel;
a drive motor coupled to said bearing and responsive to a drive signal from said processor, said drive motor rotatably driving said bearing to cause said barrel to gyrate in a predetermined manner.

14. The weapon as recited in claim 13 wherein said user interface comprises an electronic button for initiating a gyration sequence during which said processor generates said drive signal.

15. A weapon comprising

a firearm comprising a barrel;
an imager mounted to said barrel for capturing an image of a target area;
a user interface for displaying said image, said user interface comprising a trigger for selecting a target within said image area; and
a processor coupled to said user interface and said imager for determining a future target location of said target and for automatically firing said firearm when said barrel is positioned in a firing position such that a projectile discharged from said firearm will hit the target selected by the user.

16. The weapon as recited in claim 15 wherein said weapon further comprises:

an electronic firing trigger coupled to said processor for firing the weapon.

17. The weapon as recited in claim 15 wherein said user interface further comprises a firing authorization trigger coupled to said processor for enabling a user to authorize firing the firearm after a target has been selected but before said automatic firing of said firearm.

18. The weapon as recited in claim 15 wherein said weapon further comprises:

at least one motion sensor coupled to said processor for sensing an angular position of said barrel;
said processor comprising a barrel tracking algorithm for receiving said angular position and for predicting a future barrel position for said barrel; said processor generating

19. The weapon as recited in claim 15 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target in response thereto.

20. The weapon as recited in claim 18 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target in response thereto.

21. The weapon as recited in claim 15 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target as well as a future barrel position in response thereto.

22. The weapon as recited in claim 15 wherein said weapon further comprises a gyrator mounted to the barrel for gyrating the barrel in a generally consistent motion.

23. The weapon as recited in claim 22 wherein said gyrator comprises:

a bearing mounted on said barrel;
a drive motor coupled to said bearing and responsive to a drive signal from said processor, said drive motor rotatably driving said bearing to rotate about said barrel to cause said barrel to gyrate in said generally consistent motion.

24. The weapon as recited in claim 23 wherein said gyrator comprises a weight mounted to said bearing.

25. The weapon as recited in claim 15 wherein said firearm comprises a gun or rifle.

26. The weapon as recited in claim 15 wherein said imager is a digital camera.

27. The weapon as recited in claim 15 wherein said imager is a charge-coupled device.

28. A gyrator for gyrating a barrel of a firearm, said gyrator comprising:

a bearing for mounting on said barrel of said firearm; and
a drive motor coupled to said bearing for rotatably driving said bearing to cause an end of said barrel to gyrate.

29. The gyrator as recited in claim 28 wherein said gyrator comprises a weight mounted to said bearing.

30. A weapon comprising:

a firearm comprising a barrel; a gyrator mounted on said barrel for gyrating said barrel in a consistent motion;
an imager mounted to said firearm for capturing a plurality of images of an area;
a user interface for displaying at least one of said plurality of images, said user interface comprising a trigger for selecting a target within said at least one of said plurality of images; and
a processor coupled to said user interface, said imager and said gyrator, said processor receiving image data corresponding to one or more of said plurality of images captured and causing said firearm to automatically discharge a projectile from said firearm when said barrel is positioned in a firing position such that said projectile will hit the target selected by the user.

31. The weapon as recited in claim 30 wherein said weapon further comprises:

an electronic firing trigger coupled to said processor for firing the weapon.

32. The weapon as recited in claim 30 wherein said weapon comprises:

a user interface coupled to said processor for displaying said at least one of said plurality of images.

33. The weapon as recited in claim 32 wherein said user interface further comprises a firing authorization trigger coupled to said processor for enabling a user to authorize firing the firearm after a target has been selected but before said automatic firing of said firearm.

34. The weapon as recited in claim 30 wherein said weapon further comprises:

at least one motion sensor coupled to said processor for sensing an angular position of said barrel;
said processor comprising a barrel tracking algorithm for receiving said angular position and for predicting a future barrel position for said barrel.

35. The weapon as recited in claim 30 wherein said processor comprises a video processing algorithm for receiving data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.

36. The weapon as recited in claim 34 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said imager and for predicting a future target location of said target in response thereto.

37. The weapon as recited in claim 30 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said imager and for predicting a future target location of said target as well as a future barrel position in response thereto.

38. The weapon as recited in claim 30 wherein said gyrator comprises:

a bearing mounted on said barrel;
a drive motor coupled to said bearing and responsive to a drive signal from said processor, said drive motor rotatably driving said bearing to rotate about said barrel to cause said barrel to gyrate in said generally consistent motion.

39. The weapon as recited in claim 38 wherein said gyrator comprises a weight mounted to said bearing.

40. The weapon as recited in claim 30 wherein said firearm comprises a gun or rifle.

41. The weapon as recited in claim 30 wherein said imager is a digital camera.

42. The weapon as recited in claim 30 wherein said imager is a charge-coupled device.

43. An automatic firing system for use with a firearm, comprising:

an image capture device mounted on said firearm for capturing a plurality of images of an area in front of a muzzle end of said firearm; and
a processor coupled to said image capture device for processing data associated with said plurality of images and for determining an optimum firing time to discharge a bullet from said firearm in order to hit a target selected by a user.

44. The automatic firing system as recited in claim 43 wherein said system further comprises:

a gyrator for gyrating an end of a barrel of said firearm while said image capture device captures said plurality of images.

45. The automatic firing system as recited in claim 43 wherein said automatic firing system further comprises:

at least one motion sensor coupled to said processor for sensing an angular position of said firearm;
said processor comprising a tracking algorithm for receiving said angular position and for predicting a future position of said muzzle end in response thereto.

46. The automatic firing system as recited in claim 43 wherein said processor comprises a video processing algorithm for receiving said data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.

47. The automatic firing system as recited in claim 45 wherein said processor comprises a video processing algorithm for receiving said data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.

48. The automatic firing system as recited in claim 43 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said image capture device and for predicting a future target location of said target as well as a future barrel position in response thereto.

49. A method for increasing accuracy of hitting a target with a firearm, said method comprising the steps of:

capturing a plurality of images of a target area including the target;
processing said plurality of images to predict an optimum firing condition; and
discharging the firearm when said optimum firing condition is achieved.

50. The method as recited in claim 49 wherein said optimum firing condition is when a muzzle end of said barrel covers or leads said target such that when a projectile is discharged from the firearm, it will hit the target.

51. The method as recited in claim 49 wherein said method further comprises the step of:

using image data points generally corresponding to a muzzle end of said firearm to determine said optimum firing condition.

52. The method as recited in claim 49 wherein said method further comprises the step of:

causing a barrel of said firearm to move during said capturing step.

53. The method as recited in claim 49 wherein said method further comprises the step of:

using a plurality of motion sensors to determine a position of said target and a position of a muzzle end of said barrel in order to determine said optimum firing condition.

54. The method as recited in claim 49 wherein said method further comprises the steps of:

determining a first function representing a position of said target and a position of a muzzle end of said barrel;
determining a second function representing a barrel position of said barrel;
determining a difference between said first function and said second function to determine a location of said target.

55. The method as recited in claim 54 wherein said method further comprises the step of:

using a plurality of motion sensors to provide position data for use by a processor to calculate said first and second functions.

56. The method as recited in claim 54 wherein said method further comprises the step of:

using image data from at least one of said plurality of images to calculate said first and second functions.

57. The method as recited in claim 49 wherein said capturing step is performed with a digital camera mounted to said firearm.

58. The method as recited in claim 54 wherein said method further comprises the step of:

providing a user interface on said firearm for enabling a user to select the target from a video display.

59. The method as recited in claim 58 wherein said user interface comprises a track ball for placing cross hairs on said target.

60. The method as recited in claim 59 wherein said method further comprises the step of:

performing said capturing and processing steps while said target is moving.

61. The method as recited in claim 52 wherein said method further comprises the step of:

performing said capturing and processing steps while said target is moving.

62. The method as recited in claim 61 wherein said method further comprises the step of:

performing said causing step using a barrel gyrator.

63. The method as recited in claim 21 wherein said method further comprises the step of:

causing a barrel of said firearm to move during said capturing step.

64. A firing system that automatically launches a projectile, comprising:

a) a barreled firearm,
b) an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information,
c) motion sensors that supply the electronic processor with angular rate information,
d) an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification,
e) a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target,
f) an electronic processor that receives data from the electronic digital camera, motion sensors, and computer mouse; transmits images to the electronic display; executes barrel prediction algorithms while analyzing the motion generated by human drift and the mechanically forced motion from the barrel gyrator; and finally transmits a fire signal to the trigger mechanism,
g) a trigger mechanism, which implements the projectile launch signal generated by the electronic processor,
h) a barrel gyrator, which forces an orbital motion on the firearm, which is analyzed by the electronic processor.

65. The said barreled firearm of claim 64 is a rifle.

66. The said barreled firearm of claim 64 is a pistol.

67. The said electronic digital camera of claim 64 is a Charge Coupled Device camera.

68. The said electronic display of claim 64 is a monocular type positioned over the eye.

69. The said computer mouse of claim 64 is a miniature trackball.

70. The said electronic processor of claim 64 is capable of calculating and correcting for target range and atmospheric conditions.

71. The said electronic processor of claim 64 is a microprocessor system.

72. The said electronic processor of claim 64 is programmable logic circuitry.

73. The said barrel gyrator of claim 64 is a motor with an eccentric weight attached on its shaft.

74. The said trigger mechanism of claim 64 electrifies an electrically-ignited cartridge.

75. A firing system that automatically launches a projectile, comprising:

(a) a barreled firearm;
(b) an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information;
(c) an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification;
(d) a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target;
(e) an electronic processor that receives data from the electronic digital camera and computer mouse, transmits the data to the electronic display, runs barrel prediction algorithms while analyzing the motion generated by human drift, and transmits a fire signal to the trigger mechanism; and
(f) a trigger mechanism, which implements projectile launch by a signal from the electronic processor.
Patent History
Publication number: 20060005447
Type: Application
Filed: Sep 10, 2004
Publication Date: Jan 12, 2006
Applicant:
Inventors: Gerald Lenner (Tinton Falls, NJ), Philip Karcher (Wall, NJ)
Application Number: 10/938,321
Classifications
Current U.S. Class: 42/111.000
International Classification: F41G 1/00 (20060101);