Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios

A firearm laser training system according to the present invention accommodates various types of targets for facilitating a variety of firearm training activities. The system employs an image sensing device to detect laser beam impacts on a target, where the laser beam is projected from an actual or simulated user firearm with a laser transmitter. The image sensing device compensates for image distortions and the sensing device viewing angle with respect to the intended target, and enables detection of laser beam impacts on various types of targets (e.g., paper targets, projected targets, videos, still or moving images, etc.) to provide firearm training with varying scenarios. The image sensing device provides impact location information to a computer system to graphically display the impact locations and provide information pertaining to user performance of the training activity.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority from U.S. Provisional Patent Application Ser. No. 60/752,586, entitled “Sensing Device For Firearm Laser Training System and Method of Simulating Firearm Operation With Various Training Scenarios” and filed Dec. 22, 2005, the disclosure of which is incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION

1. Technical Field

The present invention pertains to firearm training systems, such as those disclosed in U.S. Pat. No. 6,322,365 (Shechter et al.), U.S. Pat. No. 6,616,452 (Clark et al.) and U.S. Pat. No. 6,966,775 (Kendir et al.) and U.S. Patent Application Publication Nos. 2002/0197584 (Kendir et al.) and 2005/0153262 (Kendir), the disclosures of which are incorporated herein by reference in their entireties. In particular, the present invention pertains to a firearm laser training system that accommodates various types of targets for facilitating a variety of firearm training activities.

2. Discussion of Related Art

Firearms are utilized for a variety of purposes, such as hunting, sporting competition, law enforcement and military operations. The inherent danger associated with firearms necessitates training and practice in order to minimize the risk of injury. However, special facilities are required to facilitate practice of handling and shooting the firearm. These special facilities tend to provide a sufficiently sized area for firearm training and/or confine projectiles propelled from the firearm within a prescribed space, thereby preventing harm to the surrounding environment. Accordingly, firearm trainees are required to travel to the special facilities in order to participate in a training session, while the training sessions themselves may become quite expensive since each session requires new ammunition for practicing handling and shooting of the firearm.

In addition, firearm training is generally conducted by several organizations (e.g., military, law enforcement, firing ranges or clubs, etc.). Each of these organizations may have specific techniques or manners in which to conduct firearm training and/or qualify trainees. Accordingly, these organizations tend to utilize different types of targets, or may utilize a common target, but with different scoring criteria. Further, different targets may be employed by users for firearm training or qualification to simulate particular conditions or provide a specific type of training (e.g., grouping shots, hunting, clay pigeons, etc.).

The related art has attempted to overcome the above-mentioned problems by utilizing laser or light energy with firearms to simulate firearm operation and indicate simulated projectile impact locations on targets. For example, U.S. Pat. No. 4,164,081 (Berke) discloses a marksman training system including a translucent diffuser target screen adapted for producing a bright spot on the rear surface of the target screen in response to receiving a laser light beam from a laser rifle on the target screen front surface. A television camera scans the rear side of the target screen and provides a composite signal representing the position of the light spot on the target screen rear surface. The composite signal is decomposed into X and Y Cartesian component signals and a video signal by a conventional television signal processor. The X and Y signals are processed and converted to a pair of proportional analog voltage signals. A target recorder reads out the pair of analog voltage signals as a point, the location of which is comparable to the location on the target screen that was hit by the laser beam.

U.S. Pat. No. 5,281,142 (Zaenglein, Jr.) discloses a shooting simulation training device including a target projector for projecting a target image in motion across a screen, a weapon having a light projector for projecting a spot of light on the screen, a television camera and a microprocessor. An internal device lens projects the spot onto a small internal device screen that is scanned by the camera. The microprocessor receives various information to determine the location of the spot of light with respect to the target image.

U.S. Pat. No. 5,366,229 (Suzuki) discloses a shooting game machine including a projector for projecting a video image that includes a target onto a screen. A player may fire a laser gun to emit a light beam toward the target on the screen. A video camera photographs the screen and provides a picture signal to coordinate computing means for computing the X and Y coordinates of the beam point on the screen.

International Publication No. WO 92/08093 (Kunnecke et al.) discloses a small arms target practice monitoring system including a weapon, a target, a light-beam projector mounted on the weapon and sighted to point at the target, and a camera directed toward the target. An evaluating unit is connected to the camera to determine the coordinates of the spot of light on the target. A processor is connected to the evaluating unit and receives the coordinate information. The processor further displays the spot on a target image on a display screen.

The systems described above suffer from several disadvantages. In particular, the Berke, Zaenglein, Jr. and Suzuki systems employ particular targets or target scenarios, thereby limiting the types of firearm training activities and simulated conditions provided by those systems. Further, the Berke system utilizes both front and rear target surfaces during operation. Thus, placement of the target is restricted to areas having sufficient space for exposure of those surfaces to a user and the system. In addition, the Berke and Kunnecke et al. systems merely display impact locations to a user, thereby requiring a user to interpret the display to assess user performance during an activity. The user assessment is typically limited to the information (impact locations) provided on the display, thereby restricting feedback of valuable training information to the user and limiting the training potential of the system.

OBJECTS AND SUMMARY OF THE INVENTION

Accordingly, it is an object of the present invention to accommodate various types of targets within a firearm laser training system to conduct varying types of training, qualification and/or entertainment activities.

It is another object of the present invention to employ an image sensing device with a firearm laser training system that detects beam impact locations on a target and compensates for various orientations and viewing angles of the device relative to the target.

Yet another object of the present invention is to employ user-specified targets within a firearm laser training system to conduct desired training procedures.

A further object of the present invention is to assess user performance within a firearm laser training system by determining scoring and/or other performance information based on detected impact locations of simulated projectiles on a target.

The aforesaid objects may be achieved individually and/or in combination, and it is not intended that the present invention be construed as requiring two or more of the objects to be combined unless expressly required by the claims attached hereto.

According to the present invention, a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities. The system employs an image sensing device to detect laser beam impacts on a target, where the laser beam is projected from an actual or simulated user firearm equipped with a laser transmitter. The image sensing device compensates for image distortions and the sensing device viewing angle with respect to the intended target, and enables detection of laser beam impacts on various types of targets (e.g., paper targets, projected targets, videos, still or moving images, etc.) to provide firearm training with varying scenarios. The image sensing device provides impact location information to a computer system to graphically display the impact locations and provide information pertaining to user performance of the training activity.

The above and still further objects, features and advantages of the present invention will become apparent upon consideration of the following detailed description of specific embodiments thereof, particularly when taken in conjunction with the accompanying drawings wherein like reference numerals in the various figures are utilized to designate like components.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a view in perspective of a firearm laser training system with a laser beam directed from a firearm onto an intended target according to the present invention.

FIG. 2 is an exploded view in perspective and partial section of a laser transmitter assembly of the system of FIG. 1 fastened to a firearm barrel.

FIG. 3 is a schematic block diagram of an image sensing device according to the present invention.

FIG. 4 is a procedural flow chart illustrating the manner in which the image sensing device is calibrated according to the present invention.

FIGS. 5-6 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 for calibrating the image sensing device.

FIG. 7A is a side view in elevation and partial section of a target including an image sensing device to detect laser beam impact locations according to the present invention.

FIG. 7B is a front view in elevation of a target including a plurality of image sensing devices each associated with a target section to detect laser beam impact locations according to the present invention.

FIG. 8 is a procedural flow chart illustrating the manner in which the system of FIG. 1 processes and displays laser beam impact locations according to the present invention.

FIG. 9 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during system operation.

FIG. 10 is a schematic illustration of an alternative exemplary graphical user screen displayed by the system of FIG. 1 during system operation.

FIG. 11 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a trace mode.

FIG. 12 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a plot mode.

FIG. 13 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a shotgun course of fire.

FIG. 14 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a course of fire.

FIGS. 15-17 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 during operation of the system with various targets.

FIG. 18 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a target in the form of a video segment.

FIG. 19 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of an indoor firing range.

FIG. 20 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a live firing range.

FIG. 21 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a skeet shoot session.

DETAILED DESCRIPTION OF THE INVENTION

A firearm laser training system that accommodates various types of targets according to the present invention is illustrated in FIG. 1. Specifically, the firearm laser training system includes a laser transmitter assembly 200, a target 10, an image sensing device 16 and a computer system 18. The laser assembly is attached to an unloaded user firearm 76 to adapt the firearm for compatibility with the training system. By way of example only, firearm 76 is implemented by a conventional hand-gun and includes a trigger 77, a barrel 78, a hammer 79 and a grip 85. However, the firearm may be implemented by any conventional firearm (e.g., hand-gun, rifle, shotgun, etc.), while the laser and firearm combination may be implemented by any of the simulated firearms disclosed in the above-mentioned patents and patent application publications. Laser assembly 200 emits a beam 11 of visible laser light in response to actuation of trigger 77. The laser assembly is configured for insertion within barrel 78 to fasten the laser assembly to the barrel as described below.

A user aims unloaded firearm 76 at target 10 and actuates trigger 77 to project laser beam 11 from laser assembly 200 toward the target. Sensing device 16 detects the laser beam impact location on the target and provides location information to computer system 18. Target 10 may be in the form of a paper target, a television or computer monitor, or an LCD or other panel or display displaying images serving as a target. The computer system processes the location information and displays simulated projectile impact locations on the target via a graphical user screen (e.g., FIGS. 9-21) as described below. The system may further include a projector 17 to provide targets in the form of projected images 14. Sensing device 16 detects the laser beam impact location on the projected image and provides location information to computer system 18 as described above. The computer system processes the location information and displays simulated projectile impact locations on the projected image via a graphical user screen (e.g., FIGS. 15-21) as described below. In addition, the computer system may determine scoring and other information based upon the performance of a user.

The system may be utilized with various types of targets to facilitate firearm training and/or qualifications (e.g., certification to a particular level or to use a particular firearm). The system may additionally be utilized for entertainment purposes (e.g., in target shooting games or sporting competitions). By way of example only, target 10 may be implemented by a two-dimensional target, preferably constructed of paper or other material, and attached to or suspended from a supporting structure, such as a wall. The target preferably includes indicia forming a transitional type target having a silhouette of a person with several sections or zones (e.g., typically between five and seven) defined therein. The target sections may each be assigned a value in order to determine a score for a user. The sections and values typically vary based on the system application and/or particular organization (e.g., military, law enforcement, firearm club, etc.) utilizing the system. Further, plural target sections (e.g., contiguous or non-contiguous) may be associated with a common value, while each section may be of any shape or size. The score is determined by accumulating the values of the target sections impacted by the laser beam during the firearm activity. The values of the target sections may further be multiplied by a scoring factor set by the system and/or the user to accommodate various scoring schemes utilized by different organizations. Alternatively, the target may include an image produced by a projector, a television or computer screen and/or an LCD or other display panel, where these targets may similarly include sections or zones each assigned a corresponding value as described above.

The computer system receives the beam impact locations from the sensing device and retrieves the section values corresponding to the impact locations as described below. Section values for each beam impact are accumulated to produce a score for a user. The target may be of any shape or size, may be constructed of any suitable materials and may include any indicia to provide any type of target for facilitating any type of training, qualification, gaming, entertainment or other activity. Moreover, the system may utilize any conventional, simulated or “dry fire” type firearms (e.g., hand-gun, rifle, shotgun, firearms powered by air/carbon dioxide, etc.), or firearms utilizing blank cartridges such as those disclosed in the above-mentioned patents and patent application publications, for projecting a laser beam to provide full realism in a safe environment.

An exemplary laser transmitter assembly employed by the training system is illustrated in FIG. 2. Specifically, laser transmitter assembly 200 includes a barrel sleeve 202, a power source 210, typically in the form of batteries, a laser module 211 and a sleeve cap 216. The barrel sleeve includes a generally cylindrical barrel member 204 and a threaded stop 206 disposed at the barrel member distal end. The transverse cross-sectional dimensions of the barrel member are slightly less than those of barrel 78 to enable the barrel member to be inserted within the barrel. The barrel member includes an adjustment member 203 disposed at the barrel member proximal end. The adjustment member is typically in the form of a screw and adjusts the barrel member dimensions in response to manipulation by a user. The adjustment member alters the barrel member dimensions to enable the barrel member to frictionally engage the barrel and provide a snug fit. Stop 206 is in the form of a substantially annular ring and has dimensions slightly greater than those of the barrel member and barrel to limit insertion of the sleeve within the barrel. The stop outer surface includes threads facilitating engagement with sleeve cap 216 as described below. Power source 210 has dimensions sufficient for insertion within the barrel sleeve to provide power to laser module 211.

Laser module 211 includes a substantially cylindrical housing 212 including therein a mechanical wave sensor (not shown) and an optics package (not shown) including a laser and a lens. These components may be arranged within the housing in any suitable fashion. The optics package emits a laser beam through the lens toward an intended target in response to detection of hammer fall by the mechanical wave sensor. Specifically, when trigger 77 is actuated, hammer 79 impacts the firearm frame. The impact generates a mechanical wave which travels distally along barrel 78 toward barrel sleeve 202. As used herein, the term “mechanical wave” or “shock wave” refers to an impulse (e.g., acoustic wave, shock wave, vibration along the barrel, etc.) traveling through the barrel. The mechanical wave sensor within the laser module senses the mechanical wave and generates a trigger signal. The mechanical wave sensor may include a piezoelectric element, an accelerometer or a solid state sensor, such as a strain gauge. The optics package within the laser module generates and projects a laser beam from firearm 76 in response to the trigger signal. The optics package laser is generally enabled for a predetermined time interval sufficient for image sensing device 16 to detect a beam impact. The beam may be modulated, coded or pulsed in any desired fashion. Alternatively, the laser module may include an acoustic sensor to sense actuation of the trigger and enable the optics package. The laser module is similar in function to the laser devices disclosed in the aforementioned patents and patent application publications. The laser assembly may be constructed of any suitable materials and may be fastened to firearm 76 at any suitable locations by any conventional or other fastening techniques.

The optics package of laser module 211 is generally disposed toward the housing distal end, while the mechanical wave sensor is typically disposed toward the housing proximal end to detect the mechanical wave. Sleeve cap 216 is substantially cylindrical having an open proximal end and a closed distal end. The cap includes threads disposed on its interior surface to engage threads of stop 206. The closed distal end of the cap includes a substantially central opening 220 defined therein to enable a laser beam emitted by the laser module to pass through the cap.

In operation, power source 210 and laser module 211 are inserted into barrel sleeve 202. Sleeve cap 216 is subsequently attached to stop 206 via their respective threaded portions. The barrel sleeve is inserted into the barrel, preferably until stop 206 contacts the barrel distal end. The barrel sleeve dimensions may be selectively adjusted by manipulation of the adjustment member as described above to provide a secure fit. Laser transmitter assembly 200 basically emits a laser beam from laser module 211 through opening 220 of cap 216 in response to firearm actuation as described above.

Referring back to FIG. 1, computer system 18 is coupled to and receives and processes information from sensing device 16 to provide various feedback to a user. The computer system is typically implemented by a conventional IBM-compatible laptop or other type of personal computer (e.g., notebook, desk top, mini-tower, Apple Macintosh, palm pilot, etc.) preferably equipped with display or monitor 34, a base 32 (e.g., including the processor, memories, and internal or external communication devices or modems) and a keyboard 36 (e.g., in combination with a mouse or other input devices). Computer system 18 includes software to enable the computer system to communicate with sensing device 16 and provide feedback to the user. The computer system may utilize any of the major platforms (e.g., Linux, Macintosh, Unix, OS2, etc.), but preferably includes a Windows environment (e.g., Windows 95, 98, NT, 2000, XP, etc.). Further, the computer system includes components (e.g., processor, disk storage or hard drive, etc.) having sufficient processing and storage capabilities to effectively execute the system software.

Computer system 18 is connected to sensing device 16 via an Ethernet type connection. The sensing device may be mounted on a tripod and positioned at a suitable distance from the target. However, any type of mounting or other structure may be utilized to support the sensing device. The sensing device detects the location of beam impacts on the target (e.g., by capturing an image of the target and detecting the location of the beam impact from the captured image) and provides impact location information in the form of X and Y coordinates to computer system 18. The sensing device performs a calibration to correlate the captured image space with the target space and/or display space as described below. A printer (not shown) may further be connected to the computer system to print reports containing user feedback information (e.g., score, hit/miss information, etc.).
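By way of illustration only, a minimal Python sketch of the computer-system side of this link is shown below. The patent specifies only an Ethernet type connection, so the device address, the TCP port and the line-oriented "x,y" wire format are assumptions made for the example.

    # Minimal sketch of receiving impact coordinates from the sensing device.
    # The address, port and newline-delimited "x,y" wire format are
    # assumptions; the patent specifies only an Ethernet type connection.
    import socket

    SENSOR_HOST = "192.168.1.50"   # hypothetical sensing device address
    SENSOR_PORT = 5000             # hypothetical port

    def receive_impacts():
        """Yield (x, y) impact coordinates as they arrive from the sensing device."""
        with socket.create_connection((SENSOR_HOST, SENSOR_PORT)) as sock:
            buffer = b""
            while True:
                chunk = sock.recv(1024)
                if not chunk:
                    break  # device closed the connection
                buffer += chunk
                while b"\n" in buffer:
                    line, buffer = buffer.split(b"\n", 1)
                    x_str, y_str = line.decode().split(",")
                    yield int(x_str), int(y_str)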

The system may be utilized with various types of targets. Target characteristics are contained in several files that are stored by computer system 18. In particular, a desired target may be photographed and/or scanned prior to system utilization to produce several target files and target information. Alternatively, images of user generated targets may be captured via sensing device 16 and optionally manipulated to form a target image, while computer system 18 or other computer system (e.g., via the training system or conventional software) may be utilized to produce the target files and target information for use by the system. A target file includes a parameter file, a display image file, a scoring image file and a print image file. The parameter file includes information to enable the computer system to control system operation. By way of example only, the parameter file includes the filenames of the display, scoring and print image files, a scoring factor and cursor information (e.g., grouping criteria, such as circular shot group size). The display and print image files include an image of the target scaled to particular sections of the monitor and report containing that image, respectively. Indicia, preferably in the form of substantially circular icons, are overlaid on these images to indicate beam impact locations (e.g., FIGS. 9-10 and 14), and typically include an identifier to indicate the particular shot (e.g., the position number of the shot within a shot sequence). The dimensions of the indicia may be adjusted to simulate different ammunition or firearm calibers entered by a user.
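By way of illustration only, the target file set described above might be represented as follows; the field and file names are assumptions, since the patent names the contents of the parameter file but does not prescribe a concrete format.

    # Sketch of one representation of a target file set. Field and file
    # names are assumptions; the patent specifies the contents (display,
    # scoring and print image filenames, scoring factor and cursor/grouping
    # criteria) but not a format.
    from dataclasses import dataclass

    @dataclass
    class TargetParameters:
        display_image: str      # target image scaled to the monitor
        scoring_image: str      # color-coded zone image used as a look-up table
        print_image: str        # target image scaled for printed reports
        scoring_factor: float   # multiplier applied to zone values
        group_circle_cm: float  # grouping criterion (circular shot group size)

    m9_target = TargetParameters(
        display_image="m9_display.bmp",   # hypothetical filenames
        scoring_image="m9_scoring.bmp",
        print_image="m9_print.bmp",
        scoring_factor=1.0,
        group_circle_cm=4.0,
    )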

The scoring image file includes a scaled scoring image of the target having scoring sections or zones shaded with different colors. Any variation of colors may be utilized, and each color is associated with corresponding information for that zone. The zone information typically includes scoring values, but may include any other types of activity information (e.g., target number, desirable/undesirable hit location, priority of hit location, friend/foe, etc.). When impact location information is received from the sensing device, computer system 18 utilizes that information to access a corresponding location within the scoring image. The sensing device may correlate the captured image space directly to the display, scoring and/or print image files. In this case, the received coordinates may be used to directly access the corresponding locations in these files. Alternatively, the computer system may perform a translation on the received coordinates to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations.

The color associated with the image location identified by the location information indicates a corresponding zone and/or scoring value. In effect, the colored scoring image functions as a look-up table to provide a zone value based on the location within the scoring image pertaining to a particular beam impact location. The scoring value of an impact location may be multiplied by a scoring factor within the parameter file to provide scores compatible with various organizations and/or scoring schemes. Thus, the scoring of the system may be adjusted by modifying the scoring factor within the parameter file and/or the scoring zones on the scoring image within the scoring image file. Alternatively, when other activity information is associated with the zones, the scoring image file may indicate occurrence of various events (e.g., hit/miss of target locations, target sections impacted based on priority, hit friend or foe, etc.) in substantially the same manner described above.
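By way of illustration only, the color-keyed look-up described above may be sketched as follows using the Pillow imaging library; the zone colors and values are illustrative, since each organization would supply its own scoring image and value assignments.

    # Sketch of the color-keyed zone look-up: the pixel color at the
    # translated impact coordinates selects the zone value, which is then
    # multiplied by the scoring factor. Colors and values are illustrative.
    from PIL import Image

    ZONE_VALUES = {                # hypothetical color-to-value table
        (255, 0, 0): 10,           # red zone
        (0, 0, 255): 5,            # blue zone
        (0, 255, 0): 0,            # green zone (miss region)
    }

    def score_impact(scoring_image_path, x, y, scoring_factor=1.0):
        """Return the score for a beam impact at translated coordinates (x, y)."""
        image = Image.open(scoring_image_path).convert("RGB")
        color = image.getpixel((x, y))
        return ZONE_VALUES.get(color, 0) * scoring_factor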

In addition, the target files typically include a second display file containing a scaled image of the target. The dimensions of this image are substantially greater than those of the image contained in the initial display image file, and the second display file is preferably utilized to display a target having plural independent target sites. The target files along with scaling and other information (e.g., target range information input by user) are stored on computer system 18 for use during system operation. Thus, the system may readily accommodate any type of target without interchanging system components. Moreover, target files may be downloaded from a network, such as the Internet, and loaded into the computer system to enable the system to access and be utilized with additional targets.

Image sensing device 16 detects laser beam impact locations on the target and provides the X/Y or Cartesian coordinates of the impact location to computer system 18 for processing. The image sensing device is illustrated in FIG. 3. Specifically, image sensing device 16 includes an image sensor 102, an image buffer 104, memories 106, 108 and 110, a Field Programmable Gate Array (FPGA) 112 controlling sensing device functions, an Ethernet controller 118 providing Ethernet connections and a power supply 126 providing power for the sensing device. Image sensor 102 captures images of an intended target and employs charge-coupled device (CCD) or CMOS technology. The image sensor repeatedly captures an image of the target, preferably through a lens 131, and provides target image information to FPGA 112 for processing. The FPGA may store image information in image buffer 104.

Image sensor 102 preferably provides image data in monochromatic form (e.g., gray scale or black and white, where red (R), blue (B) and green (G) pixel values are substantially the same); however, the image sensor may alternatively provide full color images (e.g., including red (R), blue (B) and green (G) pixel values). The sensing device may further employ a band pass type filter 129 to enable light from lens 131 having the wavelength of the laser transmitter (e.g., approximately 650 nanometers) to pass to the image sensor, while suppressing the ambient light. The lens initially focuses and provides ambient light in parallel rays for filtering by filter 129.

The image characteristics of the image sensor enable the device to capture images of the target including any changes to the target (e.g., beam impacts) occurring between successive frame transmissions. Thus, the sensing device facilitates detection of beam impacts from laser transmitters with pulse durations shorter than the frame period of the image sensor (e.g., pulse durations as low as approximately 1.5 milliseconds). Since the sensing device can accommodate low pulse durations, the sensing device may be universally employed with various laser transmitters.

FPGA 112 controls operation of sensing device 16 and is coupled to image sensor 102, image buffer 104, memories 106, 108 and 110, a system clock 114, a reset switch 116, Ethernet controller 118 and I/O ports 124, 128. The FPGA includes a combination of hardware and software to perform image processing and control of sensing device operation. In particular, the FPGA performs sensor timing control, exposure control, impact detection, memory control and system control functions. The sensor timing control relates to control of the image sensor for capturing images and transferring image information to FPGA 112 for processing. Exposure control relates to an electronic shutter function or, in other words, controlling the amount of time the image sensor is exposed to ambient light, while impact detection refers to determining impacts on the target from the captured image information as described below. The memory and system control functions relate to controls for the memories and overall operation of the sensing device.

System clock 114 provides a clock signal for the control functions, while reset switch 116 enables a user to reset or reboot the sensing device. I/O ports 124, 128 enable the sensing device to control external devices (e.g., laser transmitter, audio and visual indicators, etc.) to simulate return fire and/or provide audio and/or visual indications to a user.

In addition, the FPGA may perform image decimation and hit synchronization functions. Image decimation relates to generation of lower resolution images (e.g., selection of a portion of pixels from an image) to enhance transfer rates over a network. Hit synchronization is employed when sensing device 16 is utilized with or includes a signature detector 127 to detect signatures within an emitted laser beam. Each laser beam may include a signature to identify the particular user transmitting that beam. The signatures may be embedded within the laser beam in any desired fashion (e.g., modulation, pulse widths, etc.), while the signature detector may be implemented by any conventional or other devices to detect patterns or signatures within the transmitted laser signal. The FPGA determines the location of beam impacts as described below, while signature detector 127 determines the user providing the beam impact, thereby validating the beam impact as being from an authorized user (e.g., as opposed to registering a beam impact detection from ambient light). The signature detector further enables detection of beam impacts in the presence of significant levels of ambient light (e.g., sunlight) since an impact is registered in response to detection of the signature. In effect, the signature detector serves as a filter for detected beam impacts to eliminate or minimize false detections due to ambient light. In the case where signature detector 127 is employed, the FPGA coordinates detection of impact locations with verification of laser beam signatures by the signature detector to associate impacts with particular or verified users.
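By way of illustration only, signature validation may be sketched as follows; the on/off pulse patterns are assumptions, since the patent leaves the embedding scheme (e.g., modulation, pulse widths) open.

    # Sketch of signature validation: an impact is credited to a shooter only
    # when the detected pulse pattern matches a registered signature; an
    # unmatched detection is treated as ambient light. Patterns are assumed.
    SIGNATURES = {
        "shooter_1": (1, 0, 1, 1),   # hypothetical on/off pulse patterns
        "shooter_2": (1, 1, 0, 1),
    }

    def identify_shooter(detected_pattern):
        """Return the shooter whose signature matches, or None for ambient light."""
        for shooter, pattern in SIGNATURES.items():
            if tuple(detected_pattern) == pattern:
                return shooter
        return None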

Memory 106 is preferably implemented by a flash type memory and stores software for FPGA 112 to perform image processing and/or control functions. Memory 108 is preferably implemented by a synchronous dynamic random access memory (SDRAM) and stores various settings for calibration and other functions, while memory 110 is preferably implemented by a flash type memory and stores calibration data. Thus, when the sensing device is calibrated as described below, the resulting data is stored in sensing device 16. Since memories 106 and 110 are non-volatile, the calibration data is retained and the sensing device may be continually re-used without a re-calibration (e.g., unless the position of the sensing device or target has changed).

Sensing device 16 communicates with computer system 18 via an Ethernet or other network type connection. Ethernet controller 118 is coupled to FPGA 112 and to an Ethernet interface 120. The Ethernet controller controls communications with the computer system and is coupled to an Ethernet connector 122 via Ethernet interface 120. These components may be implemented by any conventional or other network components to provide communications with the computer system via any suitable protocols (e.g., Ethernet, etc.).

Sensing device 16 may process images from various viewing angles and orientations, compensating for deformations in the captured image due to the device position and/or lenses employed over the image sensor (e.g., fisheye, etc.). In order to compensate for the deformations, a calibration is performed as illustrated in FIGS. 4-6. Initially, the sensing device is coupled to the computer system with the computer system being initialized for communications (e.g., assigning a network address to the sensing device, projector, etc.) at step 150. A target (e.g., paper, projected image, television or computer screen, LCD panel, etc.) is provided at step 152, and a calibration icon on the computer system display is actuated. In the case of a projected image, the projector is initialized to provide the image at a desired location and with a suitable alignment, size and/or form. The computer system verifies connection of the sensing device and subsequently displays a calibration graphical user screen 130 (FIGS. 5-6) at step 154.

Screen 130 includes a viewing area 132, instruction areas 133, a gain slide 136, and buttons 137, 138 that respectively provide help information and enable continuation of the calibration. Instruction areas 133 provide instructions to a user, while viewing area 132 displays an image of the target from sensing device 16 and a grid 134 to identify image boundaries for the calibration. The grid is substantially rectangular and includes a series of labeled points along the grid perimeter. By way of example only, the grid includes eight labeled points 1-8 identifying locations of the target image to the sensing device. Four points are located at respective corners of the grid, while the remaining four points are located at the approximate midpoints between the grid corners along each of the longer and shorter edges of the grid. An additional point for the calibration is located at the center of the grid. These grid points correspond to particular locations (e.g., top left corner, top middle, top right corner, etc.) or coordinates within the target or an ideal target image. For example, point 1 may correspond to pixel coordinates 0,0 within an ideal image. These ideal image points are used in the calibration as described below.

Initially, gain slide 136 is manipulated to enable the target image captured by the sensing device to be visible in viewing area 132 at step 156. The sensing device position and/or lens may further be adjusted to enable the target to fill the viewing area and/or to adjust the focus. Once the target is viewable in viewing area 132, the user manipulates each perimeter point on the grid in sequence to coincide with the corresponding location on the target within the captured image in viewing area 132 at step 158 (FIG. 6). The grid center point is subsequently manipulated to coincide with the target center in the captured image. These points indicated by the user identify points on the distorted image corresponding to the grid points and are utilized in the calibration as described below. If the target in the captured image is rotated or otherwise displaced by the viewing angle and/or position of the sensing device, the user indication of the grid points in the captured image in combination with the corresponding locations of the grid points in an ideal or undistorted image enable the system to detect and compensate for the displacement as described below.

Once the grid points are aligned with the captured image, the sensing device receives the target image boundaries from the computer system and the FPGA processes the target image to determine a translation matrix at step 160 that compensates for the angle and/or orientation of the sensing device and distortion produced by lens 131 within the captured image. The translation matrix maps each image pixel to a corresponding coordinate in the target and/or display spaces (e.g., computer system display, etc.). The primary causes of errors with respect to detection of beam impacts are radial distortion, caused by the sensing device lens providing a magnification at the center of the image slightly greater than at the edges, and perspective distortion, caused by the sensing device position and angle relative to the target. The calibration technique produces the translation matrix to compensate for the position and angle of the sensing device and to correct the distortions caused by the lens.

The user manipulates the grid to identify each of the points on the distorted image as described above. The real or actual coordinates of the calibration points (e.g., the coordinates of the grid points in an ideal image) and the coordinates on the distorted image (e.g., indicated by the user) are used to calculate distortion parameters. The radial and perspective distortions of the image may be modeled by conventional techniques, such as those disclosed in R. Y. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-The-Shelf TV Cameras and Lenses”, IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, the disclosure of which is incorporated herein by reference in its entirety.

The position and orientation of the sensing device relative to the target and radial distortion parameters are estimated using the known locations (e.g., the points indicated by the user on the distorted image and the locations of the grid points in an ideal image). This estimation may be determined according to conventional techniques, such as those disclosed in D. G. Bailey, “A New Approach to Lens Distortion Correction”, Proceedings of Image and Vision Computing, New Zealand, pp. 59-64, November 2002, the disclosure of which is incorporated herein by reference in its entirety.

The perspective distortion is corrected by using a projective transformation. Subsequently, the lens distortion is corrected using the determined radial distortion parameters. The coordinates obtained from this modeling are compared with corresponding coordinates measured from the image to provide an error term. The distortion parameters are adjusted to minimize this error. This may be accomplished in accordance with conventional techniques, such as those disclosed in G. Vass et al., “Applying and Removing Lens Distortion in Post Production”, The Second Hungarian Conference on Computer Graphics and Geometry, 2003, the disclosure of which is incorporated herein by reference in its entirety. The resulting coordinates with minimal error are placed within the translation matrix.
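By way of illustration only, the projective (perspective) portion of this calibration may be sketched as follows: a 3x3 homography is fit to the user-indicated grid points and their ideal-image counterparts by direct linear transformation. The radial (lens) correction, handled by the cited techniques, is omitted from the sketch for brevity.

    # Sketch of fitting the perspective part of the translation matrix from
    # the nine grid-point correspondences (direct linear transformation).
    # Radial lens correction, per the cited references, is omitted here.
    import numpy as np

    def fit_homography(distorted_pts, ideal_pts):
        """Estimate H mapping distorted image points to ideal target points."""
        rows = []
        for (x, y), (u, v) in zip(distorted_pts, ideal_pts):
            rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
            rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
        _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
        return vt[-1].reshape(3, 3)   # least-squares null-space solution

    def translate(H, x, y):
        """Map a detected impact pixel into target/display coordinates."""
        u, v, w = H @ np.array([x, y, 1.0])
        return u / w, v / w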

Once the FPGA detects a beam impact within a captured image, the corresponding coordinates from the translation matrix translate the impact to target and/or display spaces and are provided to the computer system for processing. The translation matrix is stored in the sensing device for future use (without a re-calibration). The sensing device (or translation matrix) may correlate the captured image space directly to the display, scoring and/or print image files as described above. In this case, the coordinates provided by the sensing device may be used to directly access the corresponding locations in these files. Alternatively, the computer system may perform a translation on the coordinates received from the sensing device to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations. Further, the FPGA may perform the calibration with respect to actual pixel coordinates, or with respect to distances from reference points (e.g., where the quantity of pixels per millimeter or other length unit may be utilized).

In addition, the sensing device further performs a light sensitivity calibration to adjust for ambient light conditions at step 162. The user may select high, medium or low sensitivity. The FPGA processes the target image pixel intensities (e.g., red (R), green (G) and blue (B) pixel values) to measure the ambient light and determines the thresholds described below for detecting beam impacts. Basically, the FPGA determines a threshold value for each of a plurality of regions within the target image. By way of example, the present invention utilizes forty-eight regions; however, any quantity of regions at any locations within the image may be utilized. The threshold value for each region is determined from the sum of a maximum luminance or pixel intensity value of pixels within that region and an offset. The offset is based on a desired light sensitivity indicated by a user as described above. Since target images are being repeatedly captured and transmitted to the FPGA, certain captured target images may not contain any beam impact detections. Accordingly, the thresholds for each region basically control the system sensitivity to the emitted beam in relation to the ambient light, and enable the system to determine the presence of a beam impact within a corresponding region of the captured target image.
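By way of illustration only, the per-region threshold computation may be sketched as follows; the offset values attached to the high, medium and low sensitivity settings are assumptions.

    # Sketch of the light sensitivity calibration: the frame is divided into
    # forty-eight regions (6 x 8 here) and each region's threshold is its
    # maximum pixel intensity plus a user-selected sensitivity offset.
    import numpy as np

    SENSITIVITY_OFFSETS = {"high": 10, "medium": 25, "low": 50}  # assumed values

    def region_thresholds(gray_frame, rows=6, cols=8, sensitivity="medium"):
        """Return a (rows x cols) array of detection thresholds for one frame."""
        offset = SENSITIVITY_OFFSETS[sensitivity]
        h, w = gray_frame.shape
        thresholds = np.empty((rows, cols))
        for r in range(rows):
            for c in range(cols):
                region = gray_frame[r * h // rows:(r + 1) * h // rows,
                                    c * w // cols:(c + 1) * w // cols]
                thresholds[r, c] = region.max() + offset
        return thresholds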

Once the calibrations are complete, the resulting information is stored in sensing device memory 110 at step 164 to enable the sensing device to be re-used without performing the calibration (e.g., the calibration is performed in response to a change in position of the target and/or sensing device).

Sensing device 16 may be utilized within various targets to detect laser beam impacts on those targets as illustrated in FIG. 7A. By way of example, a target 15 may include a housing 23, sensing device 16 and a diffuser 19. The housing may be of any shape or size, and may be in the form of any suitable target to simulate a scenario (e.g., silhouette, bull's eye, standard targets for military or law enforcement applications, etc.). For example, the target may be of the types disclosed in the aforementioned U.S. Pat. No. 6,322,365 (Shechter et al.) and U.S. Patent Application Publication No. 2005/0153262 (Kendir). The sensing device is calibrated as described above with diffuser 19 serving as a target surface and disposed in housing 23 at any suitable location. The housing includes an opening 25 with diffuser 19 disposed over or within that opening to serve as an impact location for target 15. Sensing device 16 captures target images and determines beam impact locations in substantially the same manner described above to provide beam impact coordinates to a computer system via a wired or wireless network connection.

Since the sensing device provides a relatively small amount of information, several targets or sensing devices may be coupled to the same computer system without significantly affecting network traffic or bandwidth. Accordingly, a large target may include several sensing devices as illustrated in FIG. 7B. By way of example, target 15 may include a plurality of sensing devices 16 each associated with a corresponding target section and coupled to a network switch 27, preferably employing an Ethernet protocol. The network switch may be implemented by any conventional or other network device and may utilize any suitable communications protocol. Target housing 23 may include a single large diffuser 19 serving as a housing side to receive beam impacts. Alternatively, a housing side may include a plurality of openings 25 with a corresponding diffuser 19 disposed over or within each opening. Each sensing device is calibrated and placed in the target housing to detect beam impacts within a corresponding section or diffuser as described above. The sensing devices transfer coordinates from detected beam impacts to network switch 27 for transference to a computer system for processing as described above.
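By way of illustration only, combining impacts from plural sensing devices may be sketched as follows; the device addresses and section offsets are assumptions.

    # Sketch of merging impacts from several sensing devices: each device
    # covers one target section, so its local coordinates are offset into
    # the full-target frame before scoring. Addresses and offsets are assumed.
    SECTION_OFFSETS = {
        "10.0.0.11": (0, 0),      # hypothetical device covering the left section
        "10.0.0.12": (640, 0),    # hypothetical device covering the right section
    }

    def to_target_coords(device_address, x, y):
        """Translate a device-local impact into whole-target coordinates."""
        dx, dy = SECTION_OFFSETS[device_address]
        return x + dx, y + dy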

Computer system 18 includes software to control system operation and provide graphical user interfaces for displaying user performance. The manner in which the computer system monitors beam impact locations and provides information to a user is illustrated in FIGS. 8-9. Initially, computer system 18 (FIG. 1) facilitates performance of calibrations for the sensing device and ambient light conditions as described above at step 40.

Once the calibrations are completed, a user may commence projecting the laser beam from the firearm toward a target (e.g., paper, projected image, television or computer screen, LCD panel, etc.). Sensing device 16 captures target images at step 42 and processes the captured target images, via FPGA 112, to determine a beam impact location at step 44. Specifically, each captured target image received by the sensing device includes a plurality of pixels each associated with red (R), green (G) and blue (B) values to indicate the color and luminance of that pixel. In the case of gray scale (e.g., black and white) images, the values for red (R), green (G) and blue (B) are substantially the same. The red, green and blue values for each pixel are multiplied by a respective weighting factor and summed to produce a pixel density. In other words, the pixel density may be expressed as follows:
Pixel Density=(R×Weight1)+(G×Weight2)+(B×Weight3)
where Weight1, Weight2 and Weight3 are weighting values that may be selected in any fashion to enable the system to identify beam impact locations within the captured target images. The respective weights may have the same or different values and may be any types of values (e.g., integer, real, etc.).

The beam impact location is considered to occur within a group of pixels within a captured image where each group member has a density value exceeding a threshold. This threshold is determined from the light sensitivity calibration described above and corresponds to the region containing the beam impact. Typically, the group of pixels containing or representing the beam impact form an area or shape. The pixel at the center of the area or shape formed by the pixel group is considered by the system to contain, or represent the location of, a beam impact. If the density value of each captured image pixel is less than the threshold, the captured target image is not considered to include a beam impact. The threshold basically controls the system sensitivity to the emitted beam in relation to the ambient light, and enables the system to determine the presence of a beam impact within a captured target image. When the computer system identifies a pixel containing a beam impact, the coordinates (e.g., X and Y coordinates) of that pixel within the captured target image are translated by the translation matrix to coordinates within the target space and/or display space (e.g., computer display, etc.) and provided to the computer system for processing at step 46.
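By way of illustration only, the detection step may be sketched as follows; a single scalar threshold is used for brevity, whereas the system described above applies the per-region thresholds from the light sensitivity calibration, and the weighting values are assumptions.

    # Sketch of impact detection: weighted pixel densities are thresholded
    # and the center of the above-threshold pixel group is reported. The
    # R/G/B weights are assumptions; the patent leaves their values open.
    import numpy as np

    WEIGHTS = np.array([0.5, 0.25, 0.25])   # assumed Weight1, Weight2, Weight3

    def detect_impact(rgb_frame, threshold):
        """Return (x, y) of a beam impact in an (H, W, 3) frame, or None."""
        density = rgb_frame @ WEIGHTS        # per-pixel weighted density
        ys, xs = np.nonzero(density > threshold)
        if xs.size == 0:
            return None                      # no pixel exceeded the threshold
        return int(xs.mean()), int(ys.mean())  # center of the pixel group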

The computer system includes several target files having target information and scaled images as described above. The coordinates received from the sensing device enable display or overlay of the impact location on the target files. In addition, the sensing device may determine the pulse width of the laser beam (e.g., from detection of beam impacts within successive frames), and provide information to the computer system to enable display of messages in response to a user utilizing a laser having an unsuitable pulse width with respect to the system configuration. Further, the detection of laser pulse widths may be utilized to identify and associate beam impacts with particular users employing laser transmitters with different pulse widths. The system is preferably configured for laser transmitters emitting a pulse having a duration of 1.5 milliseconds, but may be utilized and/or configured for operation with laser transmitters having any desired pulse width.
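By way of illustration only, pulse-width estimation from successive frames may be sketched as follows; the frame period value is an assumption.

    # Sketch of pulse-width estimation: a pulse visible across N consecutive
    # captures lasted roughly N frame periods, which can be compared against
    # the configured transmitter pulse width. The frame period is assumed.
    FRAME_PERIOD_MS = 1.0   # assumed sensor frame period in milliseconds

    def estimate_pulse_width(impact_flags):
        """Given per-frame impact booleans, return the longest pulse in ms."""
        longest = run = 0
        for hit in impact_flags:
            run = run + 1 if hit else 0
            longest = max(longest, run)
        return longest * FRAME_PERIOD_MS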

The received coordinates are utilized to access a corresponding location in the scoring image and determine the score or other activity information for the beam impact at step 50. Specifically, the received coordinates are utilized to identify a corresponding location within the scoring image. Various sections of the scoring image are color coded to indicate a value or other activity information associated with that section as described above. The color of the identified location within the scoring image is ascertained to indicate the value or other activity information for the beam impact. The scoring factor within the parameter file is applied to (e.g., multiplied by) the score value to determine a score for the beam impact. The score and other impact information are determined and stored in a database or other storage structure, while a computer system display showing the target is updated to illustrate the beam impact location and other information (e.g., natural dispersion, center of mass, score, score percentage, elapsed time, qualification, etc.) at step 52.

The display image is displayed, while the beam impact location is identified by indicia that are overlaid with the display image and placed in an area encompassing the received coordinates. The indicia may be scaled to reflect the caliber of the firearm. In addition, the computer system may provide audio (e.g., resembling firearm shots and/or hits) to indicate beam impact. Exemplary graphical user screens indicating the target, beam impact locations, impact time, score and other information are illustrated in FIGS. 9-10 and 14. The system may be configured to detect, process and display beam impacts: from any desired shooting rate (e.g., machine gun rates of approximately one thousand rounds per minute, etc.); originating from blank fire (e.g., as disclosed in the aforementioned U.S. Pat. No. 6,322,365); and/or for targets at maximum distances of approximately twenty-five meters.

If a round or session of firearm activity is not complete as determined at step 54, the user continues actuation of the firearm and the system detects beam impact locations and determines information as described above. However, when a round or session is determined to be complete at step 54, the computer system retrieves information from the database and determines information pertaining to the round at step 56. The computer system may further determine grouping circles. These are generally utilized on shooting ranges where projectile impacts through a target must all be within a circle of a particular diameter (e.g., four centimeters). The computer system may analyze the beam impact information and provide groupings and other information on the display that is typically obtained during activities performed on firing ranges (e.g., dispersion, etc.). The grouping circle and beam impact location indicia are typically overlaid with the display image and placed in areas encompassing the appropriate coordinates of the display image space in substantially the same manner described above.
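By way of illustration only, a grouping check may be sketched as follows; extreme spread (the largest center-to-center distance between any two shots) is used here as a simple stand-in for the circular grouping criterion described above.

    # Sketch of a grouping check using extreme spread: the largest
    # center-to-center distance between any two impacts is compared
    # against the prescribed group diameter (e.g., four centimeters).
    import math

    def group_size(impacts_cm):
        """Return the extreme spread of a list of (x, y) impacts, in cm."""
        return max((math.hypot(x2 - x1, y2 - y1)
                    for i, (x1, y1) in enumerate(impacts_cm)
                    for x2, y2 in impacts_cm[i + 1:]), default=0.0)

    def within_group(impacts_cm, diameter_cm=4.0):
        """True if the shot group satisfies the grouping criterion."""
        return group_size(impacts_cm) <= diameter_cm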

When a report is desired as determined at step 58, the computer system retrieves the appropriate information from the database and generates a report for printing at step 60. The report includes the print image, while beam impact location coordinates are retrieved from the database for the report. The beam impact locations are identified by indicia that are overlaid with the print image and placed in an area encompassing the corresponding location of the beam impact as described above for the display. The report may further include various information pertaining to user performance (e.g., score, dispersion, center of mass, impact score, cumulative score, score percentage, elapsed time, time between shots, etc.), and may alternatively be generated in electronic form in any desired format (e.g., .doc, .pdf, .wpd, etc.). When another round is desired, and a calibration is requested at step 64, the computer system facilitates calibration of the sensing device at step 40 and the above process of system operation is repeated. Similarly, the above process of system operation is repeated from step 42 when another round is desired without performing a calibration. System operation terminates upon completion of the training or qualification activity as determined at step 62.

An exemplary user screen employed by the system for a silhouette type target (e.g., a military M9 type target) is illustrated in FIG. 9. Specifically, screen 170 includes a target area 172, an action bar 174, an information area 176, a shot table area 178 and a mode selection area 179. Target area 172 includes an image of the target with beam impact locations indicated thereon as described above. Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); remove a shot from the shot table; open a previously saved session; save a session; and perform network and/or instructor functions.

Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, qualification, etc.). Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit. Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below. Screen 170 may further enable a user to display menus for selecting options, reports and targets or overlays. These menus are typically displayed in response to actuation of corresponding icons in the action bar.

The system may additionally provide trace and plot features as described above. Referring to FIGS. 10-12, screen 180 is similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 179. Target area 172 includes a target image in the form of a bull's eye. Action bar 174 includes a series of icons to: assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); remove a shot from the shot table; open a previously saved session; save a session; and stop a session.

Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, etc.). Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit. Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below.

The trace mode is enabled in response to a user selecting the trace mode in mode selection area 179 and the laser transmitter assembly operating in a “constant on” mode. The computer system displays a flashing block 171 on the graphical user screen (FIG. 11). The block follows movement of the firearm or laser beam projected on the target. The computer system receives coordinates of laser beam impact locations from the sensing device and utilizes those coordinates to display the block. The position of the block is adjusted on the display in accordance with the received coordinates. As the firearm or laser beam alters position, the block is similarly adjusted on the display to visually indicate movement of the firearm.
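By way of illustration only, the trace-mode update may be sketched as a simple polling loop in Python. The sensor and display interfaces (is_tracing, read_coordinates, draw_block) and the polling interval are hypothetical stand-ins; the present disclosure does not prescribe a particular implementation.

```python
import time

def trace_mode(sensor, display, poll_interval=0.03):
    """Follow the laser aim point with a flashing block on the display."""
    visible = True
    while sensor.is_tracing():
        x, y = sensor.read_coordinates()   # latest beam impact in target space
        visible = not visible              # toggle to produce the flashing effect
        display.draw_block(x, y, visible)  # reposition the block at the aim point
        time.sleep(poll_interval)          # poll at a rate suited to the frame rate
```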

With respect to plot mode, the system traces the aiming position of the firearm or laser transmitter assembly and reports graphically the horizontal and vertical deviations of the firearm (FIG. 12). In this mode, the laser transmitter assembly is configured to continuously project a laser beam from the firearm (e.g., “constant on” mode) as described above. The continuous laser beam projection allows the sensing device to trace any movement of the firearm, which in turn, allows the computer system to provide feedback to the user relating to fluctuation in firearm aim. The computer system continuously receives detection information (e.g., target image coordinates indicating beam impact locations) from the sensing device. Since the laser transmitter assembly is in a continuous mode (e.g., continuously projecting a laser beam onto the target), the sensing device traces the aim of the firearm on the target and continuously relays detection information to the computer system.

The computer system determines the target impact locations as described above and compiles and displays a trace report to provide an indication to the user of the horizontal and vertical fluctuations of the firearm with respect to an actual and/or desired hit location on the target. FIG. 12 illustrates an exemplary graphical user screen displaying plot mode information and includes plots of horizontal and vertical fluctuations in firearm aim. The vertical and horizontal plots are typically color coded to identify a particular plot.
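For illustration, a minimal Python sketch of compiling the plot-mode report follows, assuming the detection information arrives as (time, x, y) samples and that a desired hit location (cx, cy) is given; the use of matplotlib and the particular colors are assumptions, not part of the disclosure.

```python
import matplotlib.pyplot as plt

def plot_aim_fluctuation(samples, cx, cy):
    """Plot horizontal and vertical aim deviations over time.

    samples: sequence of (time, x, y) beam impact coordinates;
    (cx, cy): the desired hit location on the target.
    """
    times = [t for t, _, _ in samples]
    dx = [x - cx for _, x, _ in samples]  # horizontal fluctuation
    dy = [y - cy for _, _, y in samples]  # vertical fluctuation
    plt.plot(times, dx, color="blue", label="horizontal")  # color coded plots
    plt.plot(times, dy, color="red", label="vertical")
    plt.xlabel("time (s)")
    plt.ylabel("deviation (pixels)")
    plt.legend()
    plt.show()
```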

Operation of the system is described with reference to FIG. 1. Initially, a target (e.g., paper, projected image, television or computer screen, LCD or other display panel, etc.) is provided and sensing device 16 is connected to the computer system. Laser transmitter assembly 200 is inserted into barrel 78 of firearm 76 as described above. The laser module is actuated in response to depression of firearm trigger 77. The computer system is commanded to commence a firearm activity, and may initially control calibrations for sensing device 16, as necessary, in the manner described above. The firearm may be actuated by a user, while the sensing device captures images of the target and provides coordinates of beam impact locations to the computer system as described above. The computer system may determine a score value corresponding to the impacted target section and other information for storage in a database as described above. The impact location and other information are displayed on a graphical user screen (e.g., FIGS. 9-12) as described above. When a round is complete, the computer system retrieves the stored information and determines information pertaining to the round for display on the graphical user screen. Moreover, a report may be printed providing information relating to user performance as described above. In addition, the system may provide indicia on the display to indicate and trace firearm movement as described above.
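The round-level flow described above may be summarized in a brief Python sketch; the sensing_device, target, database and screen objects and their methods are hypothetical placeholders for the components described herein.

```python
def run_round(sensing_device, target, database, screen):
    """Process beam impacts for one round: score, store and display each hit."""
    while not target.round_complete():
        impact = sensing_device.next_impact()      # (x, y) in target coordinates
        if impact is None:
            continue                               # no beam detected this frame
        score = target.score_at(*impact)           # value of the impacted section
        database.store(impact, score)              # persist for reports and playback
        screen.mark_impact(*impact, score=score)   # overlay indicia on the image
    screen.show_summary(database.round_summary())  # per-round performance info
```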

The system may be employed with various targets and corresponding graphical user screens in substantially the same manner described above to detect and display beam impact locations on the various targets. The targets may include zones or sections for scoring or to indicate a particular firearm activity as described above. Alternatively, the targets may be for display purposes to indicate beam impact locations (without scoring). An exemplary target to simulate a shotgun discharge is illustrated in FIG. 13. In this case, the system displays standard shotgun dispersion patterns (e.g., with nine pellets) on a target image about the center of mass of a detected beam impact. Specifically, screen 190 is substantially similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 179. Target area 172 includes a target image in the form of a silhouette. Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); end a task (as described below); remove a shot from the shot table; open a previously saved session; save a session; and perform network functions. Information area 176 provides various information to a user (e.g., date, time, user name, user identification, time remaining, total number of hits, score, dispersion, center of mass, qualification, etc.).

The computer system receives beam impact coordinates from the sensing device and determines the dispersion pattern (e.g., additional beam impact locations from the beam impact coordinates) for shotgun pellets to display on the target in target area 172. This is preferably accomplished by applying a probability distribution to the received beam impact coordinates. The probability distribution is biased toward the center of the received beam impact location and produces pixel coordinates for the dispersion pattern. The options for this type of screen include a shot limit, auto start and a session time. These options may be selected in response to actuation of an options icon in action bar 174 as described above.
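A minimal sketch of this dispersion computation follows, assuming a Gaussian distribution biased toward the detected impact center; the nine-pellet default reflects the standard pattern mentioned above, while the spread value in pixels is purely illustrative.

```python
import random

def dispersion_pattern(cx, cy, pellets=9, spread=15.0):
    """Generate pellet pixel coordinates clustered about the impact center."""
    # random.gauss biases samples toward its mean, i.e. the impact center.
    return [(cx + random.gauss(0.0, spread), cy + random.gauss(0.0, spread))
            for _ in range(pellets)]
```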

The system may provide qualification of users for various tasks. In this case, a series of tasks (or a course of fire) is performed by the user to determine a user qualification. The tasks are stored in a file that indicates the task order and the shot and time limit for each task. Shots are not registered by the system after expiration of the shot or time limit. Successive tasks may be performed automatically, or an instructor may need to indicate the start of a succeeding task after completion of a prior task. Referring to FIG. 14, screen 230 is similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 177. Target area 172 includes a target image in the form of a silhouette. Action bar 174 is similar to the action bar described above, and includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a task; remove a shot from the shot table; open a previously saved session; save a session; and load a course of fire.

Information area 176 provides various information to a user (e.g., user name, user identification, task, task hits, total hits, task score, total score, dispersion, center of mass, qualification, time left, etc.). Shot table area 178 includes a shot table providing the hit number, time, score and task for each detected target hit. Mode selection area 177 includes radio buttons to enable a user to select the mode of operation and inputs for mode options (e.g., time between tasks, end task on shot limit, etc.). The modes include: instructor controlled qualification to enable an instructor to control the start of tasks; automatic qualification where an instructor or shooter starts a session and succeeding tasks start after a preset delay until completion of the tasks; free practice to enable a user to practice; and task selection to enable a user to select and perform a particular task. The mode options enable a user or instructor to enter the time between tasks and whether termination of a task occurs at the expiration of the shot limit or time limit. The computer system may determine a user qualification based on the performance of the tasks by the user and criteria for specific qualifications.
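The course-of-fire file described above may be modeled with a simple ordered structure; the field names, task names and limit values in this Python sketch are assumptions, since the disclosure does not define a file format.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    shot_limit: int
    time_limit: float  # seconds

def accept_shot(task, shots_taken, elapsed):
    """Shots are not registered after the shot or time limit expires."""
    return shots_taken < task.shot_limit and elapsed <= task.time_limit

# An ordered course of fire; task names and limits are illustrative only.
course_of_fire = [Task("standing, 7 m", shot_limit=5, time_limit=30.0),
                  Task("kneeling, 15 m", shot_limit=5, time_limit=45.0)]
```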

The system may alternatively be employed with various targets in the form of images or videos produced from a projector, displayed on a television screen or monitor, or displayed on an LCD or other panel in substantially the same manner described above to detect and display beam impact locations on those various targets (e.g., FIGS. 15-21). The targets may include zones or sections for scoring or to indicate a particular firearm activity. Alternatively, the targets may be for display purposes to indicate beam impact locations (without scoring). Exemplary targets in the form of projected or displayed images include: balloons or other objects that change in both size and shape while moving to improve skills with moving targets and judgment (FIG. 15); bowling pins or other stationary objects with varying distance (e.g., for each pin), size and exposure time (FIG. 16); and multiple stationary objects in a timed exercise (e.g., Epyx style plates) to increase speed and accuracy (FIG. 17). In addition, the projected or displayed images may simulate various shooting activities or ranges including: an indoor shooting range with an exemplary silhouette type target at various distances (FIG. 19); live fire courses with silhouette targets that record hits and misses, where the target scenario may be edited by a user (FIG. 20); and skeet shooting with adjustable speed and difficulty levels (FIG. 21).

In addition, the system may be utilized with videos of scenarios (FIG. 18). The system records shots during the session for playback of the beam impacts (e.g., indicated by the icon as viewed in FIG. 18) with the video for analysis of user performance. The system may pause the playback at each shot for analysis. Further, a user may load a custom video of a scenario for use with the system.

It will be appreciated that the embodiments described above and illustrated in the drawings represent only a few of the many ways of implementing a sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios.

The system may include any quantity or type of target of any shape or size, constructed of any suitable materials and placed in any desired location. The computer system may be implemented by any conventional or other computer or processing system. The components of the system may be connected by any communications devices (e.g., cables, wireless, network, etc.) in any desired fashion, and may utilize any type of conventional or other interface scheme or protocol. The computer system may be in communication with other training systems via any type of communications medium (e.g., direct line, telephone line/modem, network, etc.) to facilitate group training or competitions. The system may be configured for any types of training, qualification, competition, gaming and/or entertainment applications. The printer may be implemented by any conventional or other type of printer.

The firearm laser training system may be utilized with any type of firearm (e.g., hand-gun, rifle, shotgun, machine gun, etc.), while the laser module may be fastened to the firearm at any suitable locations via any conventional or other fastening techniques (e.g., frictional engagement with the barrel, brackets attaching the device to the firearm, etc.). Further, the system may include a dummy firearm projecting a laser beam, or replaceable firearm components (e.g., a barrel) having a laser device disposed therein for firearm training. The replaceable components (e.g., barrel) may further enable the laser module to be operative with a firearm utilizing any type of blank cartridges. The laser assembly may include the laser module and barrel member or any other fastening device. The laser module may emit any type of laser beam. The optics package may include any suitable lens for projecting the beam. The laser beam may be enabled for any desired duration sufficient to enable the sensing device to detect the beam. The laser module may be fastened to a firearm or other similar structure (e.g., a dummy, toy or simulated firearm) at any suitable locations (e.g., external or internal of a barrel) and be actuated by a trigger or any other device (e.g., power switch, firing pin, relay, etc.). Moreover, the laser module may be configured in the form of ammunition for insertion into a firearm firing or similar chamber and project a laser beam in response to trigger actuation. Alternatively, the laser module may be configured for direct insertion into the barrel. The laser module may include any type of sensor or detector (e.g., acoustic sensor, piezoelectric element, accelerometer, solid state sensors, strain gauge, etc.) to detect mechanical or acoustical waves or other conditions signifying trigger actuation. The laser module components may be arranged in any fashion, while the module power source may be implemented by any type of batteries. Alternatively, the module may include an adapter for receiving power from a common wall outlet jack or other power source. The laser beam may be visible or invisible (e.g., infrared), may be of any color or power level, may have a pulse of any desired duration and may be modulated in any fashion (e.g., at any desired frequency or unmodulated) or encoded in any manner to provide any desired information, while the transmitter may project the beam continuously or include a “constant on” mode. The system may be utilized with transmitters and detectors emitting any type of energy (e.g., light, infrared, etc.).

The target may be implemented by any type of target having any desired configuration and indicia forming any desired target site and be disposed at any suitable location. The target may include any desired still or moving images (e.g., still image, video, etc.) displayed on paper or other material, on a surface by a projector, on a television, computer, LCD panel or other form of display. The target may be of any shape or size, and may be constructed of any suitable materials. The target may include any conventional or other fastening devices to attach to any supporting structure. Similarly, the supporting structure may include any conventional or other fastening devices to secure a target to that structure. Alternatively, any type of adhesive may be utilized to secure a target to the structure. The support structure may be implemented by any structure suitable to support or suspend a target. The target may include any quantity of sections or zones of any shape or size and associated with any desired values. The target may include any quantity of individual targets or target sites. The system may utilize any type of coding, color or other scheme to associate values with target sections (e.g., table look-up, target location identifiers as keys into a database or other storage structure, etc.). Further, the sections or zones may be identified by any type of codes, such as alphanumeric characters, numerals, etc., that indicate a score value or any other information. The score values may be set to any desired values.

The target characteristics and images may be contained in any quantity of any types of files. The target images may be scaled in any desired fashion. The coordinate translations may be accomplished via any conventional or other techniques, and may be performed by the sensing device and/or computer system. The target files may contain any information pertaining to the target (e.g., filenames, images, scaling information, indicia size, etc.). The target files may be produced by the computer system or other processing system via any conventional or other software and placed on the computer system for operation. Alternatively, the target files may reside on another processing system accessible to the computer system via any conventional or other communications medium (e.g., network, modem/telephone line, etc.), or be available on any type of storage medium.

The system may be disposed in a case or other storage unit for transport, where the case may be of any size or shape and may be constructed of any suitable materials.

The sensing device may be of any shape or size, and may be constructed of any suitable materials. The sensing device components (e.g., image sensor, memories, FPGA, system clock, reset switch, buffer, I/O ports, power supply, etc.) may be implemented by any conventional or other components performing the functions described herein. The image sensor may be implemented by any conventional or other image sensor (e.g., camera, CCD, matrix or array of light sensing elements, etc.) suitable for detecting the laser beam and/or capturing a target image. The sensor may provide gray scale or full color images and include any desired frame rate suitable to detect beam impacts. The sensing device may employ any type of light sensing elements, and may utilize a grid or array of any suitable dimension. The filter may be implemented by any conventional or other filter having filtering properties for any particular frequency or range of frequencies. The lens may be implemented by any suitable lens to view the target.

The FPGA may be implemented by any suitable hardware and/or software modules to perform the functions described herein (e.g., processor, circuitry, logic, etc.). The memories and buffer may be implemented by any type of conventional or other memories or storage units (e.g., DRAM, Flash, buffers, volatile, non-volatile, etc.) and may store any desired information. Since the memories are preferably non-volatile, the sensing device may be continually re-used without recalibration even after a loss of power or a power down (e.g., unless the position of the sensing device or target has changed). This further enables the sensing device to be made available with target calibration data pre-loaded into the device. For example, targets may be designed and/or calibrated for the sensing device during device manufacture and immediately used in the field. The memories may include sufficient storage capacity to store at least one translation matrix for one or more corresponding targets.
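For example, persisting the translation matrix might be sketched as follows, with a file standing in for the non-volatile memory; the file name and numpy serialization are illustrative assumptions.

```python
import numpy as np

CALIBRATION_FILE = "calibration.npy"  # stands in for non-volatile memory

def save_calibration(matrix):
    """Persist the translation matrix across power cycles."""
    np.save(CALIBRATION_FILE, matrix)

def load_calibration():
    """Reload the stored matrix on power-up, if one exists."""
    try:
        return np.load(CALIBRATION_FILE)
    except FileNotFoundError:
        return None  # no stored calibration: a calibration must be performed
```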

The sensing device may employ any suitable controllers, connectors and interfaces for communications and may utilize any desired communication protocols (e.g., Ethernet, etc.). The communications may be conducted via any communications medium (e.g., LAN, WAN, wired or cables, wireless, etc.). The I/O ports may be of any quantity, may be implemented by any type of conventional or other ports (e.g., IR, terminals/pins, etc.) and may provide any suitable I/O to transfer (e.g., transmit and/or receive) information with external devices (e.g., audio/visual indicators, laser devices, etc.). The system clock may provide a clock signal of any suitable frequency. The signature detector may be included within or coupled to the sensing device and may detect any suitable signature or pattern within any desired energy wave (e.g., laser, light, IR, etc.). The signature detector may be implemented by any conventional or other device detecting patterns within transmitted energy signals (e.g., circuitry, processor, etc.). The sensing device may utilize (or the power supply may provide) any suitable power signals, preferably in the range of 7V DC to 20V DC. The sensing device may receive power signals via unused pins of the network or Ethernet connector.

The sensing device may be supported by any mounting device (e.g., a tripod, a mounting post, etc.) and positioned at any suitable locations providing access to the target. The calibration may utilize any quantity of points on the grid to define the target area, and may map the area to any sized array. The grid locations may correspond to any suitable locations within the target confines. The sensing device may be positioned at any suitable location and at any desired viewing angle relative to a target. The sensing device may be coupled to any port of the computer system via any conventional or other device (e.g., cable, wireless, etc.). Alternatively, the sensing device may provide images to the computer system to determine beam impact locations. The sensing device may be configured to detect any energy medium having any modulation, pulse or frequency. Similarly, the laser may be implemented by a transmitter emitting any suitable energy wave. The sensing device may transmit any type of information to the computer system to indicate beam impact locations, while the computer system may process any type of information (e.g., X and Y coordinates, image information, etc.) from the sensing device to display and provide feedback information to the user.

The sensing device may be utilized in any type of target of any shape or size to detect beam impacts thereon. The target may include any quantity of sensing devices arranged in any fashion with each associated with a section of any shape or size to detect beam impacts on that section. The sensing devices may be coupled in any desired fashion to provide beam impact and other information (e.g., network, daisy chain, individual connections, wired connections, wireless connections, etc.). The targets may include any quantity of any suitable diffuser to enable detection of beam impacts by the sensing device.

It is to be understood that the software for the computer system and sensing device may be implemented in any desired computer language and could be developed by one of ordinary skill in the computer arts based on the functional descriptions contained in the specification and flow charts illustrated in the drawings. The computer system may alternatively be implemented by any type of hardware and/or other processing circuitry. The various functions of the computer system and sensing device may be distributed in any manner between these items and/or among any quantity of software modules, processing systems and/or circuitry. The software and/or algorithms described above and illustrated in the flow charts may be modified in any manner that accomplishes the functions described herein. The database may be implemented by any conventional or other database or storage structure (e.g., file, data structure, etc.).

The display screens and reports may be arranged in any fashion and may contain any type of information. The screens may include any quantity of any types of input mechanisms (e.g., fields, radio or other buttons, icons, etc.). The various parameter or other values may be displayed in the report and/or on the screens in any manner (e.g., charts, bars, etc.) and in any desired form (e.g., actual values, percentages, etc.), while any of the values displayed on the screens may be adjusted by the user via any desired input mechanisms. The calibration screen may include a grid of any shape, color or size to facilitate alignment of the sensing device with the target. The grid may include any quantity of points for a user to specify and may be associated with any locations of the target. The target may be defined within the captured target image in any desired manner via any suitable input mechanisms. The target may be defined at any suitable locations within the captured target image, while the selected locations may be indicated by any quantity of any types of indicia of any shape, color or size. The translation matrix may be determined by any conventional or other algorithms to compensate for viewing angle and/or correct deformations in the image. The calibration data may be stored in the computer system and/or sensing device.
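One conventional algorithm for producing such a translation matrix is the direct linear transform, sketched below in Python for four or more user-specified point correspondences; the disclosure does not mandate this particular technique.

```python
import numpy as np

def translation_matrix(image_pts, target_pts):
    """Derive a 3x3 matrix mapping image-space points to target-space points."""
    rows = []
    for (x, y), (u, v) in zip(image_pts, target_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The matrix is the null vector of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def translate_impact(h, x, y):
    """Map a detected impact from image space into target space."""
    u, v, w = h @ np.array([x, y, 1.0])
    return u / w, v / w  # perspective divide compensates for viewing angle
```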

The density value may be determined with any weights having any desired value or types of values (e.g., integer, real, etc.). The weights and pixel component values may be utilized in any desired combination to produce a pixel density. Alternatively, any quantity of pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the presence and location of a beam impact within an image. Further, any quantity of density and/or pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the threshold and light conditions. The threshold may be determined periodically, in response to any desired light or other conditions (e.g., light conditions are outside any desired range or have any desired change in value, etc.) or in response to a user, and may be set by the computer system and/or user to any desired value. The system may alternatively utilize gray scale or any type of color images (e.g., pixels having gray scale, RGB or other values) and manipulate any quantity of pixel values within any quantity of images in any desired fashion to determine the threshold, light conditions and presence and location of a beam impact. The system may utilize any quantity of thresholds each associated with a region of any size or shape. The threshold offset may be of any desired value based on the user-desired sensitivity.
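A minimal sketch of this weighted-density detection follows; the channel weights and threshold offset are illustrative values chosen for a red laser, not values prescribed by the disclosure.

```python
import numpy as np

def find_beam_impact(frame, ambient_density, weights=(1.0, 0.25, 0.25), offset=40.0):
    """Locate the peak weighted-density pixel if it exceeds the threshold.

    frame: H x W x 3 RGB image; ambient_density: average density measured
    from recent images of the surrounding light conditions.
    """
    density = frame.astype(float) @ np.array(weights)  # per-pixel weighted density
    threshold = ambient_density + offset               # offset sets sensitivity
    y, x = np.unravel_index(np.argmax(density), density.shape)
    if density[y, x] > threshold:
        return x, y  # beam impact pixel coordinates within the image
    return None      # no beam impact present in this image
```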

The indicia indicating beam impact locations and other information may be of any quantity, shape, size or color and may include any type of information. The indicia may be placed at any locations and be incorporated into or overlaid with the target images or video. The system may produce any desired type of display or report having any desired information. The computer system may determine scores or other activity information based on any desired criteria. The computer system may poll the sensing device or the sensing device may transmit images and/or coordinates at any desired intervals for the tracing and/or plot modes or sensing functions. The sensing device may detect the laser beam continuously for any desired interval to initiate the tracing and/or plot modes. The indicia for the tracing and/or plot modes may be of any quantity, shape, size or color and may include any type of information. The tracing indicia may be placed at any locations and be incorporated into or overlaid with the target images. The tracing indicia may be flashing or continuously appearing on the display. The trace and/or plot modes may be implemented with any of the screens described above and may display any quantity of previous impact locations to show movement of the firearm.

The system may be configured for use with a transmitter emitting a laser beam having any desired pulse width, and may provide any type of message or other indication when the pulse width of a laser beam detected by the system is not compatible with the system configuration. The system may be configured to detect and process beam impact locations at any desired shot rate. The system may utilize any conventional or other techniques to convert between the various image spaces, and may compensate for any desired sensing device position and/or viewing angle. The system may be utilized with targets scaled in any fashion to simulate conditions at any desired ranges, and may utilize lasers having sufficient power to be detected at any desired scaled range.

The calibrations for the sensing device (e.g., target alignment, light sensitivity, etc.) may be initiated by a user as described above, or may be performed by the sensing device and/or computer system periodically or in response to detection of conditions (e.g., light conditions, detection of target position or orientation, etc.). The calibrations (e.g., target alignment, light sensitivity, etc.) may be performed individually or in any combination or order. The target alignment may be performed in conjunction with user specified points as described above. Alternatively, the calibration may be performed automatically (without the user specifying the calibration points). In this case, the user enables the image sensing device to view the entire target, where the sensing device and/or computer system employs conventional or other image recognition techniques to determine (e.g., by shape, color or other criteria) any suitable calibration points (e.g., the corners, midsections and/or center of the image). The calibration points may be utilized in the calibration to compensate for perspective and radial distortions in the manner described above.
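As one example of such conventional image recognition, OpenCV's corner detector could supply candidate calibration points automatically; the library choice and parameter values below are assumptions for illustration.

```python
import cv2

def auto_calibration_points(gray_image, count=9):
    """Detect candidate calibration points (e.g., target corners) automatically."""
    corners = cv2.goodFeaturesToTrack(gray_image, maxCorners=count,
                                      qualityLevel=0.05, minDistance=30)
    if corners is None:
        return []
    # Each detected point may serve in place of a user-specified point.
    return [tuple(pt.ravel()) for pt in corners]
```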

It is to be understood that the terms “top”, “bottom”, “side”, “upper”, “lower”, “front”, “rear”, “horizontal”, “vertical”, “right”, “left” and the like are used herein merely to describe points of reference and do not limit the present invention to any specific configuration or orientation.

The present invention is not limited to the applications disclosed herein, but may be utilized for any type of firearm training, qualification, competition, gaming or entertainment applications.

From the foregoing description, it will be appreciated that the invention makes available a novel sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios, wherein a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities.

Having described preferred embodiments of a new and improved sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios, it is believed that other modifications, variations and changes will be suggested to those skilled in the art in view of the teachings set forth herein. It is therefore to be understood that all such variations, modifications and changes are believed to fall within the scope of the present invention as defined by the appended claims.

Claims

1. A firearm laser training system enabling a user to project a laser beam toward a target to simulate firearm operation comprising:

a sensing device to produce images of said target including impact locations of said laser beam and to detect said impact locations, wherein said sensing device includes: a control unit to process said target images and compensate for at least one of a sensing device viewing angle relative to said target and image distortion to determine said impact locations; and
a processor to receive from said sensing device information associated with said impact locations detected by said sensing device, wherein said processor includes an evaluation module to process said received information and to display information relating to said detected impact locations.

2. The system of claim 1, wherein said impact location information from said sensing device includes coordinates of detected impact locations within said target images.

3. The system of claim 1, wherein said target includes a plurality of zones with each zone representing an intended target site and associated with a score value, and wherein said evaluation module includes a scoring module to determine impact scores for a user performance with each impact score associated with a detected impact location and based on said score value of said zone containing that detected impact location.

4. The system of claim 1, wherein said control unit includes:

a detection module to identify said impact locations within said target images based on image pixel values exceeding a threshold.

5. The system of claim 4, wherein said control unit further includes:

a threshold module to automatically determine said threshold in response to measured light conditions of a surrounding environment.

6. The system of claim 1, wherein said processor further includes:

a calibration module to enable a user to indicate a target space associated with said target within a target image.

7. The system of claim 6, wherein said calibration module includes:

an alignment module to enable said user to overlay a grid representing a target space on said target image to indicate said target within said target image.

8. The system of claim 6, wherein said calibration module enables said user to indicate a plurality of reference points on said target within said target image associated with said target space.

9. The system of claim 8, wherein said plurality of reference points includes nine points.

10. The system of claim 1, wherein said control unit includes:

a calibration module to correlate a target space indicated by said user with a target space associated with said target, wherein said calibration module includes a matrix module to produce a translation matrix correlating said indicated and target spaces and accounting for at least one of said sensing device viewing angle relative to said target and said image distortion, wherein said translation matrix is applied to determine said impact locations.

11. The system of claim 10, wherein said sensing device further includes:

a storage unit to store information pertaining to said correlation of said indicated and target spaces.

12. The system of claim 1, wherein said evaluation module includes:

a display module to display an image of said target with indicia indicating said detected impact locations on said target.

13. The system of claim 12, wherein dimensions of said indicia are adjustable to simulate a user-specified caliber of a projectile.

14. The system of claim 1, wherein said evaluation module includes:

a dispersion module to determine a dispersion pattern for a projectile simulated by said laser beam based on said detected impact locations.

15. The system of claim 14, wherein said simulated projectile includes shotgun pellets.

16. The system of claim 3, wherein said scoring module includes:

a session scoring module to determine a session score for a user by combining impact scores of detected impact locations.

17. The system of claim 3, wherein said scoring module accesses a target file associated with said target including score values associated with each of said zones and said processor stores a plurality of target files associated with a plurality of targets that are accessible to said scoring module.

18. The system of claim 1, wherein said sensing device further includes at least one port to transfer signals with an external device.

19. The system of claim 18, wherein said external device includes at least one of a mechanism to simulate return fire and an audio device to provide sound effects.

20. The system of claim 18, wherein said at least one port includes a network port to communicate with said processor.

21. The system of claim 1, wherein said target includes at least one of an image on a material, an image produced by a projector, an image displayed on a display device, a video produced by a projector and a video displayed by a display device.

22. The system of claim 1, wherein said sensing device further includes:

a signature detector to detect a pattern within said projected laser beam to identify an authorized system user associated with a beam impact.

23. A sensing device for a firearm laser training system that enables a user to project a laser beam toward a target to simulate firearm operation, said sensing device comprising:

an image sensor to produce images of said target including impact locations of said laser beam; and
a control unit to detect said impact locations, wherein said control unit processes said target images and compensates for at least one of a sensing device viewing angle relative to said target and image distortion to determine said impact locations.

24. The sensing device of claim 23, wherein said determined impact locations include coordinates of detected impact locations within said target images.

25. The sensing device of claim 23, wherein said control unit includes:

a detection module to identify said impact locations within said target images based on image pixel values exceeding a threshold.

26. The sensing device of claim 25, wherein said control unit further includes:

a threshold module to automatically determine said threshold in response to measured light conditions of a surrounding environment.

27. The sensing device of claim 23, wherein said control unit includes:

a calibration module to correlate a target space indicated by said user with a target space associated with said target, wherein said calibration module includes a matrix module to produce a translation matrix correlating said indicated and target spaces and accounting for at least one of said sensing device viewing angle relative to said target and said image distortion, wherein said translation matrix is applied to determine said impact locations.

28. The sensing device of claim 27 further including:

a storage unit to store information pertaining to said correlation of said indicated and target spaces.

29. The sensing device of claim 23 further including at least one port to transfer signals with an external device.

30. The sensing device of claim 29, wherein said at least one port includes a network port to communicate with said external device.

31. The sensing device of claim 23 further including:

a signature detector to detect a pattern within said projected laser beam to identify an authorized user of said firearm laser training system and facilitate association of said identified user with a corresponding beam impact.

32. A method of simulating firearm operation, wherein a user projects a laser beam toward a target, said method comprising:

(a) producing images of said target including impact locations of said laser beam via a sensing device;
(b) detecting beam impacts on said target, via said sensing device, by processing said target images and compensating for at least one of a sensing device viewing angle relative to said target and image distortion to determine said impact locations; and
(c) processing information pertaining to said detected impact locations via a processor coupled to said sensing device and displaying information relating to said detected impact locations.

33. The method of claim 32, wherein step (b) further includes:

(b.1) determining coordinates of detected impact locations within said target images.

34. The method of claim 32, wherein said target includes a plurality of zones with each zone representing an intended target site and associated with a score value, and step (c) further includes:

(c.1) determining impact scores for a user performance with each impact score associated with a detected impact location and based on said score value of said zone containing that detected impact location.

35. The method of claim 32, wherein step (b) further includes:

(b.1) identifying said impact locations within said target images based on image pixel values exceeding a threshold.

36. The method of claim 35, wherein step (b.1) further includes:

(b.1.1) automatically determining said threshold in response to measured light conditions of a surrounding environment.

37. The method of claim 32, wherein step (a) further includes:

(a.1) calibrating said sensing device by enabling a user to indicate a target space associated with said target within a target image.

38. The method of claim 37, wherein step (a.1) further includes:

(a.1.1) enabling said user to overlay a grid representing a target space on said target image to indicate said target within said target image.

39. The method of claim 37, wherein step (a.1) further includes:

(a.1.1) enabling said user to indicate a plurality of reference points on said target within said target image associated with said target space.

40. The method of claim 39, wherein said plurality of reference points includes nine points.

41. The method of claim 32, wherein step (b) further includes:

(b.1) calibrating said sensing device by producing a translation matrix correlating a target space indicated by said user with a target space associated with said target and accounting for at least one of said sensing device viewing angle relative to said target and said image distortion, wherein said translation matrix is applied to determine said impact locations.

42. The method of claim 41, wherein step (b.1) further includes:

(b.1.1) storing information pertaining to said correlation of said indicated and target spaces within said sensing device to enable detection of impact locations without recalibrating said sensing device.

43. The method of claim 32, wherein step (c) further includes:

(c.1) displaying an image of said target with indicia indicating said detected impact locations on said target.

44. The method of claim 43, wherein dimensions of said indicia are adjustable to simulate a user-specified caliber of a projectile.

45. The method of claim 32, wherein step (c) further includes:

(c.1) determining a dispersion pattern for a projectile simulated by said laser beam based on said detected impact locations.

46. The method of claim 45, wherein said simulated projectile includes shotgun pellets.

47. The method of claim 34, wherein step (c.1) further includes:

(c.1.1) determining a session score for a user by combining impact scores of detected impact locations.

48. The method of claim 34, wherein step (c.1) further includes:

(c.1.1) accessing a target file associated with said target including score values associated with each of said zones, wherein said processor stores a plurality of target files associated with a plurality of targets.

49. The method of claim 32, wherein said sensing device includes at least one port to transfer signals with an external device.

50. The method of claim 49, wherein said external device includes at least one of a mechanism to simulate return fire and an audio device to provide sound effects.

51. The method of claim 49, wherein said at least one port includes a network port to communicate with said processor.

52. The method of claim 32, wherein said target includes at least one of an image on a material, an image produced by a projector, an image displayed on a display device, a video produced by a projector and a video displayed by a display device.

53. The method of claim 32, wherein step (b) further includes:

(b.1) detecting a pattern within said projected laser beam to identify an authorized user associated with a beam impact.
Patent History
Publication number: 20070190495
Type: Application
Filed: Dec 21, 2006
Publication Date: Aug 16, 2007
Inventors: O. Kendir (Eldersburg, MD), Rifat Yildirim (Eldersburg, MD)
Application Number: 11/642,589
Classifications
Current U.S. Class: 434/21.000
International Classification: F41G 3/26 (20060101);