DYNAMIC LOCAL REGISTRATION SYSTEM AND METHOD
In accordance with the teachings described herein, systems and methods are provided for generating images for use in systems, e.g., imaging systems. The method includes receiving at least a first set of images, providing a first registration, providing a display, and displaying a first image on said display. Further, the method includes providing a user interface, providing a second registration, and displaying a second image in said user interface. Further, the systems include an image database, a display, and a registration engine. The registration engine includes software instructions stored in at least one memory device and executable by one or more processors.
The present invention relates generally to the field of image processing and more particularly to image registration.
BACKGROUND
Three dimensional medical scans of patients, such as CT (computed tomography), MR (magnetic resonance), US (ultrasound), or PET (positron emission tomography), produce a series of two dimensional (2D) image slices that together make up 3D images. The two dimensional image slices can be stored to create a set of primary or target images. A medical professional may take another set of images of the patient that can be stored in memory to create a set of secondary or source images. These secondary or source images may be compared with the primary or target images, for example, and processed through an image registration to define a point to point correlation between the primary and the secondary images. However, image registrations, particularly high order registrations including deformable registrations, of any volume may be difficult to interpret and evaluate for accuracy, and doing so can be time consuming for medical professionals.
The aforementioned difficulty in understanding and evaluating registrations is not ideal. Accordingly, a new system and method are desired.
SUMMARY
In accordance with the teachings described herein, systems and methods are provided for generating images for use in systems, e.g., imaging systems. In one example, the method may include receiving at least a first set of images, providing a first registration, providing a display, displaying a first image on said display, providing a user interface, providing a second registration, and displaying a second image in said user interface. In one example, the system may include an image database, a display, and a registration engine, wherein at least said registration engine comprises software instructions stored in at least one memory device and executable by one or more processors.
The features, functions, and advantages discussed can be achieved independently in various embodiments of the present invention or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.
The following detailed description is merely illustrative in nature and is not intended to limit the embodiments of the invention or the application and uses of such embodiments. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
The imaging software 155 may include at least one image rendering engine 155A, at least one image deformation engine 155B, at least one registration engine 155C or algorithm, at least one user input engine 155D, and at least one user confidence engine 155E. In another embodiment, the imaging software may include at least one registration evaluation engine. It should be understood that the at least one image deformation engine, the at least one image rendering engine, the at least one registration engine or algorithm, the at least one user input engine, and the at least one user confidence engine, as described herein, may be implemented by software instructions executing on one or more processing devices. In another embodiment, however, one or more operations of these software engines may instead be performed by other known mechanisms such as firmware or even appropriately designed hardware. The image database/memory 115, as described herein, may be implemented using one or more memory devices. For instance, in one example the image database may be implemented within a memory device that contains another database and/or the like, and in another example the image database may be implemented on a stand-alone or separate memory device. As discussed below in greater detail, the medical images are loaded into the image database/memory 115 for computing registrations and for quality evaluation that can be used by the user to more easily evaluate registrations that are often very complex for a user to understand. The plurality of medical images may include a set of two-dimensional (2D) slices that are received, for example, from a CT scanner or other system for capturing three-dimensional (3D) medical images, such that the set of 2D slices together represent a 3D medical image. In other examples, the plurality of medical image slices could be virtual, such as sagittal, coronal, or axial images (or any other slicing angle through the image data).
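By way of non-limiting illustration, the assembly of received 2D slices into a 3D volume and the extraction of virtual sagittal, coronal, or axial slices may be sketched as follows. This is a minimal NumPy sketch; the function names are illustrative and are not part of the disclosed system.

```python
import numpy as np

def stack_slices(slices):
    """Stack a list of equally sized 2D slice arrays into a 3D volume (z, y, x)."""
    return np.stack(slices, axis=0)

def virtual_slice(volume, plane, index):
    """Extract a virtual axial (z), coronal (y), or sagittal (x) slice."""
    if plane == "axial":
        return volume[index, :, :]
    if plane == "coronal":
        return volume[:, index, :]
    if plane == "sagittal":
        return volume[:, :, index]
    raise ValueError(f"unknown plane: {plane}")

# Example: four 8x8 slices form a 4x8x8 volume.
slices = [np.full((8, 8), i, dtype=np.int16) for i in range(4)]
vol = stack_slices(slices)
assert vol.shape == (4, 8, 8)
assert virtual_slice(vol, "coronal", 3).shape == (4, 8)
```

Any slicing angle other than the three orthogonal planes would require resampling the volume, which is omitted here for brevity.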
In another embodiment, the plurality of images may be used for two dimensional analysis.
In the illustrated embodiment, operating environment 105 may include network 160, gateway 165, internet 170, and/or server 175. The operating environment may include any type and/or number of networks, including wired or wireless internet, cellular network, satellite network, local area network, wide area network, public telephone network, cloud network, and/or the like. In another embodiment, the operating environment operates locally on the computer device. In the illustrated embodiment, computer device 100 may communicate with operating environment 105 through server 175 by a wireless network connection and/or a wired network connection. Further, server 175 may connect computer device 100 to the public telephone network to enable telephone functionality (voice and data) of the computer device 100.
A computer device 100 and operating environment 105 illustrate one possible hardware configuration to support the systems and methods described herein, including at least the methods 900-1200 discussed below. In order to provide additional context for various aspects of the present invention, the following discussion is intended to provide a brief, general description of a suitable computing environment in which the various aspects of the present invention may be implemented. Those skilled in the art will recognize that the invention also may be implemented in combination with other program modules and/or as a combination of hardware and software. Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
Moreover, those skilled in the art will appreciate that the inventive methods may be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, minicomputers, mainframe computers, as well as personal computers, hand-held computing devices, microprocessor-based or programmable consumer electronics, and the like, each of which may be operatively coupled to one or more associated devices. The illustrated aspects of the invention may also be practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
The computer device 100 can utilize an exemplary environment for implementing various aspects of the invention including a computer, wherein the computer includes a processing unit, a system memory and a system bus. The system bus couples system components including, but not limited to the system memory to the processing unit. The processing unit may be any of various commercially available processors. Dual microprocessors and other multi-processor architectures also can be employed as the processing unit.
The system bus can be any of several types of bus structure including a memory bus or memory controller, a peripheral bus and a local bus using any of a variety of commercially available bus architectures. The system memory can include read only memory (ROM) and random access memory (RAM) or any memory known by one skilled in the art. A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within the computer device 100, such as during start-up, is stored in the ROM.
The computer device 100 can further include a hard disk drive, a magnetic disk drive, e.g., to read from or write to a removable disk, and an optical disk drive, e.g., for reading a CD-ROM disk or to read from or write to other optical media. The computer device 100 can include at least some form of computer readable media. Computer readable media can be any available media that can be accessed by the computer device. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer device 100.
Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
A number of program modules may be stored in the drives and RAM, including an operating system, one or more application programs, other program modules, and program data. The operating system in the computer device 100 can be any of a number of commercially available operating systems and/or web client systems.
In addition, a user may enter commands and information into the computer device through a touch screen and/or keyboard and a pointing device, such as a mouse. Other input devices may include a microphone, an IR remote control, a track ball, a pen input device, a joystick, a game pad, a digitizing tablet, a satellite dish, a scanner, or the like. These and other input devices are often connected to the processing unit through a serial port interface that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, a game port, a universal serial bus ("USB"), an IR interface, and/or various wireless technologies. A monitor or other type of display device may also be connected to the system bus via an interface, such as a video adapter. Visual output may also be accomplished through a remote display network protocol such as Remote Desktop Protocol, VNC, X-Window System, etc. In addition to visual output, a computer typically includes other peripheral output devices, such as speakers, printers, etc.
A display can be employed with the computer device 100 to present data that is electronically received from the processing unit. In addition to the descriptions provided elsewhere, for example, the display can be an LED, LCD, plasma, CRT, etc. monitor that presents data electronically. The display may be integrated with computer device 100 and/or may be a stand-alone display. Alternatively or in addition, the received data can be presented in a hard copy format by a device such as a printer, facsimile, or plotter. The display can present data in any color and can receive data from the computer device 100 via any wireless or hard wire protocol and/or standard.
The computer device can operate in a networked environment using logical and/or physical connections to one or more remote computers/devices, such as a remote computer(s). The remote computer(s)/device(s) can be a workstation, a server computer, a router, a personal computer, microprocessor based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer. The logical connections depicted include a local area network (LAN) and a wide area network (WAN). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
When used in a LAN networking environment, the computer device is connected to the local network through a network interface or adapter. When used in a WAN networking environment, the computer device typically includes a modem, or is connected to a communications server on the LAN, or has other means for establishing communications over the WAN, such as the Internet. In a networked environment, program modules depicted relative to the computer, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.
Further in
In the illustrated embodiment, overlay 204 is a graphical user interface that displays a second image 206 having a localization point 208 (represented in the figure by a plus (+)) surrounded by a localization region or neighborhood 210. For example, the localization point 208 may be at the center of a localization region or neighborhood 210 having a 5×5×5 voxel volume, or alternatively the region may be a 1×1×1, 2×2×2, 3×3×3, or 4×4×4 voxel volume. In another embodiment, the localization region or neighborhood may be larger than a 5×5×5 voxel volume. In still another embodiment, the localization region or neighborhood may be a different shape, such as a sphere, circle, or square. The overlay 204 may further include first and second axes 212 which extend from localization point 208 past the perimeter of the overlay 204 to first image 202. Overlay 204 is described as a user interface because a software user can use a computer input device, e.g., a mouse, touch screen, keyboard, and the like, to change the location of the overlay 204 and the localization point 208, provide input and/or instructions to the software, select different modes of display or interaction, and record qualitative comments and quantitative metrics regarding dynamically selected localization point(s) for the various embodiment image(s) as discussed further herein. Of course, one skilled in the art of software appreciates that other user interfaces and the like may be included with the images discussed and illustrated herein, e.g., companion controls, color bars, or side panels that provide some of the control discussed herein.
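The extraction of a voxel neighborhood about a localization point, clipped at the image boundary, may be sketched as follows. This is an illustrative NumPy sketch under assumed (z, y, x) indexing; the function name is hypothetical.

```python
import numpy as np

def neighborhood(volume, point, size=5):
    """Return the size x size x size region centered on `point`,
    clipped where the region would extend past the volume boundary."""
    half = size // 2
    lo = [max(p - half, 0) for p in point]
    hi = [min(p + half + 1, d) for p, d in zip(point, volume.shape)]
    return volume[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]

vol = np.arange(10 * 10 * 10).reshape(10, 10, 10)
# A 5x5x5 localization region around an interior point:
assert neighborhood(vol, (5, 5, 5), size=5).shape == (5, 5, 5)
# Near a corner the region is clipped:
assert neighborhood(vol, (0, 0, 0), size=5).shape == (3, 3, 3)
```

A spherical or circular neighborhood, as in the alternative embodiments, could be obtained by additionally masking voxels whose distance from the localization point exceeds a radius.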
In the illustrated embodiment of
An image processed through a complex or higher order registration, such as a deformable registration, may be difficult for a user to understand and rely on because of the difficulty of inspecting the registration for the purpose of gaining confidence that the registration, or at least portions of it, is correct, e.g., it is difficult for a user to perceive whether a voxel in bone or other tissue in the primary image is the same voxel in bone or other tissue in the secondary image. For example, a point may go virtually anywhere in a deformable registration because it can have many degrees of freedom, making it hard for a user to determine the quality and/or reliability of the registration. And in fact, a single registration may be accurate in some areas and inaccurate in other areas. In some cases, these registrations may have more than a million degrees of freedom. For all of these reasons, the user reviewing images processed with higher order registrations may not have a high level of confidence in the quality of the registration(s). This makes communication between one user who reviews images and assesses an image and/or registration quality (such as a physicist in a radiation oncology department or imaging lab) and another user who may review image results and prescribe a patient diagnosis or treatment plan (such as a physician) especially inefficient and difficult. On the other hand, local rigid registrations have six degrees of freedom (3 in translation and 3 in rotation) and may be reviewed and understood, visually or otherwise, by a user with a higher level of confidence. Additionally, it is easier to perceive accurate point-by-point mapping and correlation through these simpler registrations that display clearly understandable context around each point.
Therefore, as an example, in the systems and methods described herein, the higher order registrations may be approximated by a best fit local rigid registration, and the secondary image processed through the best fit local rigid registration for each localization point selected by the user. If the best fit local rigid registration is determined to be acceptable by the user, the user will have a higher level of confidence that the higher order registration close to the localization point has a good level of quality, e.g., the points in each image near the localization point do indeed correlate, and the image processed with the higher order registration may likely be relied on by the user, e.g., the physicist and/or the physician. On the other hand, if the best fit local rigid registration is determined to be unacceptable to the user, the user may record this as a bad result, rerun the registration in order to improve the local area, or reject the result in this area.
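One way such a best fit local rigid registration may be computed is by a least-squares (Kabsch-style) fit of a rotation and translation to point correspondences sampled from the higher order registration near the localization point. The following NumPy sketch is illustrative only; how the correspondences are sampled from the deformable registration is assumed, not specified by this description.

```python
import numpy as np

def best_fit_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst,
    via the Kabsch algorithm. src, dst: (N, 3) corresponding points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])               # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Sanity check: recover a known rotation and translation.
rng = np.random.default_rng(0)
pts = rng.normal(size=(20, 3))               # points near a localization point
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
moved = pts @ R_true.T + t_true              # stand-in for mapped points
R, t = best_fit_rigid(pts, moved)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

In use, `src` would be voxel positions in the localization neighborhood and `dst` their positions under the higher order registration; the residual between the two mappings then indicates how well a rigid motion explains the local deformation.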
In
In another embodiment, multiple display modes may be further blended together in the spyglass. For example, the primary and secondary blend as shown in
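Display modes such as a primary/secondary blend or a checkerboard may be sketched as follows for 2D slices. This is an illustrative NumPy sketch; the function names and tile size are assumptions.

```python
import numpy as np

def alpha_blend(primary, secondary, alpha=0.5):
    """Blend two images; alpha is the weight given to the secondary image."""
    return (1.0 - alpha) * primary + alpha * secondary

def checkerboard(primary, secondary, tile=8):
    """Alternate tile x tile squares of the primary and secondary images."""
    yy, xx = np.indices(primary.shape)
    mask = ((yy // tile + xx // tile) % 2).astype(bool)
    return np.where(mask, secondary, primary)

a = np.zeros((16, 16))   # stand-in primary slice
b = np.ones((16, 16))    # stand-in secondary slice
assert np.allclose(alpha_blend(a, b, 0.5), 0.5)
board = checkerboard(a, b, tile=8)
assert board[0, 0] == 0.0 and board[0, 8] == 1.0
```

A blend of blends, as in the multiple-display-mode embodiment, could be formed by applying `alpha_blend` to the outputs of two such modes within the spyglass region.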
Using the first set of images and the second set of images, at least a first registration algorithm runs at 915 and provides the at least first registration at 920. For example, the at least first registration may be a deformable registration. In another embodiment, a first registration may be loaded from another source, e.g., another memory location in another system. In yet another embodiment, the at least first registration is not a deformable registration. For example, the at least first registration could be a rigid registration. At least a first image is displayed at 925, at least one user interface overlay or spyglass is provided on at least a portion of the displayed image at 930, and a localization point or position is defined or selected by a user in real time at 935. At least a second registration algorithm is optionally run and at least a second registration is provided at 940. For example, at least a second registration algorithm may approximate the first registration with a best fit lower order registration; e.g., the at least a second registration may be a best fit local rigid registration. In another embodiment, the at least second registration algorithm may approximate the first registration with a global rigid registration. As discussed above, the purpose of providing a second registration or a best fit local rigid registration is to allow the user to see at least one lower order rigid registration to help the user understand the first registration or the complex, higher order deformable registration. In another embodiment, the at least one second registration might be a registration independently computed by optimizing similarity between the primary and secondary images in a local region, rather than a best fit registration which is an approximation to the first registration. This at least second registration may be a rigid registration or may be a higher order registration.
In yet another embodiment, the first registration may be a lower order registration and the at least second registration may be a higher order registration computed using an algorithm to optimize the image matching between the primary and secondary images. This embodiment would enable the user to determine the quality of the lower order registration in comparison with the higher order registration which may be more accurate at very local image fitting, but may have other negative trade-offs which make it less desirable, such as inaccuracies in certain parts of the registration. In these embodiments, the purpose of the at least second registration is as a general comparison to the first registration as opposed to some embodiments where the at least second registration is used to better understand the first registration. In addition, both registrations could be scored based on their quality or other properties and this record could be fed back into a registration algorithm to regenerate a new registration.
At 945, at least a portion, an aspect, or a property of at least one registration or at least one image is displayed over the first image in the user interface overlay (spyglass) about the at least one localization point of the overlay. For example, the second image processed through the second registration may be displayed in the overlay over a first image that contains a primary image (e.g.,
A user may redefine the localization point or position at 950, triggering the dynamic and user controlled rerunning of at least a second registration and providing at least a second registration at 940, which is then displayed over the first image in the user interface overlay (spyglass) at 945. Every time the user redefines the localization point or position at 950, the system goes through another pass at 950, 940, and 945. In another embodiment, the system may be configured (programmed) to include an automated process that triggers from 945 back to 940 to rerun the at least a second registration to provide at least a second registration at 940 (e.g., a new or updated best fit local rigid registration) that is then displayed over the first image in the user interface overlay (spyglass) at 945 to help the user evaluate the registration. At 955, user notations in the user interface overlay or spyglass based on a bad or negative quality assurance or some other review trigger a partial rerun of the method starting at 915, where the at least first registration algorithm is rerun, possibly based on these user inputs, or loaded from another system as discussed above. Alternatively, at 955, the user notations could trigger and be used as inputs for a rerun of the method starting at 940, where the at least a second registration algorithm is rerun, possibly based on these user inputs, for example to bias the registration away from bad results and toward results more acceptable to the user. Alternatively, after 955 the system may come to an end as illustrated in
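The dynamic loop of steps 935-950 — select a localization point, compute a local approximation, display it, and record the user's assessment — may be sketched as follows. All helper callbacks here are hypothetical placeholders, not actual system APIs.

```python
def evaluate_registration(volume_pair, deformable_reg, get_next_point,
                          fit_local, render, classify):
    """Loop until the user stops selecting points; return recorded notations."""
    notations = []
    while (point := get_next_point()) is not None:
        local_reg = fit_local(deformable_reg, point)   # e.g. best fit local rigid
        render(volume_pair, local_reg, point)          # spyglass overlay display
        notations.append((point, classify(point)))     # e.g. "good" / "bad"
    return notations

# Usage with stub callbacks standing in for the interactive session:
points = iter([(4, 4, 4), (10, 2, 7), None])
log = evaluate_registration(
    volume_pair=None,
    deformable_reg=None,
    get_next_point=lambda: next(points),
    fit_local=lambda reg, p: ("rigid-approx", p),
    render=lambda *args: None,
    classify=lambda p: "good",
)
assert log == [((4, 4, 4), "good"), ((10, 2, 7), "good")]
```

The recorded notations correspond to the user inputs at 950/955 that can trigger a rerun of the first or second registration algorithm, or feed the report described below.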
At 1045, at least a portion, an aspect, or a property of at least one registration or at least one image is displayed over the first image in the user interface overlay (spyglass) about the at least one localization point of the overlay, similar to 945 in
At 1060, a report or metric of at least one localization point is produced that provides a qualitative and/or quantitative evaluation of the registration about local portion(s) of the image(s). In one embodiment, the report contains a series of captured images of reviewed localization points, each with an indication of how the registration(s) reviewed were classified by the user, e.g., as good or bad. In another embodiment, the report is at least partially composed of a new volume that indicates the classifications which were recorded. This could be represented as a set of images similar to
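A classification volume of the kind described — marking each reviewed neighborhood as good or bad and leaving unreviewed regions empty — may be sketched as follows. This is an illustrative NumPy sketch; the label values and neighborhood radius are assumptions.

```python
import numpy as np

def classification_volume(shape, reviewed, radius=2):
    """Build a volume marking reviewed neighborhoods: +1 good, -1 bad,
    0 unreviewed. `reviewed` is a list of ((z, y, x), label) pairs."""
    vol = np.zeros(shape, dtype=np.int8)
    for (z, y, x), label in reviewed:
        value = 1 if label == "good" else -1
        vol[max(z - radius, 0):z + radius + 1,
            max(y - radius, 0):y + radius + 1,
            max(x - radius, 0):x + radius + 1] = value
    return vol

vol = classification_volume((16, 16, 16),
                            [((4, 4, 4), "good"), ((12, 12, 12), "bad")])
assert vol[4, 4, 4] == 1 and vol[12, 12, 12] == -1 and vol[0, 0, 15] == 0
```

Such a volume could be displayed as an overlay on the primary image, or summarized (e.g., fraction of reviewed points classified good) as a quantitative metric in the report.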
The embodiments of this invention shown in the drawing and described above are exemplary of numerous embodiments that may be made within the scope of the appended claims. It is understood that numerous other configurations of the graphical user interfaces may be created taking advantage of the disclosed approach. In short, it is the applicant's intention that the scope of the patent issuing herefrom will be limited only by the scope of the appended claims.
Claims
1. A processor-implemented method for use in imaging systems, comprising:
- receiving at least a first set of images;
- providing a first registration;
- providing a display;
- displaying a first image on said display;
- providing a user interface, wherein said user interface is displayed in conjunction with at least a portion of said first image;
- providing a second registration; and
- displaying a second image in said user interface.
2. The method of claim 1, wherein said user interface includes a localization point.
3. The method of claim 1, wherein said first registration or said second registration is generated by at least a first registration algorithm.
4. The method of claim 1, wherein said second registration is a low order registration.
5. The method of claim 1, wherein said first registration is a high order deformable registration.
6. The method of claim 1, wherein said second registration approximates said first registration.
7. The method of claim 1, wherein said second image graphically illustrates differences between said first registration and said second registration.
8. The method of claim 7, wherein said second image is a heat map.
9. The method of claim 8, wherein said heat map displayed on said display includes at least one of the following: a plurality of colors, a plurality of cross hatchings, and a plurality of graphical images.
10. The method of claim 1, wherein said second image is registered by said first and second registrations to said first image.
11. The method of claim 10, wherein said second image is checkered with said first image.
12. The method of claim 1, wherein said second image is blended with said first image.
13. The method of claim 12, wherein said blended image includes about 50% of said first image.
14. The method of claim 1, wherein said second image includes a blend having a heat map and an image composite.
15. The method of claim 3, wherein said user interface receives at least one adjustment input for rerunning said at least first registration algorithm.
16. The method of claim 15, wherein said user interface displays an updated second image.
17. The method of claim 1, wherein said user interface receives highlighting.
18. The method of claim 1, wherein said user interface receives at least one of the following inputs: good, poor, or other notation.
19. The method of claim 3, wherein a user dynamically selects at least a first localization point and said at least a first registration algorithm provides an updated second image in said user interface.
20. The method of claim 1, further including providing a recording of a user notation of said at least first registration for at least one area.
21. The method of claim 1, further providing a report of a user classification of said at least first registration for at least one area.
22. A processor-implemented method for use in imaging systems, comprising:
- receiving a first set of images;
- receiving a second set of images;
- providing a display;
- displaying a first image on said display;
- providing a user interface having a first localization point, wherein said user interface is displayed over at least a portion of said first image;
- running at least a first registration algorithm to provide a first registration;
- displaying a second image in said user interface, wherein said first registration is applied to at least a portion of said second image.
23. The method of claim 22, wherein a user can classify said first registration in at least one area.
24. The method of claim 23, wherein a global registration includes at least said first registration.
25. The method of claim 22, wherein a user can segment structures of interest based on at least said secondary image.
26. A system for generating an image, comprising:
- an image database, said image database including at least a first set of images;
- a display, said display configured to display at least one image having at least one user interface; and
- a registration engine, said registration engine configured to load or generate at least a first registration;
- wherein at least said registration engine comprises software instructions stored in at least one memory device and executable by one or more processors.
27. The system of claim 26, further comprising a user confidence engine and a registration evaluation engine.
28. The system of claim 27, wherein said user confidence engine is configured to record a user classification input as a user reviews said at least one image having at least one user interface.
29. The system of claim 27, wherein said user confidence engine is configured to generate a user confidence report.
30. The system of claim 26, further comprising a user input engine, said user input engine configured to receive user input and output user input to at least said registration engine.
31. A processor-implemented method for use in imaging systems, comprising:
- receiving at least a first set of images;
- providing a first registration;
- providing localization points; and
- using at least a first registration algorithm to compare registrations at said localization points.
32. The method of claim 31, wherein the at least first registration algorithm computes registrations that are approximations to said first registration near said localization points.
33. The method of claim 31, wherein said localization points are predetermined or user adjustable.
34. The method of claim 31, wherein said registrations using localization points are combined into a global registration.
35. The method of claim 31, wherein a second image is generated from a difference between said first registration and said computed registration.
Type: Application
Filed: Sep 21, 2012
Publication Date: Mar 27, 2014
Applicant: MIM Software Inc. (Cleveland, OH)
Inventors: Jerimy Brockway (Aurora, OH), Jonathan W. Piper (Cleveland Heights, OH)
Application Number: 13/624,827
International Classification: G06K 9/00 (20060101);