Teleradiology systems for rendering and visualizing remotely-located volume data sets
A teleradiology system provides the capability of rendering and studying remotely located volume data without requiring transmission of the entire data set to the user's local computer. The system comprises: a receiving station (300) under the control of a user (400); a transmitting station (100); the connecting network (200); the user interface (32), with functionality for controlling volume data rendering, transmission, and display; and the interface with the patient data source (10). The teleradiology system of the invention integrates the data transmission functionality of current teleradiology systems with the volume data rendering/visualization functionality of current volume data rendering/visualization systems. The system may be readily used with an intranet, the internet (including the internet2), or via a direct dial-up using a telephone line with a modem; it can serve as an enterprise-wide PACS and may be readily integrated with other PACS and image distribution systems. Software of this system can be centrally installed and managed and can be provided to the user's local computer on an as-needed basis. Furthermore, the software for the user's computer is developed for use with a standard web browser. This system provides a secure, cost-effective, widely accessible solution for data rendering and visualization. It provides a suitable image distribution method for medical image data, for a medical data repository, and for the electronic medical record (or the computerized patient record). It allows healthcare providers (e.g., radiologists, other physicians, and support staff) to render and study remotely located patient data at locations of their choice.
This application is a continuation reissue application of U.S. patent application Ser. No. 11/229,452, filed Sep. 16, 2005, now allowed, which is a reissue of U.S. application Ser. No. 09/434,088, filed Nov. 5, 1999, now U.S. Pat. No. 6,621,918.
BACKGROUND OF THE INVENTION
The present invention generally relates to teleradiology systems, specifically to teleradiology systems with remote volume data rendering and visualization capability.
Teleradiology is a means of electronically transmitting radiographic patient images and consultative text from one location to another. Teleradiology systems have been widely used by healthcare providers to expand the geographic and/or time coverage of their service, thereby improving the efficiency and utilization of healthcare professionals (e.g., radiologists) with specialty and subspecialty training and skills, and resulting in improved healthcare service quality, shorter delivery times, and reduced cost.
Existing teleradiology systems have been designed for, and are only capable of, transmitting two-dimensional (2D) images in a predetermined order, much as a fax machine transmits pages in a predetermined order. Prior art includes U.S. Pat. No. 4,748,511 by Nichols et al., U.S. Pat. No. 5,291,401 by Robinson, and many related patents. None of these is optimized for volume data rendering and study.
Data rendering refers to the process of converting data into visual forms so that the information in the data can be understood and interpreted. These visual forms are usually shown on a two-dimensional monitor, film or even paper. Data visualization refers to the process of displaying and studying the rendering results. Two-dimensional data rendering and visualization is straightforward, as a 2D (M×N) data array can be readily presented as a 2D (M×N) image which can be displayed (e.g., on a monitor) or printed (e.g., on a film or paper). However, visualizing data of more than two-dimensions is a much more complex task. We refer to a data set with more than two dimensions as volume data.
Visualizing volume data requires volume data rendering methods. In general, a volume data rendering method reduces or converts an original volume data set into a synthesized data set of a different form, i.e., with reduced dimensions and with different data attributes. For example, one method of 3D volume data rendering is called Multi-Planar Reformation (MPR), in which a 2D image is derived from the data within a slice of the 3D data “cube” by averaging the data along the direction perpendicular to the slice. In this way, MPR reduces 3D data to a 2D image presenting averaged data values in the slice. As the rendering parameters (e.g., the location, orientation, and thickness of the slice) change, different 2D images (averaged data values) of the 3D data set are obtained. With MPR, one can view images of any oblique slice in addition to conventional horizontal slices. Another method of volume data rendering is called Maximum Intensity Projection (MIP), where the intensity of each pixel in the MIP image is the maximum intensity encountered in the 3D data set along each of the parallel or divergent paths defined by the viewpoint. Besides MPR and MIP, volume data rendering methods of medical interest also include surface rendering and volume rendering, as well as many variations and/or combinations of these methods. For technical details of these data rendering methods, reference may be made to the review article “3D displays for computed tomography” by Sandy Napel, p. 603-626, in the book entitled “Medical CT and Ultrasound: Current Technology and Applications,” published by Advanced Medical Publishing, 1995.
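As a simplified illustration of the two rendering methods described above, the following hypothetical NumPy sketch computes an axis-aligned MPR slab average and a MIP along one axis. (Clinical MPR also supports oblique slices, and MIP supports divergent ray paths; both are omitted here for brevity, and the function names are illustrative only.)

```python
import numpy as np

def mpr_slab(volume, axis=0, start=0, thickness=1):
    """Axis-aligned MPR: average the voxel values of a slab of the
    given thickness along one axis, reducing 3D data to a 2D image."""
    slab = np.take(volume, list(range(start, start + thickness)), axis=axis)
    return slab.mean(axis=axis)

def mip(volume, axis=0):
    """MIP: each output pixel is the maximum voxel value encountered
    along the parallel rays defined by `axis`."""
    return volume.max(axis=axis)

vol = np.zeros((4, 4, 4))
vol[2, 1, 3] = 100.0  # a single bright voxel, e.g. contrast in a vessel
print(mip(vol, axis=0)[1, 3])                              # 100.0
print(mpr_slab(vol, axis=0, start=2, thickness=2)[1, 3])   # 50.0
```

Note how MIP preserves the bright voxel regardless of slab position, while MPR averages it with its neighbors; this is why MIP highlights vascular structures in CT angiography.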
Medical image acquisition techniques include X-ray, Computed Tomography (CT), Magnetic Resonance (MR), UltraSound (US), and Nuclear Medicine. Nuclear Medicine further includes Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET).
In modern medical diagnosis and treatment planning, acquisition of volume data has become the rule rather than the exception. Thus, volume data rendering and visualization methods have become essential, in addition to traditional slice-by-slice 2D image studies. For example, the de facto standard for CT angiography image display is MIP, which results in a 2D image highlighting the vascular structures. The volume data rendering result is usually obtained by interactively adjusting the rendering parameters, such as the viewpoint (i.e., the orientation), the spatial region, and/or the value range of interest of the volume data.
There are many volume data rendering/visualization systems (including software and hardware). Prior art includes U.S. Pat. No. 4,737,921 by Goldwasser et al., U.S. Pat. No. 5,649,173 by Lentz, and many related patents. To improve graphics performance, current volume data rendering/visualization systems have been designed as local dedicated systems rather than as network-based systems.
Currently, volume data rendering and visualization can be done only when the data to be rendered, as well as the required rendering/visualization software and hardware, reside on the computer used to perform the task. If a user wants to obtain the volume data rendering result for a remotely located data set, he/she has to 1) transmit the entire volume data set from the remote location to his local computer via a network; and 2) generate the rendering result from the local copy of the data and display the result, using the rendering/visualization software and hardware installed on his local computer. This approach, referred to as the two-step (i.e., transmitting and rendering/visualizing) approach, is often impractical and undesirable for the following reasons:
- 1) This approach requires transmitting a large volume data set (e.g., 150 MB in a CT angiography study) over a network, which frequently is not practical even for the normal networks (such as Ethernet) available in a hospital setting. It is even less practical for a direct dial-up (from home) using a telephone line with a modem.
- 2) This approach causes a long initial delay because it takes a long time to transmit a large data set over a network, and the rendering and study cannot be started until transmission of the entire data set is completed. This delays the delivery of healthcare service.
- 3) This approach is costly because performing volume data rendering/visualization this way imposes stringent requirements on the network as well as the hardware (e.g., memory, storage, and processing power) and software (special for volume data rendering/visualization) of the user's local computer.
- 4) This approach, because of the high cost, cannot be deployed in a large scale, and therefore cannot serve as a healthcare enterprise-wide image distribution solution.
- 5) This approach cannot provide ubiquitous access and distribution of images to the points of the user's choice, as it can only provide image access via limited designated points of access.
- 6) Medical images are not used in a vacuum. Clinicians integrate the information derived from imaging studies with other clinical data (such as ECG, blood pressure, and the patient's medical history) in order to make patient management decisions. What the clinician requires is ubiquitous access to the so-called electronic medical record, which integrates both image data and other clinical data. The two-step approach, due to its high cost and limited fixed access points, is not a suitable image distribution method for the electronic medical record.
- 7) This approach requires generating local copies of the patient data to be studied, which is often undesirable for patient data management.
Though solving the above problems would have substantial commercial benefit, no satisfactory solution exists that allows healthcare providers to render and study remotely located volume patient data.
Rendering and visualizing data generated by a remotely located scientific instrument or supercomputer has been studied for several years. Prior art includes U.S. Pat. No. 5,432,871 by Novik and many related patents. Reference may also be made to “Data and Visualization Corridors: Report on the 1998 DVC Workshop Series” by P. H. Smith & J. van Rosendale, California Institute of Technology Technical Report CACR-164, September 1998. The applications taught thereby differ distinctly from teleradiology applications in the following aspects. 1) The objects to be studied are fundamentally different (patient data versus scientific measurements and computations), requiring different rendering/visualization methods as well as different user interactions/interfaces. 2) Teleradiology applications have unique requirements regarding real-time interactivity and image fidelity. 3) Teleradiology applications require unique attention to data security (including patient privacy) and data integrity, as well as other medical and legal issues. 4) Teleradiology applications require a unique image distribution solution for medical image data and the electronic medical record that is suitable for large-scale (e.g., healthcare enterprise-wide) deployment and that is fully integrated with the medical image data source and data management.
SUMMARY OF THE INVENTION
This invention provides a method and apparatus that allow healthcare providers (e.g., radiologists, other physicians, and support staff) to render and study remotely located volume patient data at locations of their choice. The capability of rendering/visualizing remotely located volume data becomes available only by fully integrating the data transmission and volume data rendering functionalities currently supported by two separate types of products, i.e., teleradiology systems and volume data rendering/visualization systems.
Objects and Advantages of the Invention
An object of the invention is to develop methods and apparatus that allow healthcare providers (e.g., radiologists, other physicians, and support staff) to render and study remotely located patient data at locations of their choice.
Another object of the invention is to develop methods and apparatus of teleradiology that allow rendering and studying of remotely located patient volume data without transmitting the entire data set to the user's local computer.
Another object of the invention is to develop a secure, cost-effective, healthcare enterprise-wide solution for data rendering and visualization, and for image data distribution.
Another object of the invention is to provide a solution to further integrate (combine) results from different renderings, from different rendering methods, from different data sets (regardless of whether they are locally or remotely located), and/or from different image data acquisition methods.
Another object of the invention is to develop methods and apparatus for data rendering and visualization that efficiently utilize the high-power computer hardware and/or software at remote locations and alleviate the burden on the network as well as on the user's local computer (hardware and/or software).
Another object of the invention is to develop methods and apparatus that allow software to be centrally installed and managed and to be provided to the user's local computer on an as-needed basis. Furthermore, the software can automatically adjust its configuration based on the user input and/or the configuration of the user's local computer and network.
The teleradiology system of the invention provides a healthcare enterprise-wide solution for rendering and visualization of remotely located data. It substantially overcomes the problems of the prior art described above. In particular, it is extremely cost-effective, ubiquitously accessible, secure, and flexible. The teleradiology system of the invention will improve the accessibility, utilization, and therefore the applications, of data (in particular, volume data) rendering and visualization in medicine.
These and further objects and advantages of the invention will become apparent from the ensuing specification, taken together with the accompanying drawings.
With reference to
Receiving station 300 comprises a data receiver 26, a send request 22, a user interface 32, a data decompressor 28, a display system 30, a central processing system 24, and data security 34. Transmitting station 100 comprises a data transmitter 16, a receive request 20, a data compressor 14, a volume data rendering generator 12, a central processing system 18, and data security 34.
Receiving station 300 is controlled by a user 400 and is typically located at the healthcare professional's office or home. Transmitting station 100 is usually located proximate to an image data source 10 (e.g., proximate to image database and/or archiving of a Radiology department). In some cases, image data source 10 may be included in transmitting station 100.
In a preferred operation, user 400, via user interface 32, specifies, one at a time: 1) at least one image data set to be visualized; 2) at least one data rendering method to be used; 3) the rendering parameters used by each rendering method; and 4) the data transmission parameters for controlling data transmission over network 200. Central processing system 24 on receiving station 300 takes and validates the user request. Central processing system 24 then issues the request, which is sent via send request 22 to transmitting station 100 through network 200. Central processing system 18 on transmitting station 100 receives the request via receive request 20. Coordinated by central processing system 18, volume data rendering generator 12 accesses from image data source 10 the image data set which the user has specified, and then generates the data rendering result based on the data rendering method and parameters which the user has specified. The rendering result is usually a 2D image, much smaller in size than the original data set. Data compressor 14 further compresses the result and other parameters based on the data transmission parameters which the user has specified. Then, data transmitter 16 on transmitting station 100 transmits the compressed data to data receiver 26 on receiving station 300 via network 200, based on the data transmission parameters which the user has specified. On receiving station 300, coordinated by central processing system 24, data decompressor 28 decompresses (or restores) the rendering result. (The central processing system 24 may also perform further image processing and operations.) Display system 30 displays the result (the image) and other parameters on user interface 32. Via user interface 32, user 400 can further modify 1) the image data set to be visualized, 2) the data rendering method to be used, 3) the rendering parameters used, and 4) the data transmission parameters used.
This process goes on until a satisfactory rendering and visualization result is obtained.
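The request/render/compress/transmit cycle described above can be sketched as follows. This is a hypothetical illustration only: the message format, the function names, and the use of JSON and zlib (standing in for the wavelet coder preferred elsewhere in this specification) are all assumptions, not the actual protocol of the invention.

```python
import json
import zlib

def handle_request(request_json, data_source, render_fns):
    """Transmitting-station side (hypothetical): parse the request,
    render the specified data set with the specified method and
    parameters (rendering generator 12), and compress the small 2D
    result (data compressor 14) for transmission."""
    req = json.loads(request_json)
    volume = data_source[req["dataset"]]      # image data source (10)
    render = render_fns[req["method"]]
    image = render(volume, **req["params"])
    return zlib.compress(json.dumps(image).encode())

# Receiving-station side: issue a request, then restore and use the result.
data_source = {"ct_head_001": [[0, 7], [3, 1]]}   # toy 2x2 "volume"
render_fns = {
    # column-wise maximum as a stand-in MIP along axis 0
    "mip": lambda v, axis: [max(col) for col in zip(*v)],
}
request = json.dumps({"dataset": "ct_head_001", "method": "mip",
                      "params": {"axis": 0}, "compression": "lossless"})
payload = handle_request(request, data_source, render_fns)
result = json.loads(zlib.decompress(payload))      # data decompressor (28)
print(result)   # [3, 7]
```

The point of the sketch is that only the rendered 2D result crosses the network; the full volume never leaves the transmitting station.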
With a well-designed teleradiology system, the response time from a user request to the display of the requested result is short enough to be ignored or tolerated. Thus, the user can interactively control data rendering as well as transmission, and visualize the rendering result in “real-time”. In effect, the user has virtually the same access to the remotely located volume data as he would have if the data resided on the user's own computer.
For comparison,
The teleradiology system of the invention (
Volume data rendering and visualization is a well-established field. There are many volume data rendering/visualization systems for medical applications. The data rendering methods of medical interest include multi-planar reformation, maximum intensity projection, surface rendering, and volume rendering, as well as many variations and/or combinations of these methods. The rendering parameters include the viewpoint (i.e., the orientation), the spatial region, and the value range (e.g., controlled by thresholds) of the data to be rendered. Volume data rendering in medical applications also relies on image processing tools and data editing tools to select spatial regions of the data set to be rendered (e.g., to exclude the bone structures) and to highlight the structure of interest (e.g., the vascular tree). (For more details on volume data rendering and visualization implementation, including software implementation, reference may be made to “The Visualization Toolkit—An Object-Oriented Approach to 3D Graphics”, 2nd edition, by Will Schroeder, Ken Martin, Bill Lorensen, published by Prentice Hall PTR, 1998.)
Volume data rendering generator and display system
The data rendering methods cited above are usually computationally intensive. They are implemented on volume data rendering generator 12 (
User interface
User interface 32A of current teleradiology systems (
With the above descriptions on system components, rendering methods, volume data rendering generator, general/special rendering and display hardware, rendering and visualization software, as well as user interface design and functionality, implementing the volume data rendering and visualization aspects of the teleradiology system of the invention should be clear to one with ordinary skill in the volume data rendering/visualization field.
Data Transmission
Data transmission is a well-established field. Transmission of medical image data over networks has been widely utilized in teleradiology, and many teleradiology systems are currently available. Teleradiology systems require careful consideration of the data transmission media (concerning 200) and protocol (concerning 16, 26, 20, 22 and 32), data compression (concerning 14, 28 and 32), data security (concerning 34 and 32), and integration with the image data source and data management (concerning 10 and 32).
Transmission media and protocol
For the teleradiology system of the invention, the preferred transmission media (i.e., network 200) may be an intranet, the internet (including the internet2) or via a direct dial-up using a telephone line with a modem. The preferred data transmission protocol (for components 16, 26, 20, 22) is the standard TCP/IP. Furthermore, for some transmission media (e.g., the internet2), user 400 can control certain aspects (e.g., the priority level, the speed) of data transmission by selecting transmission parameters via user interface 32. These should be well known to one with ordinary skill in the network communication field.
Data compression/decompression
Data compression is a technique for densely packaging the data to be transmitted, so as to efficiently utilize the given bandwidth of network 200 during transmission. This operation is done by data compressor 14 on transmitting station 100. After transmission of the compressed data to receiving station 300, data decompressor 28 restores the compressed data to a format ready to be used. The data compressor 14 and decompressor 28 can be implemented either on dedicated processors, for improved response speed, or on general purpose processors, for wide applicability. Wavelet compression/decompression, the de facto standard for data compression, is used in the teleradiology system of the invention as the preferred method. (For technical details on data compression in general and wavelet compression in particular, reference may be made to the book “Wavelets and Subband Coding” by Martin Vetterli and Jelena Kovacevic, published by Prentice Hall, 1995.) Specifically, in one embodiment, user 400 can select data compression and transmission parameters via user interface 32. In another embodiment, these selections are made automatically by the teleradiology system based on the system configuration and the data to be transmitted. For example, the compression method selected can be lossless (i.e., the compressed data can be fully restored) or lossy (i.e., the compressed data can only be partially restored). The attainable data compression ratio is about 3:1 for lossless compression and much higher for lossy compression. The data compression ratio represents a tradeoff between preserving image fidelity (with less compression) and increasing transmission speed (with more compression). Furthermore, transmitted images can also be refined progressively. Due to medical and legal considerations, the teleradiology system of the invention provides lossless and virtually lossless compression to avoid misdiagnosis. It also provides progressive refinement for improved interactivity.
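The lossless round-trip property described above can be demonstrated with a minimal sketch. Here the standard-library zlib codec stands in for the wavelet coder the invention prefers; the image bytes are synthetic. The key property is that decompression restores the data exactly, which is what permits diagnostic use.

```python
import zlib

def compress_lossless(pixels: bytes) -> bytes:
    """Stand-in for data compressor 14 in its lossless mode."""
    return zlib.compress(pixels, level=9)

# A synthetic 256-byte 8-bit image with a large uniform region,
# as is typical of the background in medical images.
image = bytes([0] * 200 + list(range(56)))
packed = compress_lossless(image)
restored = zlib.decompress(packed)   # stand-in for decompressor 28

assert restored == image             # lossless: fully restored
print(len(image), len(packed))       # compressed size is smaller
```

A lossy mode would achieve a higher ratio by discarding detail, which is why this specification restricts diagnostic transmission to lossless or virtually lossless compression.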
The image compression/decompression techniques used for the teleradiology system of the invention are similar to that for existing teleradiology systems (i.e., 14A, 28A and 32A in
Medical image data source and management
The teleradiology system of the invention may be readily integrated with medical image data source 10. In particular, medical image data are stored according to the Digital Imaging and Communications in Medicine (DICOM) standard. (For details on DICOM, refer to Digital Imaging and Communications in Medicine, Version 3.1. Rosslyn, Va.: National Electrical Manufacturers Association (NEMA) Standards Publication No. 300-1997, 1997.) DICOM is a hierarchical approach to the storage and communication of medical image data. The patient is the top level of this hierarchy. A patient makes visits to a medical service provider, who performs studies concerning this patient. Studies concerning a given patient are composed of study components (e.g., physician's notes concerning the patient, patient identification information, administrative data) and series. Series are in turn composed of radiological images and other related diagnostic information concerning these images. With appropriate access privileges and via user interface 32, the teleradiology system of the invention is able to search image data source 10 on the basis of a patient, a study, a series, or some combination thereof. It is able to save the studies on receiving station 300 and/or transmitting station 100 for future viewing. Furthermore, it is able to capture the consultation messages. In terms of integration with the image data source and patient data management, the teleradiology system of the invention is similar to existing teleradiology systems.
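The patient/study/series hierarchy described above can be modeled minimally as follows. This is a hypothetical sketch only: real DICOM objects carry many more attributes, and the class and function names here are illustrative, not part of the standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Series:
    modality: str                               # e.g. "CT", "MR"
    images: List[str] = field(default_factory=list)

@dataclass
class Study:
    description: str
    series: List[Series] = field(default_factory=list)

@dataclass
class Patient:                                  # top level of the hierarchy
    patient_id: str
    name: str
    studies: List[Study] = field(default_factory=list)

def find_studies(patients: List[Patient],
                 patient_id: Optional[str] = None,
                 modality: Optional[str] = None) -> List[Tuple[str, str]]:
    """Search by patient and/or series modality, mirroring the search
    capability offered via user interface 32."""
    hits = []
    for p in patients:
        if patient_id is not None and p.patient_id != patient_id:
            continue
        for st in p.studies:
            if modality is None or any(s.modality == modality for s in st.series):
                hits.append((p.patient_id, st.description))
    return hits

pts = [Patient("P001", "Doe^Jane",
               [Study("CT Angiography", [Series("CT", ["img1", "img2"])])])]
print(find_studies(pts, modality="CT"))   # [('P001', 'CT Angiography')]
```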
Data security and management
Another medical and legal concern of a teleradiology system is its ability to protect patient privacy and data security. Data security 34 includes the security measures for authentication (i.e., proof of identity), access control, confidentiality, and data integrity. (For detailed technical descriptions on data security, reference may be made to the International Organization for Standardization (ISO) security architecture defined in section 5 of ISO/IEC 7498-2, 1989.) As a minimum requirement for the teleradiology system of the invention, name and password are required to identify the authorized user 400 via user interface 32. Access privileges to the teleradiology system in general and to transmitting station 100 in particular are user specific. An audit trail of system resource usage, patient information access, etc. is provided. Encryption of demographics is employed. Firewalls are installed for Internet connections. Data security measures for the teleradiology system of the invention are similar to that for current teleradiology systems (refer to 34A and 32A in
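The minimum security measures above (name/password authentication plus an audit trail of access) can be sketched as follows. This is a hypothetical illustration: the SHA-256 hashing, in-memory user table, and log format are assumptions for the sketch, not the security architecture of the invention, which follows ISO/IEC 7498-2.

```python
import hashlib
import datetime

# Hypothetical in-memory credential store and audit trail.
USERS = {"dr_smith": hashlib.sha256(b"s3cret").hexdigest()}
AUDIT_LOG = []

def authenticate(name: str, password: str) -> bool:
    """Check credentials (proof of identity) and record the attempt
    in the audit trail, whether it succeeds or fails."""
    ok = USERS.get(name) == hashlib.sha256(password.encode()).hexdigest()
    AUDIT_LOG.append((datetime.datetime.now().isoformat(), name,
                      "login_ok" if ok else "login_failed"))
    return ok

assert authenticate("dr_smith", "s3cret")
assert not authenticate("dr_smith", "wrong")
print([entry[2] for entry in AUDIT_LOG])   # ['login_ok', 'login_failed']
```

Note that failed attempts are logged as well; an audit trail that records only successes cannot support the access review this specification calls for.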
With the above descriptions on data compression/decompression, data security measures, integration with data source and data management, transmission media and protocols, implementing the data transmission aspects of the teleradiology system of the invention should be clear to one with ordinary skill in the field.
New functionalities and capabilities
On-demand rendering/transmission control and rendering remotely located volume data
The teleradiology system of the invention (
With these new functionalities and capabilities, user 400 can navigate through a remotely located volume data set, interactively define and adjust the rendering method and parameters, control what is to be rendered, transmitted and visualized next, and eventually obtain the final rendering result. Thus, user 400 can render and visualize a remotely located volume data set without transmitting the entire volume data set to the user's local computer.
It is to be noted that though medical volume data sets are typically large in size (e.g., 150 MB for a CT angiography study), in many cases the user may want to review intermediate and final rendering results only, which are usually much smaller (e.g., on the order of 1 MB) in size. Thus, compared to the current two-step approach, the teleradiology system of the invention greatly alleviates network speed limitations. Furthermore, it eliminates the long initial delay associated with transmitting a large data set over a network, and therefore rendering and visualization can be started almost immediately. It also avoids generating multiple copies of the data at different locations, which benefits patient data management. With the teleradiology system of the invention, healthcare providers can further expand the geographic and/or time coverage of their service, resulting in improved healthcare service quality, delivery time, and patient data management, as well as reduced cost.
Different divisions of the rendering generation task
As a preferred embodiment of the invention, the teleradiology system generates the data rendering result exclusively on transmitting station 100, and then transmits the rendering result to receiving station 300. Thus, the hardware-demanding operations (e.g., the volume rendering operation, which is demanding of memory, storage, and computation) can be performed exclusively on transmitting station 100. This embodiment allows full utilization of the computer hardware capability at transmitting station 100, and therefore minimizes the hardware requirements on receiving station 300. As a result, users can perform advanced volume data rendering and visualization even with the most basic local computers as receiving stations 300.
In another embodiment of the invention, transmitting station 100 only does a partial computation (e.g., generating a surface rendering model). Central processing system 24 on receiving station 300 completes the remaining part of the computation based on the partial computation done by transmitting station 100, and displays the final rendering result on user interface 32 via display system 30. This embodiment may sometimes further reduce the network load by performing some computations on the user's local computer.
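The two divisions of the rendering task can be contrasted in a toy sketch. All names here are hypothetical, and a simple thresholded point list stands in for the surface rendering model mentioned above; the point is only to show where the computation is split between the stations.

```python
def render_full_on_server(volume):
    """Preferred embodiment: transmitting station 100 produces the
    final 2D image (here a column-wise MIP); the client only displays it."""
    return [max(col) for col in zip(*volume)]

def render_partial_on_server(volume, threshold):
    """Alternative embodiment: the server does part of the work, here
    extracting the voxels at or above a threshold as a stand-in for
    generating a surface rendering model."""
    return [(i, j, v) for i, row in enumerate(volume)
            for j, v in enumerate(row) if v >= threshold]

def finish_on_client(partial, shape):
    """Central processing system 24 on receiving station 300 completes
    the rendering from the partial result."""
    image = [[0] * shape[1] for _ in range(shape[0])]
    for i, j, v in partial:
        image[i][j] = v
    return image

vol = [[0, 9], [5, 0]]
print(render_full_on_server(vol))                   # [5, 9]
model = render_partial_on_server(vol, threshold=5)  # sparse: smaller than vol
print(finish_on_client(model, (2, 2)))              # [[0, 9], [5, 0]]
```

When the thresholded model is sparse, transmitting it instead of a full image can reduce the network load, which is the motivation for the hybrid embodiment.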
Client-server Software Structure
The teleradiology system of the invention uses a client-server architecture. (For technical details on client-server systems, refer to Dewire DT. Client/Server Computing. McGraw-Hill, 1993.) As a result, software of this system can be installed, maintained, and upgraded on one station, referred to as the software server, while the other station is referred to as the software client. In a preferred embodiment, transmitting station 100 acts as the software server and receiving station 300 as the software client. The software for the software client can be supplied by the software server via network 200 at each time of use. Alternatively, it can be installed on the software client once for future use. In the latter case, the software client is notified, via network 200, when a software upgrade is available. The client-server software implementation greatly reduces the cost of licensing as well as installing, maintaining, and upgrading software on each software client. The system of the invention can also charge for use of the system on a per-license basis, a per-use basis, or another basis.
Web browser based client software
As a preferred embodiment, the software to be run at receiving station 300 is developed based on standard Web browsers (e.g., Microsoft Internet Explorer or Netscape Navigator). Specifically, the software for receiving station 300 can be a “plug-in” of the Web browser, which is installed once and becomes an integral part of the Web browser on receiving station 300. Alternatively, the software for receiving station 300 can be a Java applet, which is a program sent from transmitting station 100 each time the program is used. (For technical details on Java applets, refer to Horstmann C S, Cornell G. Core Java, Vol 1: Fundamentals. Sun Microsystems, 1998.) Using Web browser based software makes volume data rendering/visualization software available to any authorized user with a networked computer. Using Java based software makes it work on commonly used operating platforms (e.g., Unix and PC).
The client-server implementation and Web browser based implementation make the proposed system very accessible. Any authorized user can perform advanced volume data rendering and visualization tasks from the user's preferred location using a networked computer. As an example, a user can perform advanced volume data rendering and visualization tasks even with the most basic local computers (e.g., the user's desktop computer) and even without the current volume data rendering/visualization software installed on the user's computer.
Interconnection of multiple receiving/transmitting stations
Though only one receiving station 300 and one transmitting station 100 are shown in
Healthcare enterprise-wide image distribution solution for images, information/data repository, and the electronic medical record
Because it is extremely cost-effective, ubiquitously accessible, provides acceptable data security protection and data management, and significantly relaxes requirements on network as well as the user's local computer (software and hardware), the teleradiology system of the invention is well suited as a healthcare enterprise-wide image distribution solution. Furthermore, it can serve as an enterprise-wide PACS (Picture Archiving Communication System) and can be readily integrated with other PACS and image distribution systems.
By greatly reducing the cost and drastically improving the accessibility of image distribution, the teleradiology system of the invention is a preferred image distribution method for medical image data, for a medical information/data repository, and for the electronic medical record (or computerized patient record). Thus, it may be used in settings where the patient data contains not only image data but also other data (e.g., ECG) and information (e.g., notes on patient medical history).
Integration and display of multiple rendering results
The teleradiology system of the invention can be used for rendering and visualizing multiple images resulting from different rendering methods and parameters, from different data sets (regardless of whether they are locally or remotely located), and/or from different image data acquisition methods (e.g., CT, MR, US). The rendering methods include volume data rendering as well as conventional 2D image rendering. The multiple displays may be updated individually or simultaneously. For example, axial, sagittal, and coronal multiplanar reformation images containing the cursor position may be displayed and updated simultaneously as the cursor moves. Furthermore, maximum intensity projection, volume rendering, and/or surface rendering results may be individually or simultaneously displayed and/or updated with axial, sagittal, and/or coronal images. In addition, results from different studies, whether from one or multiple image data acquisition methods, can be individually or simultaneously displayed and/or updated for comparison. The different rendering results from different rendering methods, different rendering parameters, different data sets, and/or different image data acquisition methods can be further combined to form one or more composite images.
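The simultaneous-update behavior above can be sketched concretely: given one volume data set, the three multiplanar reformation (MPR) planes through the cursor position and a maximum intensity projection (MIP) are all derived from the same voxels and recomputed together when the cursor moves. A minimal pure-Python sketch, with hypothetical function names; a real system would use optimized rendering on the transmitting station:

```python
# Minimal sketch (hypothetical helper names) of producing several rendering
# results from one volume data set: axial/sagittal/coronal MPR slices through
# a cursor position, plus a maximum intensity projection (MIP) along z.
# The volume is a nested list indexed as volume[z][y][x].

def axial(volume, z):                      # plane of constant z
    return volume[z]

def coronal(volume, y):                    # plane of constant y
    return [plane[y] for plane in volume]

def sagittal(volume, x):                   # plane of constant x
    return [[plane[y][x] for y in range(len(plane))] for plane in volume]

def mip(volume):                           # brightest voxel along each z-ray
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(nz))
             for x in range(nx)] for y in range(ny)]

def render_at_cursor(volume, cursor):
    """Recompute all views together, as when the cursor moves in the UI."""
    z, y, x = cursor
    return {"axial": axial(volume, z),
            "coronal": coronal(volume, y),
            "sagittal": sagittal(volume, x),
            "mip": mip(volume)}

# 2x2x2 toy volume, volume[z][y][x]
vol = [[[1, 2], [3, 4]],
       [[5, 6], [7, 8]]]
views = render_at_cursor(vol, (1, 0, 1))
print(views["mip"])                        # [[5, 6], [7, 8]]
```

Composite images as described in the text would then be formed by combining two or more of these per-method results (e.g., overlaying a MIP on an MPR slice).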
Operation modes and their selections
The system may have many different operation modes. Examples of different operation modes discussed in previous sections include the different divisions of the rendering generation task between receiving station 300 and transmitting station 100, different data compression/decompression operations with different data compression ratios, and different data transmission modes for network 200. In general, the different operation modes also require different software configurations. As exemplified in previous discussions, the different operation modes may be selected either by user 400 via user interface 32 or by one or more automated computer programs. Using software configurations as an example, the selection of software configurations can be accomplished with user intervention. Alternatively, the software can automatically adjust its configuration based on, for example, the configuration of the teleradiology system (network and transmitting/receiving stations) as well as the data rendering task. For example, if the software detects that receiving station 300 has only very basic hardware resources (in terms of memory, storage, and/or computation power) for the data rendering task, it automatically uses the software that performs data rendering exclusively on transmitting station 100.
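The automatic selection described above reduces to a small decision function: honor an explicit user choice if one was made through the user interface, otherwise inspect the receiving station's resources against the rendering task and pick where rendering runs. A sketch under assumed inputs; the function name, parameters, and thresholds are illustrative, not from the specification:

```python
# Sketch of automatic operation-mode selection: the software inspects the
# receiving station's resources and the rendering task, and decides where
# the rendering result is generated. All names and thresholds here are
# illustrative assumptions.

def select_rendering_mode(client_memory_mb, client_has_renderer, data_size_mb,
                          user_choice=None):
    """Return which station generates the rendering result."""
    if user_choice is not None:                 # user 400 overrides via UI 32
        return user_choice
    if not client_has_renderer:                 # no local rendering software
        return "server-only"
    if client_memory_mb < 4 * data_size_mb:     # too little memory for the task
        return "server-only"                    # render only on station 100
    return "shared"                             # divide rendering between stations

print(select_rendering_mode(256, True, 512))    # server-only
print(select_rendering_mode(8192, True, 512))   # shared
```

The same pattern applies to the other mode choices mentioned above (compression ratio, transmission mode): an automated program supplies a default that the user may override.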
Other embodiments
Although in a preferred embodiment image data source 10 is accessed via transmitting station 100 and transmitting station 100 also acts as the software server, this invention also includes other embodiments. For example, in one embodiment image data source 10 is accessed via transmitting station 100, but receiving station 300 acts as the software server instead. In this case, transmitting station 100 will use the volume data rendering software provided by receiving station 300 via network 200 to generate the rendering result, partially or completely, on transmitting station 100. In another embodiment, transmitting station 100 acts as the software server, but image data source 10 is located proximate to, and accessed via, receiving station 300 instead. In this case, receiving station 300 will use the volume data rendering/visualization software provided by transmitting station 100 via network 200 to generate, completely on receiving station 300, the rendering result.
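In both alternative embodiments the pattern is the same: the rendering software is shipped over network 200 from whichever station acts as software server and executed at the station proximate to the data. A minimal sketch of that role swap, simulating network delivery by shipping source text; class and function names are hypothetical:

```python
# Sketch of the role swap described above: the station holding the data
# executes volume rendering software supplied by the other station over
# the network (simulated here by shipping source text; names hypothetical).

RENDERER_SOURCE = """
def render_mip(volume):
    # maximum intensity projection along the first (z) axis
    ny, nx = len(volume[0]), len(volume[0][0])
    return [[max(plane[y][x] for plane in volume)
             for x in range(nx)] for y in range(ny)]
"""

class SoftwareServer:
    """Whichever station acts as software server merely supplies the code."""
    def provide_software(self):
        return RENDERER_SOURCE

class DataStation:
    """The station proximate to image data source 10 runs the rendering."""
    def __init__(self, volume):
        self.volume = volume
    def render_with(self, source):
        namespace = {}
        exec(source, namespace)        # load software received via network 200
        return namespace["render_mip"](self.volume)

vol = [[[1, 9], [2, 3]],
       [[4, 5], [6, 7]]]
result = DataStation(vol).render_with(SoftwareServer().provide_software())
print(result)                          # [[4, 9], [6, 7]]
```

Whether `DataStation` stands for transmitting station 100 or receiving station 300 depends on which embodiment is in use; only the direction of software delivery changes, never the location of the bulky volume data.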
Obviously, many other modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the disclosed concept, the invention may be practiced otherwise than as specifically described.
Claims
1. A post-processing system for remotely accessing patient information and data previously acquired and electronically stored, and for remotely generating a volume data rendering result, comprising:
- at least one receiving station controllable by at least one user of said system;
- at least one transmitting station physically separated from said receiving station for communicatively coupling to said receiving station through at least one network;
- user interface means provided at said receiving station for enabling a user to specify at least one patient volume data set previously acquired and stored in said transmitting station, and to specify at least one request for volume data rendering comprising specifying a volume data rendering method and rendering parameters to be applied on said volume data set;
- an image processor at said transmitting station interactively controllable at said receiving station to generate a partial or complete volume data rendering result in real time by processing said volume data set using said volume data rendering method and rendering parameters specified by said user;
- a data transmitter provided at said transmitting station for transmitting said processed result to said receiving station; and
- display means for displaying the requested rendering result and rendering parameters at said receiving station.
2. The system of claim 1 further including:
- security and data management means for preventing an unauthorized user from gaining access to said data set from said system.
3. The system of claim 2 wherein:
- said security and data management means further include means for employing firewalls during the data transmission and/or for encrypting demographics of said data set.
4. The system of claim 1 further including:
- means included in said transmitting station for compressing data to be transmitted;
- means for transmitting said compressed data from said transmitting station to said receiving station through said network; and
- means included in said receiving station for decompressing said transmitted data.
5. The system of claim 4 wherein:
- said compressing means and decompressing means are operable in accordance with each of a plurality of compression/decompression methods, the particular method used being alternatively selected by said user through said user interface, or by an automated computer program.
6. The system of claim 1 wherein:
- said receiving station includes means for computing the remaining part of said rendering result.
7. The system of claim 1 wherein:
- said system's software is installed, managed and upgraded at one of said stations, the software for the other station being alternatively supplied at each time of use over said network or on a permanent basis.
8. The system of claim 1 further including:
- management and software distribution means included in said system for charging the use of said system alternatively on per license basis or on per use basis.
9. The system of claim 1 wherein:
- said system has a plurality of operation modes, the particular operation mode used being alternatively selected by said user through said user interface, or by an automated computer program.
10. The system of claim 1 wherein:
- said receiving station is provided with software which is usable with a web browser.
11. The system of claim 1 wherein:
- said receiving station comprises one of multiple receiving stations interconnected by said network so that the input and the display at one of said receiving stations can be viewed by others of said receiving stations.
12. The system of claim 1 further including:
- data transmission means for transmitting images with progressive refinement.
13. The system of claim 1 wherein:
- said user interface means includes image processing tools and data editing tools for editing said data set.
14. The system of claim 1 wherein:
- the data transmission is controlled by the transmission parameters, said transmission parameters being alternatively selected by said user through said user interface, or by an automated program.
15. The system of claim 1 wherein:
- said user interface means comprises means for enabling said user to specify different data rendering requests resulting from different rendering parameters, different rendering methods, and/or different data sets from one or multiple data acquisition methods, and
- to specify a method to integrate said different data rendering results into at least one composite rendering result; and
- said display means for presenting at said receiving station said composite rendering result and a plurality of parameters used for generating said composite rendering result.
16. The system of claim 1, wherein said display means, user interface means and image processor are configured for enabling said user to interactively view said displayed requested rendering result and parameters and specify adjusted volume data rendering methods and parameters to generate updated rendering results.
17. The system of claim 1, wherein said transmitting station and said image processor are couplable to a plurality of receiving stations for serving multiple receiving stations concurrently.
18. The system of claim 1, wherein said transmitting station is implemented with a plurality of computers.
19. A method for locally generating a volume data rendering result in accordance with a remote request for processing of previously acquired and locally stored patient information and data, comprising:
- locally storing at least one patient volume data set;
- locally receiving an identification of at least one specified patient volume data set and at least one request for volume data rendering from a remote user, the request for volume data rendering comprising a volume data rendering method and rendering parameters to be applied on said specified patient volume data set;
- locally generating a partial or complete volume data rendering result in real time by processing said specified patient volume data set using said volume data rendering method and rendering parameters received from said remote user; and
- locally transmitting said processed result for remotely displaying said requested rendering result and said rendering parameters.
20. The method of claim 19, further comprising:
- locally receiving new requests for volume data rendering interactively issued by said remote user based on feedback from said displayed requested rendering result and said rendering parameters, said new requests comprising an adjusted volume data rendering method and adjusted rendering parameters;
- locally generating an updated partial or complete volume data rendering result in real time by processing said specified patient volume data set using said adjusted volume data rendering method and adjusted rendering parameters; and
- repeating said new requests for volume data rendering and said generation of said updated partial or complete volume data rendering result until a desired rendering result is achieved.
21. A method, comprising:
- locally receiving, at a computer system from a remote user, a request for volume data rendering for medical imaging volume data, the request including an identification of: a volume data set, a volume data rendering method, and a rendering parameter to be applied on the volume data set, the request generated from a user interface operated by the remote user enabling specification of at least one of a plurality of different rendering parameters, at least one of a plurality of different rendering methods, at least one of a plurality of different data sets from one or multiple data acquisition methods, and a method for integration of the request with at least one additional request into a complete volume data rendering result;
- locally generating, at the computer system, the complete volume data rendering result by processing the volume data set using the volume data rendering method and the rendering parameter received from the remote user; and
- locally transmitting, from the computer system, the complete volume data rendering result to a remote machine for remotely displaying the complete volume data rendering result, the remote machine configured for displaying the complete volume data rendering result and a plurality of parameters used for generating the complete volume data rendering result.
22. The method of claim 21, further comprising:
- locally generating, at the computer system, an updated volume data rendering result in real time by processing the volume data set using at least one of a received adjusted volume data rendering method and a received adjusted rendering parameter.
23. The method of claim 21, wherein locally generating the complete volume data rendering result occurs in real time in response to the request for volume data rendering.
24. The method of claim 21, wherein receiving the request for volume data rendering permits specifying, by the remote user, of a particular volume data rendering method by selecting from a plurality of available volume data rendering methods.
25. The method of claim 24, wherein the available volume data rendering methods include multiplanar reformation (MPR), maximum intensity projection (MIP), and surface rendering.
26. The method of claim 21, further comprising:
- receiving, at the computer system, a new additional request for volume data rendering that is interactively issued by a remote user as a result of viewing the complete volume data rendering result, the new additional request for volume data rendering including a new different volume data rendering method;
- locally generating, at the computer system, a new volume data rendering result by processing the new additional request for volume data rendering using the new different volume data rendering method; and
- locally transmitting, from the computer system, the new volume data rendering result to the remote machine for remotely displaying the new volume data rendering result.
27. A system comprising:
- at least one transmitting station operating as a server, the transmitting station including an image processor interactively controllable by at least one remote client receiving station over a network to generate and provide in a data transmission to the remote client receiving station a complete medical image volume data rendering result in response to a volume data rendering request received by the transmitting station from the remote client receiving station, the volume data rendering request specifying at least one volume data rendering method and at least one rendering parameter to be applied on a volume data set by the transmitting station, wherein the volume data rendering request is generated from a user interface provided by the remote client receiving station enabling specification of at least one of a plurality of different rendering parameters, at least one of a plurality of different rendering methods, and at least one of a plurality of different data sets from one or multiple data acquisition methods, and wherein the volume data rendering request specifies a method to integrate the request into at least one composite rendering result.
28. The system of claim 27, wherein the image processor is operable to implement a plurality of available rendering methods, and wherein the image processor generates the rendering result using the at least one volume data rendering method specified in the volume data rendering request received from the remote client receiving station.
29. The system of claim 28, wherein the plurality of available rendering methods of the image processor include multiplanar reformation (MPR), maximum intensity projection (MIP), and surface rendering.
30. The system of claim 27, wherein the image processor is operable to generate the rendering result by applying a combination of multiple rendering methods on the volume data set.
31. The system of claim 27, wherein the image processor is operable to interactively compute and provide in real-time a first partial or complete rendering result in response to a first volume data rendering request from the remote client receiving station, according to a first volume data rendering method specified in the first volume data rendering request, and wherein the image processor is operable to compute and provide in real-time a second partial or complete rendering result in response to a second volume data rendering request from the remote client receiving station, according to a second volume data rendering method specified in the second volume data rendering request, wherein the first and second volume data rendering requests specify the same or different first and second volume data rendering methods.
32. The system of claim 27, wherein the system is operable to generate the volume data rendering result in real time.
33. The system of claim 27, wherein the data transmission is controlled by at least one transmission parameter, the transmission parameter being alternatively selected by a user through the user interface, or by an automated program.
34. The system of claim 27, wherein the system is operable to receive and process different data rendering requests resulting from different rendering parameters, different rendering methods, or different data sets from one or multiple data acquisition methods, and is operable to integrate the different data rendering results into at least one composite rendering result.
35. A method comprising:
- sending from a client computing system to a remote server a request for volume data rendering, the request including an identification of a volume data set, a volume data rendering method, and a rendering parameter to be applied on the volume data set, the request generated from a user interface provided by the client computing system enabling specification of at least one of a plurality of different rendering parameters, at least one of a plurality of different rendering methods, at least one of a plurality of different data sets from one or multiple data acquisition methods, and a method for integration of the request with at least one additional request into a rendering result;
- receiving at the client computing system from the remote server the rendering result, the rendering result including an at least partial volume data rendering result obtained by processing the volume data set at the remote server using the volume data rendering method and the rendering parameter; and
- displaying the rendering result by the client computing system in a collaborative mode, wherein at least a portion of user input at the client is provided by at least one additional client operating in the collaborative mode.
36. A system, comprising:
- at least one receiving station controllable by at least one user of the system, the receiving station comprising one of multiple receiving stations interconnected by at least one network, and the receiving station communicatively coupled to at least one transmitting station physically separated from the receiving station through the network;
- a user interface provided at the receiving station and enabling a user to specify at least one medical imaging volume data set previously acquired and stored in the transmitting station, and to specify at least one request for volume data rendering specifying a volume data rendering method and rendering parameters to be applied on the volume data set, wherein the volume data rendering request is issued from the user interface and provides specification of: at least one of a plurality of different rendering parameters, at least one of a plurality of different rendering methods, at least one of a plurality of different data sets from one or multiple data acquisition methods, and a method for integration of the request with at least one additional request into a requested rendering result; and
- a display configured to output the requested rendering result and rendering parameters at the receiving station;
- wherein the multiple receiving stations are interconnected by the network so that the output of the display at the receiving station can be viewed by other of the multiple receiving stations.
37. The system of claim 36, wherein the user interface at the receiving station provides interactive control of an image processor at the transmitting station to generate a partial or complete volume data rendering result in real time at the transmitting station by processing the volume data set using the volume data rendering method and rendering parameters specified by the user.
38. The system of claim 36, wherein the system has a plurality of operation modes, the particular operation mode used being alternatively selected by the user through the user interface, or by an automated computer program, and wherein the plurality of operation modes includes a collaborative mode.
39. The system of claim 36, wherein the user interface enables the user to specify different data rendering requests resulting from different rendering parameters, different rendering methods, or different data sets from one or multiple data acquisition methods, and to specify a method to integrate the different data rendering results into at least one composite rendering result; and wherein the display is further configured to output the composite rendering result at the receiving station.
40. The system of claim 36, wherein the display, user interface, and image processor are configured to enable the user to interactively view the requested rendering result and parameters and to specify adjusted volume data rendering methods and parameters to generate updated rendering results.
41. The system of claim 36, wherein the transmitting station and the image processor are coupleable to a plurality of receiving stations for serving multiple receiving stations concurrently.
42. The system of claim 36, wherein the user interface provided at the receiving station permits the user to specify a first volume data rendering method of a plurality of available volume data rendering methods, to display a first rendering result using the first volume data rendering method on the volume data set, and based on viewing the displayed first rendering result, to then specify a second volume data rendering method of the plurality of available volume data rendering methods, and to display a second rendering result using the second volume data rendering method on the volume data set.
43. The system of claim 36, wherein the user interface provided at the receiving station permits the user to select one of several different volume data rendering methods for use on the volume data set, and wherein the user interface provided at the receiving station receives commands from an input device that permits the user to combine the different volume rendering methods used on the volume data set.
44. The system of claim 36, wherein cursor and axial, sagittal, and coronal display images are concurrently displayed and interactively updated on the display as a cursor position moves in the user interface.
45. The system of claim 36, wherein the display permits images from different volume rendering methods or different image data acquisition methods to be simultaneously displayed.
46. A system comprising:
- at least one receiving station controllable by at least one user of the system, the receiving station configured to be communicatively coupled through at least one network to at least one transmitting station that is physically separated from the receiving station, the receiving station including:
- a user interface, provided for execution and presentation at the receiving station, configured to enable the at least one user to interactively specify at least one medical imaging volume data set stored at the transmitting station, and to specify at least one volume data rendering method and at least one rendering parameter to be applied on a volume data set by the transmitting station to generate a volume data rendering result for transmitting from the transmitting station to the receiving station, wherein the user interface enables the at least one user to specify different data rendering requests resulting from different rendering parameters, different rendering methods, or different data sets from one or multiple data acquisition methods, and to specify integration of the different data rendering requests into at least one composite rendering result; and
- a display device, at the receiving station, to display the rendering result, wherein the display device displays the composite rendering result and a plurality of parameters used for generating the composite rendering result.
47. The system of claim 46, wherein the user interface provided for execution at the receiving station is provided at least in part within a web browser.
48. The system of claim 47, wherein software for operation at the receiving station is provided as a plug-in or client applet executing within the web browser.
49. The system of claim 46, wherein the receiving station is provided with software that is installed, managed and upgraded by the transmitting station, the software being supplied over the network.
50. The system of claim 46, wherein computation operations to create the rendering result are performed exclusively by the transmitting station.
51. A system comprising:
- at least one transmitting station operating as a server, the transmitting station comprising an image processor interactively controllable by at least one remote client receiving station over a network;
- wherein the image processor is configured to: generate and provide to the receiving station a partial or complete volume data rendering result in response to a volume data rendering request received by the transmitting station from the receiving station, the volume data rendering request specifying at least one volume data rendering method and at least one rendering parameter to be applied on a medical image volume data set by the transmitting station; and wherein the transmitting station is configured to: automatically adjust at least one configuration based on characteristics of the receiving station; locally receive new requests for volume data rendering interactively issued by a remote user of the receiving station based on feedback from the partial or complete volume data rendering result and the rendering parameters, the new requests including an adjusted volume data rendering method and adjusted rendering parameters, wherein the new requests for volume data rendering are generated from a user interface provided by the receiving station enabling specification of at least one of a plurality of different rendering parameters, at least one of a plurality of different rendering methods, and at least one of a plurality of different data sets from one or multiple data acquisition methods, and wherein the new requests respectively specify a method to integrate respective of the new requests into at least one composite rendering result; and locally generate an updated partial or complete volume data rendering result in real time by processing the specified medical imaging volume data set using the adjusted volume data rendering method and adjusted rendering parameters.
52. The system of claim 51, wherein the at least one transmitting station provides the remote client receiving station with software at each time of use or on an as-needed basis, the software being provided over the network.
53. The system of claim 51, the transmitting station being further configured to process the new requests for volume data rendering and re-generate the updated partial or complete volume data rendering result until a desired rendering result is achieved.
1001377 | May 2000 | EP |
1001379 | May 2000 | EP |
1001380 | May 2000 | EP |
1054347 | November 2000 | EP |
1054348 | November 2000 | EP |
1054349 | November 2000 | EP |
1054351 | November 2000 | EP |
1054353 | November 2000 | EP |
1054355 | November 2000 | EP |
1054356 | November 2000 | EP |
1054357 | November 2000 | EP |
1054358 | November 2000 | EP |
1054359 | November 2000 | EP |
1054383 | November 2000 | EP |
1054384 | November 2000 | EP |
1054385 | November 2000 | EP |
1069528 | January 2001 | EP |
1069530 | January 2001 | EP |
1069532 | January 2001 | EP |
1071041 | January 2001 | EP |
1081651 | March 2001 | EP |
1081652 | March 2001 | EP |
1081653 | March 2001 | EP |
1089225 | April 2001 | EP |
1089234 | April 2001 | EP |
1089235 | April 2001 | EP |
1093085 | April 2001 | EP |
1195717 | April 2002 | EP |
1195718 | April 2002 | EP |
1195719 | April 2002 | EP |
1195720 | April 2002 | EP |
1209618 | May 2002 | EP |
1209629 | May 2002 | EP |
11-239165 | August 1999 | JP |
2002-183746 | June 2002 | JP |
2002-183747 | June 2002 | JP |
WO-03021850 | March 2003 | WO |
WO-03041001 | May 2003 | WO |
- “2D and 3D Progressive Transmission Using Wavelets”, www.cs.wpi.edu/~matt/courses/cs563/talks/Wavelet_Presentation/, (Mar. 25, 1997), 1-6.
- “A Brief Description of the Gigabit Testbed Initiative”, http://web.archive.org/web/19980703071107/http://www0.cnri.reston.va.us/overview/html, (archived Jul. 3, 1998), 9 pgs.
- “A Prototype Distributed Visualization System”, http://www.hpcc.arc.nasa.gov/reports/annrep97/ess/ww42.htm, (observed Oct. 5, 1999), 11 pgs.
- “Adding Data Visualization to Instrumentation”, (Advanced Visual Systems, Inc.) http://web.archive.org/web/1999042906060638/http://www.avs.com/solution/success/papers/testmea.htm, (archived Apr. 29, 1999), 7 pgs.
- “An Interactive Remote Visualization Environment for an Electromagnetic Scattering Simulation on a High Performance Computing System”, http://www.npac.syr.edu/users/gcheng/CEM/ems.html, (observed Jul. 14, 1999), 2 pgs.
- “U.S. Appl. No. 09/945,479, Non Final Office Action mailed Nov. 17, 2004”, 16 pgs.
- “U.S. Appl. No. 09/945,479, Notice of Allowance mailed Jun. 16, 2005”, 7 pgs.
- “U.S. Appl. No. 09/945,479, Notice of Allowance mailed Nov. 4, 2005”, 6 pgs.
- “U.S. Appl. No. 09/945,479, Response filed Mar. 17, 2005 to Non Final Office Action mailed Nov. 17, 2004”, 10 pgs.
- “U.S. Appl. No. 10/008,162, Final Office Action mailed Jan. 30, 2007”, 28 pgs.
- “U.S. Appl. No. 10/008,162, Final Office Action mailed Apr. 1, 2008”, 28 pgs.
- “U.S. Appl. No. 10/008,162, Final Office Action mailed Sep. 23, 2005”, 20 pgs.
- “U.S. Appl. No. 10/008,162, Non-Final Office Action mailed Feb. 24, 2005”, 15 pgs.
- “U.S. Appl. No. 10/008,162, Non-Final Office Action mailed May 3, 2006”, 22 pgs.
- “U.S. Appl. No. 10/008,162, Non-Final Office Action mailed Aug. 13, 2007”, 23 pgs.
- “U.S. Appl. No. 10/008,162, Response filed Feb. 6, 2006 to Final Office Action mailed Sep. 23, 2005”, 18 pgs.
- “U.S. Appl. No. 10/008,162, Response filed Jun. 24, 2005 to Non-Final Office Action mailed Feb. 24, 2005”, 12 pgs.
- “U.S. Appl. No. 10/008,162, Response filed Jul. 2, 2007 to Final Office Action mailed Jan. 30, 2007”, 17 pgs.
- “U.S. Appl. No. 10/008,162, Response filed Nov. 2, 2006 to Non-Final Office Action mailed May 3, 2006”, 22 pgs.
- “U.S. Appl. No. 10/008,162, Response filed Dec. 13, 2007 to Non Final Office Action mailed Aug. 13, 2007”, 21 pgs.
- “U.S. Appl. No. 11/229,452, Final Office Action mailed Mar. 1, 2011”, 9 pgs.
- “U.S. Appl. No. 11/229,452, Final Office Action mailed Apr. 21, 2009”, 33 pgs.
- “U.S. Appl. No. 11/229,452, Final Office Action mailed Jun. 23, 2011”, 4 pgs.
- “U.S. Appl. No. 11/229,452, Notice of Allowance mailed Sep. 13, 2011”, 5 pgs.
- “U.S. Appl. No. 11/229,452, Response filed May 2, 2011 to Final Office Action mailed Mar. 1, 2011”, 10 pgs.
- “U.S. Appl. No. 11/229,452, Response filed Aug. 11, 2011 to Non Final Office Action mailed Jun. 23, 2011”, 14 pgs.
- “U.S. Appl. No. 11/229,452, Response filed Aug. 21, 2009 to Non Final Office Action mailed Apr. 21, 2009”, 11 pgs.
- “Argonne-USC Researchers Win GII Next Generation Award for Advanced Computing Infrastructure”, http://web.archive.org/web/19990204032736/http://www.npaci.edu/News/98/042298-gusto.html, (Apr. 22, 1998), 2 pgs.
- “Corridor One: An Integrated Distance Visualization Environment for SSI and ASCI Applications”, (Proposal to DOE 99-09) http://www-fp.mcs.anl.gov/fl/research/Proposals/co.htm, (observed Oct. 5, 1999), 29 pgs.
- “Demand for General Availability of Visualization Techniques”, http://www.ts.go.dlr.de/sm-sk_info/library/documents/EGSciVis97/VaWX5Fproto-3.html, (observed Jul. 15, 1999), 1 pg.
- “Department of Defense High-Performance Computing Modernization Office”, http://www.ncsa.uiuc.edu/Vis/PET/, (observed Jul. 15, 1999), 1 pg.
- “Distributed Visualization Task—1995 HPCC Annual Review Reports”, http://web.archive.org/web/19970607124635/http://olympic.jpl.nasa.gov/Reports/Highlights95/ML_DVT.html, (archived Jun. 7, 1997), 2 pgs.
- “DOD PET—Trends in Graphics and Visualization”, http://www.ncsa.uiuc.edu/Vis/Publications/trends.html, (Jan. 1998), 3 pgs.
- “DOD PET Strategic Plan for Visualization”, http://www.ncsa.uiuc.edu/Vis/PET/strategy.html, (observed Jul. 15, 1999), 1 pg.
- “DOD PET Visualization Plan, PET Initiatives”, http://www.ncsa.uiuc.edu/Vis/PET/timelineEfforts.html, (observed Jul. 15, 1999), 1 pg.
- “DOD PET Visualization Plan, Technology Trends”, http://www.ncsa.uiuc.edu/Vis/PET/timelineTrends.html, (observed Jul. 15, 1999), 1 pg.
- “DOD PET Visualization Plan, User Needs”, http://www.ncsa.uiuc.edu/Vis/PET/timelineNeeds.html, (observed Jul. 15, 1999), 1 pg.
- “EMERGE—Application Projects/Toolkits that will be deployed over EMERGE”, http://www.evl.uic.edu/cavern/EMERGE/applications.html, (observed Oct. 7, 1999), 2 pgs.
- “ERSUG Meeting Minutes”, http://home.nersc.gov/about/ERSUG/meeting_info/Apr98_minutes.html, (observed Jul. 15, 1999), 6 pgs.
- “Experiments in Remote Visualization”, http://woodall.ncsa.uiuc.edu/dbock/projects/RemoteViewIndex.html, (observed Jul. 14, 1999), 2 pgs.
- “Gigabit Testbeds Final Report”, http://web.archive.org/web/19980213052026/http://www.cnri.reston.va.us/gigafr/noframes/section-4-25.htm, (archived Feb. 13, 1998), 5 pgs.
- “H Innovation Trade Show Brochure”, (2001), 2 Pages.
- “High Performance Internet Access for Research and Education in Science and Engineering”, (The State University of New Jersey—Rutgers—Grant Proposal) http://ephesus.rutgers.edu/hypernet/origprop.net, (observed Jul. 15, 1999), 10 pgs.
- “HP Internet Philanthropic Initiative—1998 Update”, http://web.archive.org/web/19990922074930/http://www.infomed.dia.fi.upm.es/english/HP/proposal1998.html, (archived Sep. 22, 1999), 10 pgs.
- “Internet 2 Research Applications”, (University of Alabama at Birmingham) http://web.archive.org/web/19990302023111/http://www.uab.edu/internet2/meritapps1.html, (archived Mar. 2, 1999), 6 pgs.
- “Introduction—Data Visualization”, http://www.npac.syr.edu/users/gcheng/homepage/thesis/node27.html, (observed Jul. 15, 1999), 2 pgs.
- “NCSA Report, SIGGRAPH 98: Visualization Software”, http://www.ncsa.uiuc.edu/Vis/Publications/SIGGRAPH98/s10.html, (observed Jul. 15, 1999), 1 pg.
- “NCSA Vis&VE Trip Report: IEEE VR 99 (nee VRAIS99)”, http://www.ncsa.uiuc.edu/Vis/Trips/VR99.html, (observed Jul. 15, 1999), 12 pgs.
- “NCSA Visualization and Virtual Environments”, http://www.ncsa.uiuc.edu/Vis/, (observed Jul. 15, 1999), 2 pgs.
- “Next Generation Internet—1999 Program”, http://www.er.doe.gov/production/octr/mies/press99-09.html, (observed Oct. 12, 1999), 2 pgs.
- “Northeast Parallel Architectures Center—Projects”, http://www.npac.syr.edu/Projects/index.html, (observed Jul. 15, 1999), 4 pgs.
- “Northeast Parallel Architectures Center—Mission”, http://www.npac.syr.edu/Mission/index.html, (observed Oct. 12, 1999), 1 pg.
- “Notice Inviting Research Grant Applications”, Federal Register, vol. 64, No. 5, (Jan. 8, 1999), 3 pgs.
- “PACS—Picture Archiving and Communications Systems”, http://www.medfusion.com/Arena/PACS/pacs.html, (observed Oct. 12, 1999), 1 pg.
- “Progressive Image Transmission”, www.vision.ee.ethz.ch/~rsia/talks/RSL_talk/pit_3.html, (observed Sep. 28, 2001), 1 pg.
- “Rationale for a WWW-Based Visualization Service”, http://www.ts.go.dlr.de/sm-sk_info/library/documents/EGSciVis97/VaWX5Fproto-2.html, (observed Jul. 15, 1999), 1 pg.
- “Scientific Visualization Sites (Mirror)”, http://puh.cb.uu.se/˜rogerh/visWeblets.html, (observed Oct. 5, 1999), 5 pgs.
- “TeraRecon's AquariusNET™ Server”, (2002), 7 pgs.
- “The Clinical Practice Guidelines Project”, http://www.infomed.dia.fi.upm.es/english/guidelines.html, (observed Oct. 7, 1999), 2 pgs.
- “The Realization Report: Issue No. 3”, http://www.itd.nrl.navy.mil/ONR/realization_report/rosenblum.003.html, (observed Jul. 15, 1999), 4 pgs.
- “Use of Remote Visualization Methods”, http://www.ts.go.dlr.de/sm-sk_info/library/documents/EGSciVis97/VaWX5Fproto-4.html, (observed Jul. 15, 1999), 1 pg.
- “Vol. III—Technical Proposal for Collaborative Interaction and Visualization”, (BAA 93-01-PKRD) http://www.npac.syr.edu/users/gcf/romelabciv/prop.html, (observed Oct. 12, 1999), 28 pgs.
- “Web-Based Visualization Server for 3D Reconstruction”, http://felix.uttgm.ro/˜dradoiu/ip/Laborator/application.html, (observed Oct. 12, 1999), 4 pgs.
- Ang, C. S., et al., “Integrated Control of Distributed-Volume Visualization Through the World-Wide-Web”, Proceedings of the IEEE Conference Visualization '94, (1994), 13-20.
- Bajaj, C. L., et al., “The VAIDAK Medical Image Model Reconstruction Toolkit”, Proceedings of the 8th SIGAPP Symposium on Applied Computing, (Abstract Only), (1993), 1 pg.
- Baker, M. P., et al., “Battleview: Touring a Virtual Battlefield”, http://www.ncsa.uiuc.edu/Vis/Publications/bv98.html, (observed Jul. 15, 1999), 5 pgs.
- Baker, M. P., et al., “Visualization of Damaged Structures”, http://www.ncsa.uiuc.edu/Vis/Publications/damage.html, (observed Jul. 15, 1999), 6 pgs.
- Bock, D., et al., “Collaborative Visualization”, http://www.ncsa.uiuc.edu/Vis/Publications/collabFramework.html, (observed Jul. 15, 1999), 6 pgs.
- Bock, D., “Remote Visualization Using the World Wide Web”, http://www.ncsa.uiuc.edu/Vis/Publications/remoteVisHTTP.html, (observed Jul. 15, 1999), 5 pgs.
- Bossart, P.-L., “Hypertools in Image and Volume Visualization”, Proceedings of the Fourth Annual Tcl.Tk Workshop, (Abstract Only), (1996), 1 pg.
- Casey, B., “HInnovation Adds Internet Wrinkle to 3-D Imaging”, http://www.auntminnie.com/index.asp?sec=rca&sub=def&pag=dis&ItemID=50700, (observed Dec. 11, 2002), 1 pg.
- Cavanagh, P. M., et al., “Commentary—Delivering Imaging to Primary Care in the Next Millennium”, The British Journal of Radiology, 71, (1998), 805-807.
- Chen, L. S., et al., “A Distributed and Interactive Three-Dimensional Medical Image System”, Computerized Medical Imaging and Graphics, 18(5), (1994), 325-337.
- Cimino, C., et al., “Clinical Applications of an ATM/Ethernet Network in Departments of Neuroradiology and Radiotherapy”, Stud Health Technol Inform., 43(Part B), (1997), 606-610.
- Coleman, J., et al., “TeleInViVo: A Collaborative Volume Visualization Application”, Stud Health Technol Inform. 39, (1997), 115-124.
- Cosic, D., “An Open Medical Imaging Workstation Architecture for Platform-Independent 3-D Medical Image Processing and Visualization”, IEEE Transactions on Information Technology in Biomedicine, 1(4), (1997), 279-83.
- Dobbins, H., et al., “Multisite Three-Dimensional Brain Visualization”, http://www.uab.edu/internet2/ala_tele-collaboration.html, (observed Oct. 12, 1999), 1 pg.
- Eichelberg, M., et al., “Retain: Multimedia Teleradiology on the Pan-European Information Superhighway”, CAR' 96 Computer Assisted Radiology. Proceedings of the International Symposium on Computer and Communication Systems for Image Guided Diagnosis and Therapy, (Abstract Only), (1996), 1 pg.
- Fernandez, I., et al., “ARMEDA: Accessing Remote Medical Databases Over the World Wide Web”, http://www.infomed.dia.fi.upm.es/Armeda/armeda—MIE97.html, (observed Oct. 7, 1999), 4 pgs.
- Furuie, S., et al., “Telemedicine: Remote Analysis and Quantification of Nuclear Medicine Images Using Java”, http://incor.usp.br/spdweb/projects/p22/p22.html, (observed Oct. 12, 1999), 2 pgs.
- Hendin, O., et al., “Medical Volume Rendering Over the WWW Using VRML and JAVA”, Stud Health Technol Inform., 50, (1998), 34-40.
- Henri, C. J., et al., “Design and implementation of World Wide Web-based tools for image management in computed tomography, magnetic resonance imaging, and ultrasonography”, Journal of Digital Imaging, 10(3 Suppl 1), (1997), 77-79.
- Hightower, D., et al., “Computer Radiology: Ship to Shore”, CAR '96 Computer Assisted Radiology. Proceedings of the International Symposium on Computer and Communication Systems for Image Guided Diagnosis and Therapy, (Abstract Only), (1996), 1 pg.
- Kim, N., et al., “Web Based 3-D Medical Image Visualization on the PC”, Proceedings of the 9th World Congress on Medical Informatics MEDINFO '98, 9 (Part 2), (1998), 1105-1110.
- Kindratenko, V., et al., “Sharing Virtual Environments Over a Transatlantic ATM Network in Support of Distant Collaboration in Vehicle Design”, http://www.ncsa.uiuc.edu/VEG/DVR/VE98/article.html, (observed Jul. 15, 1999), 8 pgs.
- Lee, J. S., et al., “Volumetric Visualization of Head and Neck CT Data for Treatment Planning”, International Journal of Radiation, Oncology, Biology, Physics, 44(3), (Abstract Only), (1999), 2 pgs.
- Leigh, J., et al., “LIMBO/VTK: A Tool for Rapid Tele-Immersive Visualization”, Proceedings of IEEE Visualization '98, (1998), 4 pgs.
- Liu, P-W, et al., “Distributed Computing: New Power for Scientific Visualization”, IEEE Computer Graphics and Applications, 16(3), (1996), 42-51.
- Lu, T., et al., “Compression Techniques in Teleradiology”, Proceedings of the SPIE (vol. 3808), Applications of Digital Image Processing XXII, (Abstract Only), (1999), 1 pg.
- Macedonia, M. R., et al., “A Transatlantic Research and Development Environment”, IEEE Computer Graphics and Applications, 17(2), (1997), 76-82.
- Makris, L., et al., “Teleworks: A CSCW Application for Remote Medical Diagnosis Support and Teleconsultation”, IEEE Transactions on Information Technology in Biomedicine, 2(2), (Jun. 1998), 62-73.
- Malassiotis, S., et al., “Coding and Visualization of 3D Medical Data for Low Bitrate Communication”, CAR '96 Computer Assisted Radiology. Proceedings of the International Symposium on Computer and Communication Systems for Image Guided Diagnosis and Therapy, (Abstract Only), (1996), 1 pg.
- Markle, S., et al., “Distributed Visualization—How to improve the quality of 3D medical volume rendering at almost no costs”, Europacs, Oct. 1998.
- Marovic, Branko, et al., “Visualization of 3D fields and medical data using VRML”, Future Generation Computer Systems, 14(1-2), (Jun. 1998), 33-49.
- Martin, D. C., et al., “Libraries in the Information Age”, http://web.archive.org/web/19990223220334/www.ckm.ucsf.edu/papers/LibraryInformationAge/, (archived Feb. 23, 1999), 5 pgs.
- Mercurio, P. J., et al., “The Distributed Laboratory: An Interactive Visualization Environment for Electron Microscope and 3D Imaging”, http://www.acm.org/pubs/toc/Abstracts/cacm/129891.html, (observed Jul. 15, 1999), 2 pgs.
- Mun, S. K., et al., “Health Care Using High-Bandwidth Communication to Overcome Distance and Time Barriers for the Department of Defense”, Proceedings of the SPIE (vol. 1785)—Enabling Technologies for High-Bandwidth Applications, (Abstract Only), (1993), 1 pg.
- Napel, S., “3D Displays for Computed Tomography”, In: Medical CT and Ultrasound: Current Technology and Applications, Published by Advanced Medical Publishing, (1995), 603-626.
- Norris, P. R., et al., “Reliable Remote Visualization of Real-Time Biomedical Data”, (Abstract Only), (1998), 1 pg.
- Orphanoudakis, S. C., et al., “Technological advances in teleradiology”, European Journal of Radiology, 22(3), (Jun. 1996), 205-217.
- Pandya, A. S., et al., “3D Reconstruction, Visualization, and Measurement of MRI Images”, Proceedings of the SPIE (vol. 3640)—Three-Dimensional Image Capture and Applications II, (Abstract Only), (1999), 1 pg.
- Pelizzari, S. A., et al., “Volumetric Visualization of Anatomy for Treatment Planning”, International Journal of Radiation, Oncology, Biology, Physics, 34(1), (Abstract Only), (1996), 2 pgs.
- Phalke, V., “Remote Visualization for Computational Simulations”, http://www.science.doe.gov/sbir/awards_abstracts/sbir/cycle16/phase1/024.htm, (observed Oct. 12, 1999), 1 pg.
- Prior, F., et al., “Communication Technology for Telemedicine”, Proceedings of the National Forum: Military Telemedicine On-line Today Research Practice and Opportunities, (1995), 1 pg.
- Rhee, T. H., et al., “An Effective Visualization Technique for Huge Volume Data”, Journal of KISS(A) (Computer Systems and Theory), (Abstract Only), (1997), 1 pg.
- Robb, R. A., et al., “Patient-Specific Anatomic Models from Three Dimensional Medical Image Data for Clinical Applications in Surgery and Endoscopy”, Journal of Digital Imaging, 10(3,1), (Abstract Only), (1997), 1 pg.
- Roush, W., “To Johnson, the Grid Means Access”, NAS News, (1999), 2 pgs.
- Ruggiero, C., “Teleradiology: A Review”, Journal of Telemedicine and Telecare, 4, (1998), 25-35.
- Sakamoto, Y., et al., “Three-Dimensional Segmentation of Magnetic Resonance Images Using Neural Network”, Proceedings of ACCV '95. Second Asian Conference on Computer Vision, vol. 1, (Abstract Only), (1995), 1 pg.
- Salomie, A., et al., “A Teleworking Tool With Progressive Transmission Capabilities for Medical Images”, CARS '99 Computer Assisted Radiology and Surgery. Proceedings of the 13th International Congress and Exhibition, (Abstract Only), (1999), 1 pg.
- Samothrakis, S., et al., “WWW Creates New Interactive 3D Graphics and Collaborative Environments for Medical Research and Education”, International Journal of Medical Informatics, 47(1-2), (1997), 69-73.
- Santarelli, M. F., et al., “A Parallel System for Dynamic 3D Medical Imaging”, Proceedings of High-Performance Computing and Networking. International Conference and Exhibition, (Abstract Only), (1997), 1 pg.
- Silverstein, et al., “Web-Based Segmentation and Display of Three-Dimensional Radiologic Image Data”, Stud Health Technol Inform., 50, (1998), 53-59.
- Smith, P. H., et al., “Data and Visualization Corridors: Report on the 1998 DVC Workshop Series”, California Institute of Technology Report CACR-164, (Sep. 1998), 54 pgs.
- Wilkinson, E. P., et al., “Remote-Rendered 3D CT Angiography (3DCTA) as an Intraoperative Aid in Cerebrovascular Neurosurgery”, Computer Aided Surgery, 4, (1999), 256-263.
- Wong, S. T. C., et al., “Interactive Query and Visualization of Medical Images on the World Wide Web”, http://web.archive.org/web/20000412115341/http://www.lri.ucsf.edu/polymap/paper/spie96.html, (archived Apr. 12, 2000), 11 pgs.
- Yun, D. Y. Y., et al., “Sharing Computational Resources and Medical Images Via ACTS-Linked Networks”, [Online]. Retrieved from the Internet: <URL: http://web.archive.org/web/20020620160931/http://web.ptc.org/library/ptr/june96/2.html>, (Jun. 1996; archived Jun. 20, 2002), 12 pgs.
- Zuiderveld, K. J., et al., “Clinical Evaluation of Interactive Volume Visualization”, Proceedings of the 7th Conference on Visualization '96, (1996), 367-370.
- “U.S. Appl. No. 09/434,088, Non Final Office Action mailed Oct. 1, 2002”, 7 pgs.
- “U.S. Appl. No. 09/434,088, Notice of Allowance mailed May 2, 2003”, 8 pgs.
- “U.S. Appl. No. 09/434,088, Response filed Dec. 27, 2002 to Non Final Office Action mailed Oct. 1, 2002”, 12 pgs.
Type: Grant
Filed: Nov 21, 2011
Date of Patent: Jul 2, 2013
Assignee: Vital Images, Inc. (Minnetonka, MN)
Inventors: Hui Hu (Seattle, WA), Jun Zhang (Shorewood, WI)
Primary Examiner: Yon Couso
Application Number: 13/301,600
International Classification: G06K 9/00 (20060101); G09B 23/28 (20060101);