INTRACARDIAC ECHOCARDIOGRAPHY IMAGE RECONSTRUCTION IN COMBINATION WITH POSITION TRACKING SYSTEM
A system and method to display a four-dimensional (4D) model of an imaged anatomy is provided. The system comprises a controller, and an imaging system including an imaging probe in communication with the controller. The imaging probe can acquire generally real-time, 3D image data relative to a direction of image acquisition along an imaging plane. The system also includes a tracking system in communication with the controller. The tracking system includes at least one tracking element integrated with the imaging probe. The system is operable to process the generally real-time, 3D image data acquired by the imaging probe relative to generally real-time tracking information acquired by the tracking system so as to display the 4D model of the imaged anatomy.
This application is a continuation application and claims priority to and the benefit of U.S. patent application Ser. No. 12/060,714, filed Apr. 1, 2008, and entitled “INTRACARDIAC ECHOCARDIOGRAPHY IMAGE RECONSTRUCTION IN COMBINATION WITH POSITION TRACKING SYSTEM”, which claims the benefit of U.S. Provisional Application No. 60/938,442, filed on May 16, 2007, and entitled “4D INTRACARDIAC ECHOCARDIOGRAPHY RECONSTRUCTION WITH ASSISTANCE OF EM POSITION TRACKER” all of which are hereby incorporated by reference as if fully set forth herein.
BACKGROUND

The subject matter herein generally relates to medical imaging, and more specifically, to a system and method to navigate a tool through an imaged subject.
Image-guided surgery is a developing technology that generally provides a surgeon with a virtual roadmap into a patient's anatomy. This virtual roadmap allows the surgeon to reduce the size of entry or incision into the patient, which can minimize pain and trauma to the patient and result in shorter hospital stays. Examples of image-guided procedures include laparoscopic surgery, thoracoscopic surgery, endoscopic surgery, etc. Types of medical imaging systems, for example, computerized tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), ultrasound (US), radiological machines, etc., can be useful in providing static image guiding assistance to medical procedures. The above-described imaging systems can provide two-dimensional or three-dimensional images that can be displayed to provide a surgeon or clinician with an illustrative map to guide a tool (e.g., a catheter) through an area of interest of a patient's body.
When performing a medical procedure, it is desired to calibrate or align the acquired image data of the imaged subject with the tracked tool so as to navigate through the imaged subject. Yet, the sensors that track the tool and the detectors that acquire the image data may not be precisely located due to manufacturing variation. One example of an application of image-guided surgery is an interventional procedure to treat cardiac disorders or arrhythmias. Heart rhythm disorders or cardiac arrhythmias are a major cause of mortality and morbidity. Atrial fibrillation is one of the most common sustained cardiac arrhythmias encountered in clinical practice. Cardiac electrophysiology has evolved into a clinical tool to diagnose these cardiac arrhythmias. As will be appreciated, during electrophysiological studies, probes, such as catheters, are positioned inside the anatomy, such as the heart, and electrical recordings are made from the different chambers of the heart.
A certain conventional image-guided surgery technique used in interventional procedures includes inserting a probe, such as an imaging catheter, into a vein, such as the femoral vein. The catheter is operable to acquire image data to monitor or treat the patient. Precise guidance of the imaging catheter from the point of entry and through the vascular structure of the patient to a desired anatomical location is progressively becoming more important. Current techniques typically employ fluoroscopic imaging to monitor and guide the imaging catheter within the vascular structure of the patient.
BRIEF SUMMARY

A technical effect of the embodiments of the system and method described herein includes increasing the field of view of image data acquisition employed to generate three- or four-dimensional reconstructions of images to guide an interventional surgery procedure. In general, as a surgeon moves the medical instrument with respect to the patient's anatomy, virtual images of the instrument or object are displayed simultaneously relative to real-time acquired image data represented in the model of the patient's anatomy. Another technical effect of the system and method described herein includes readily tracking the spatial relationship of the medical instruments or objects traveling through an operating space of the patient. Yet another technical effect of the system and method described herein includes reducing manpower, expense, and time to perform interventional procedures, thereby reducing health risks associated with long-term exposure of the subject to radiation.
According to one embodiment, a system operable to generate a four-dimensional (4D) model of an imaged anatomy is provided. The system comprises a controller and an imaging system including a 4D imaging probe in communication with the controller. The 4D imaging probe is operable to acquire real-time, 3D image data relative to a direction of image acquisition along an imaging plane. The system further includes a tracking system in communication with the controller. The tracking system includes at least one tracking element integrated with the 4D imaging probe. The system is operable to process the real-time, 3D image data acquired by the imaging probe relative to generally real-time tracking information acquired by the tracking system to generate the 4D model of the imaged anatomy.
According to another embodiment, a method of image acquisition of an imaged anatomy is provided. The method comprises the steps of acquiring a series of partial view 3D image data with a 4D imaging probe defined by an image coordinate system and a time reference; tracking a position of the 4D imaging probe relative to the time reference and a tracking coordinate system; generating a 4D model of the imaged anatomy by merging the series of partial view 3D image data defined relative to the time reference; and displaying the 4D model in superposition with a representation of the tracked position of the imaging probe.
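The flow of this method can be pictured, very loosely, with the following Python sketch; the function names, the synthetic volumes, and the maximum-intensity merge are assumptions made purely for illustration and are not the claimed implementation.

```python
import numpy as np

def acquire_partial_view(angle_deg):
    """Placeholder for one partial-view 3D acquisition at a given probe angle."""
    volume = np.zeros((32, 32, 32))
    volume[:, :, int(angle_deg) % 32] = 1.0  # fake echo data for illustration
    return volume

def track_probe(angle_deg):
    """Placeholder for the tracking system's report of probe position/orientation."""
    return {"angle_deg": angle_deg}

def merge_views(tagged_views):
    """Merge co-registered partial views into one model (here: simple maximum)."""
    return np.maximum.reduce([v for v, _pose in tagged_views])

# Acquire a series of partial views with their tracked poses, then merge and "display".
tagged = [(acquire_partial_view(a), track_probe(a)) for a in range(0, 360, 30)]
model = merge_views(tagged)
print(model.shape)
```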
Systems and methods of varying scope are described herein. In addition to the aspects of the subject matter described in this summary, further aspects of the subject matter will become apparent by reference to the drawings and with reference to the detailed description that follows.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments, which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken in a limiting sense.
An embodiment of the system 100 generally includes an image acquisition system 115, a steering system 120, a tracking system 125, an ablation system 130, an electrophysiology system 132 (e.g., a cardiac monitor, respiratory monitor, pulse monitor, etc., or combination thereof), and a controller or workstation 134.
An embodiment of the image acquisition system 115 includes a real-time, intracardiac echocardiography (ICE) imaging system 140 that employs ultrasound to acquire image data of the patient's anatomy and to merge acquired image data to generate a three-dimensional model of the patient's anatomy relative to time, generally herein referred to as a four-dimensional (4D) model or image. In accordance with another embodiment, the image acquisition system 115 is operable to fuse or combine acquired image data using above-described ICE imaging system 140 with pre-acquired or intra-operative image data or image models (e.g., two- or three-dimensional reconstructed image models) generated by another type of supplemental imaging system 142 (e.g., CT, MRI, PET, ultrasound, fluoroscopy, x-ray, etc. or combinations thereof).
According to the illustrated embodiment, the ICE catheter 105 generally includes a transducer array 150, a micromotor 155, a drive shaft 160, an interconnect 165, a motor control 175, and a catheter housing 170.
An embodiment of the catheter housing 170 generally encloses the transducer array 150, the micromotor 155, the drive shaft 160, and the interconnect 165. The catheter housing 170 may further enclose the motor control 175. The catheter housing is generally of a material, size, and shape adaptable to internal imaging applications and insertion into regions of interest of the imaged subject 110. At least a portion of the catheter housing 170 that intersects the ultrasound imaging volume or scanning direction is comprised of an acoustically transparent material (e.g., low attenuation and scattering, acoustic impedance near that of blood and tissue (Z˜1.5 MRayl)). In an embodiment, the space between the transducer array 150 and the housing 170 is filled with an acoustic coupling fluid (e.g., water) having an acoustic impedance and sound velocity near those of blood and tissue (e.g., Z˜1.5 MRayl, V˜1540 m/sec).
An embodiment of the transducer array 150 is a 64-element, one-dimensional array having a 0.110 mm azimuth pitch, 2.5 mm elevation, and 6.5 MHz center frequency. The elements of the transducer array 150 are electronically phased in order to acquire a sector image generally parallel to a longitudinal axis 180 of the catheter housing 170. In operation, the micromotor 155 mechanically rotates the transducer array 150 about the longitudinal axis 180. The rotating transducer array 150 captures a plurality of two-dimensional images for transmission to the ICE imaging system 140.
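For illustration only, the following Python sketch maps a point in one 2D sector image into 3D catheter coordinates for a given rotation of the array about the longitudinal axis 180; the geometry conventions and numeric values are assumptions, not the probe's actual specification.

```python
import numpy as np

def sector_point_to_catheter_frame(range_mm, azimuth_rad, rotation_rad):
    """Map a point in a 2D sector image (range, azimuth) into 3D catheter
    coordinates, assuming the image plane contains the longitudinal (z) axis
    and is rotated about that axis by rotation_rad. Illustrative geometry only."""
    # Point in the un-rotated image plane: x = lateral, z = along the catheter axis.
    x_plane = range_mm * np.sin(azimuth_rad)
    z_plane = range_mm * np.cos(azimuth_rad)
    p_plane = np.array([x_plane, 0.0, z_plane])

    # Rotation of the imaging plane about the longitudinal (z) axis.
    c, s = np.cos(rotation_rad), np.sin(rotation_rad)
    r_z = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return r_z @ p_plane

# Example: a pixel 30 mm deep, 10 degrees off-axis, with the array rotated 45 degrees.
print(sector_point_to_catheter_frame(30.0, np.deg2rad(10.0), np.deg2rad(45.0)))
```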
The tracking elements 185, 190, 195, 200 generally enable a surgeon to continually track the position and orientation of the catheters 105 or 184.
For example, tracking elements 185 and 190 can include EM field generators attached to the subject 110 and operable to generate an EM field, while tracking element 195 or 200 includes an EM sensor or sensor array operable, in combination with the EM generators 185 and 190, to generate tracking data of the tracking elements 185, 190 attached to the patient 110 relative to the microsensor 195 or 200 in generally real-time (e.g., continuously). According to one embodiment of the series of tracking elements 185, 190, 195, 200, one is an EM field receiver and the remainder are EM field generators. The EM field receiver may include an array having at least one coil or at least one coil pair and electronics for digitizing magnetic field measurements detected by the receiver array. It should, however, be understood that according to alternate embodiments, the number and combination of EM field receivers and EM field generators can vary.
The field measurements generated or tracked by the tracking elements 185, 190, 195, 200 can be used to calculate the position and orientation of one another and of attached instruments (e.g., catheters 105 or 184).
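As a generic sketch (not the specific algorithm of the tracking system 125), the example below assumes each tracked pose is expressed as a 4x4 homogeneous transform and shows how a fixed calibration offset can be composed with a tracked sensor pose to locate a catheter tip in a patient-attached reference frame; all numeric values are assumed for illustration.

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    t = np.eye(4)
    t[:3, :3] = rotation
    t[:3, 3] = translation
    return t

# Assumed example values (identity rotations, offsets in mm); illustrative only.
T_sensor_to_patient = make_pose(np.eye(3), [100.0, 20.0, -30.0])  # pose reported by the EM system
T_tip_to_sensor = make_pose(np.eye(3), [0.0, 0.0, 15.0])          # fixed tip offset from calibration

# Compose to express the catheter tip in the patient-attached reference frame.
tip_in_patient = (T_sensor_to_patient @ T_tip_to_sensor) @ np.array([0.0, 0.0, 0.0, 1.0])
print(tip_in_patient[:3])   # -> [100.  20. -15.]
```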
Alternatively, the tracking elements 185, 190, or 200 can include a plurality of coils (e.g., Helmholtz coils) operable to generate a magnetic gradient field to be detected by the receiver 195 of the tracking system 125 and which defines an orientation of the ICE catheter 105. An embodiment of the receiver 195 includes at least one conductive loop operable to generate an electric signal indicative of spatial relation and orientation relative to the magnetic field generated by the tracking elements 185, 190, and 200.
An embodiment of the controller or workstation computer 134 is generally connected in communication with, and controls, the image acquisition system 115 (e.g., the ICE imaging system 140 or supplemental imaging system 142), the steering system 120, the tracking system 125, the ablation system 130, and the electrophysiology system 132, so as to enable each to operate in synchronization with one another and to enable the data acquired therefrom to produce or generate a full-view 3D or 4D ICE model of the imaged anatomy.
An embodiment of the controller 134 includes a processor 220 in communication with a memory 225. The processor 220 can be arranged independent of or integrated with the memory 225. Although the processor 220 and memory 225 are described located at the controller 134, it should be understood that the processor 220 or memory 225 or portion thereof can be located at the image acquisition system 115, the steering system 120, the tracking system 125, the ablation system 130 or the electrophysiology system 132 or combination thereof.
The processor 220 is generally operable to execute the program instructions representative of acts or steps described herein and stored in the memory 225. The processor 220 can also be capable of receiving input data or information or communicating output data. Examples of the processor 220 can include a central processing unit of a desktop computer, a microprocessor, a microcontroller, or programmable logic controller (PLC), or the like or combination thereof.
An embodiment of the memory 225 generally comprises one or more computer-readable media operable to store a plurality of computer-readable program instructions for execution by the processor 220. The memory 225 can also be operable to store data generated or received by the controller 134. By way of example, such media may comprise RAM, ROM, PROM, EPROM, EEPROM, Flash, CD-ROM, DVD, or other known computer-readable media or combinations thereof which can be used to carry or store desired program code in the form of instructions or data structures and which can be accessed by a general-purpose or special-purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a machine or remote computer, the remote computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium.
Having provided a description of the general construction of the system 100, the following is a description of a method 300.
The controller 134, via the tracking system 125, is operable to track movement of the ICE catheter 105 in accordance with known mathematical algorithms programmed as program instructions of software for execution by the processor 220 of the controller 134 or by the tracking system 125. Examples of navigation software include INSTATRAK® as manufactured by the GENERAL ELECTRIC® Corporation, NAVIVISION® as manufactured by SIEMENS®, and BRAINLAB®.
An embodiment of the registering and/or calibrating step 335 includes the step of rigidly attaching at least one dynamic reference microsensor or tracking element 185, 190, 195, or 200 at the imaged anatomy of the subject 110.
The dynamic reference microsensor 185, 190, 195, or 200 establishes a so-called world coordinate system (world reference frame—dynamic reference microsensor) (wcs) 340.
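The chain of coordinate frames can be illustrated with homogeneous transforms; in the Python sketch below, T_ice_to_scs stands for a tracked image-to-sensor-system transform and T_scs_to_wcs for the sensor-system-to-world transform, in the spirit of the T(ice.pi->scs)T(scs->wcs) notation used later in this description. All numeric values are assumed for illustration.

```python
import numpy as np

def homogeneous(rotation, translation):
    """4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    t = np.eye(4)
    t[:3, :3] = rotation
    t[:3, 3] = translation
    return t

def rot_z(angle_rad):
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Illustrative transforms only: image frame -> tracker frame -> world frame 340.
T_ice_to_scs = homogeneous(rot_z(np.deg2rad(30.0)), [5.0, 0.0, 12.0])   # catheter at one orientation
T_scs_to_wcs = homogeneous(np.eye(3), [-40.0, 25.0, 0.0])               # tracker vs. dynamic reference

T_ice_to_wcs = T_scs_to_wcs @ T_ice_to_scs
voxel_in_ice = np.array([10.0, 0.0, 30.0, 1.0])   # a voxel position in image coordinates (mm)
print((T_ice_to_wcs @ voxel_in_ice)[:3])          # the same voxel in world coordinates
```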
The embodiment of the method 300 further includes a step 345 of tracking (e.g., via the tracking system) a position or location of the at least one catheter 105 or 184 relative to the acquired image data. According to one embodiment of the method 300, at least one instrument catheter 105 or 184 is integrated with a plurality of hybrid electromagnetic position microsensors 185, 190, 195, 200 and ultrasonic markers 202. The electromagnetic microsensors 185, 190, 195, 200 and ultrasonic markers 202 can both be located and rigidly mounted on the at least one instrument catheter 105 or 184. A computer image-processing program is operable to detect and mark positions of the ultrasonic markers 202 relative to the generated 3D or 4D ICE image model.
The controller 134 can be generally operable to align positions of the ultrasonic markers 202 with the tracking coordinate reference frame or coordinate system 325. This registration information may be used for the alignment (calibration) between the tracking reference frame or coordinate system 325 and the ultrasonic marker reference frame or coordinate system 332 relative to the ICE imaging reference frame or coordinate system 320. This information may also be used for detecting the presence of electromagnetic distortion or tracking inaccuracy.
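One common way to compute such an alignment between corresponding point sets, for example ultrasonic marker positions detected in the image versus the same markers reported by the tracking system, is a least-squares rigid fit. The Kabsch-style sketch below is a generic illustration rather than the procedure necessarily used by the system 100, and the residual after fitting is shown as a rough indicator of distortion or tracking inaccuracy.

```python
import numpy as np

def rigid_fit(points_a, points_b):
    """Least-squares rotation R and translation t mapping points_a onto points_b
    (Kabsch algorithm). Both inputs are (N, 3) arrays of corresponding points."""
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    h = (points_a - ca).T @ (points_b - cb)
    u, _s, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cb - r @ ca
    return r, t

# Illustrative marker coordinates (mm): image-frame positions and tracker-frame positions.
markers_image = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
markers_tracker = (markers_image @ np.array([[0.0, -1.0, 0.0],
                                             [1.0,  0.0, 0.0],
                                             [0.0,  0.0, 1.0]]).T) + np.array([5.0, -3.0, 2.0])

r, t = rigid_fit(markers_image, markers_tracker)
residual = np.linalg.norm(markers_image @ r.T + t - markers_tracker, axis=1).mean()
print(residual)  # near zero here; a large value would suggest distortion or misregistration
```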
An embodiment of the method 300 further includes a step 355 of acquiring image data (e.g., a scan) of the anatomy of interest of the imaged subject 110. An embodiment of the step of acquiring image data includes generating a series of partial views 358 of 3D or 4D image data from real-time image data acquired while rotating the ICE catheter 105 around the longitudinal axis 180. Image acquisition may also involve maneuvers that deflect, advance, or retract the catheter, in addition to simple rotation.
An embodiment of the image acquisition step 355 includes calculating positions or degree of rotation of the ICE catheter 105 about the longitudinal axis 180. The image acquisition step 355 can further include synchronizing or gating a sequence of image acquisition relative to tracking data acquired by the hybrid tracking system 125 (e.g., tracking a location (e.g., position and/or orientation) relative to the acquired image data). In addition, the image acquisition step 355 can further include synchronizing or gating a sequence of image acquisition relative to measuring cardiac and respiratory signals by the electrophysiology system 132.
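A toy illustration of such gating is given below; the normalized phase signals, the tolerance, and the trigger logic are assumptions for the sketch and not the electrophysiology system's 132 actual interface.

```python
import numpy as np

def gated_trigger(cardiac_phase, respiratory_phase, target_cardiac, target_resp, tol=0.02):
    """Return True when both normalized phases (0..1) are within tolerance of
    their preset targets, i.e. when an image acquisition should be triggered."""
    def close(phase, target):
        return min(abs(phase - target), 1.0 - abs(phase - target)) < tol  # wrap-around distance
    return close(cardiac_phase, target_cardiac) and close(respiratory_phase, target_resp)

# Simulated monitoring loop: trigger acquisitions near cardiac phase 0.9 at end-expiration (0.0).
times = np.arange(0.0, 5.0, 0.01)             # seconds
cardiac = (times / 0.8) % 1.0                 # ~75 bpm cardiac phase (assumed)
resp = (times / 4.0) % 1.0                    # ~15 breaths/min respiratory phase (assumed)
triggers = [t for t, c, r in zip(times, cardiac, resp) if gated_trigger(c, r, 0.9, 0.0)]
print(len(triggers), triggers[:3])
```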
According to one embodiment of the image acquisition step 355, the ablation catheter 184 can be detected or is visible in the acquired image data by the ICE imaging system 140. By “scribbling” the anatomical surface of the anatomy of interest with the at least one instrument catheter 184 relative to acquired tracking data of the location of the catheters 105 or 184, the anatomical boundary may be enhanced to result in a more accurate surface model for image registration and surgical planning.
An embodiment of the method 300 can further include a step 370 of acquiring or measuring location data of the ultrasonic markers 202 described above by detecting or identifying the voxels illustrative thereof in the acquired, real-time 3D or 4D ultrasound image data via the ICE imaging system 140. An embodiment of the ultrasonic markers 202 can be configured to identify each of a series of tools or catheters 105 or 184 delivered through an imaged subject 110. An embodiment of the pattern of the tracking elements 185, 190, 195, 200 and/or ultrasonic markers 202 may be uniquely defined for different types of instrument catheters 105 or 184 for identification purposes. Dependent on the uniquely defined tracking elements 185, 190, 195, 200 and/or ultrasonic markers 202, the image acquisition system 115, tracking system 125 or controller 134 or combination thereof can be operable to uniquely identify location and orientation of each of the tools or catheters 105 and 184. An embodiment of the system 100 is operable to extract the location of voxels from the acquired image data correlated to the imaging of the ultrasonic markers 202. In this way, the location of the ultrasonic markers 202 may be tracked with respect to the ICE catheter 105 or ablation catheter 184, or vice versa.
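A minimal sketch of locating bright marker voxels in a 3D volume follows; the synthetic data, the fixed intensity threshold, and the use of scipy.ndimage connected-component labeling are illustrative assumptions rather than the system's actual marker-detection program.

```python
import numpy as np
from scipy import ndimage

def find_marker_centroids(volume, threshold):
    """Return centroids (in voxel coordinates) of connected bright regions,
    treated here as candidate ultrasonic marker locations."""
    mask = volume > threshold
    labels, count = ndimage.label(mask)
    return ndimage.center_of_mass(volume, labels, list(range(1, count + 1)))

# Synthetic 64^3 volume with two bright "markers" embedded in background speckle.
rng = np.random.default_rng(0)
vol = rng.random((64, 64, 64)) * 0.2
vol[10:13, 20:23, 30:33] = 1.0
vol[40:42, 50:52, 12:14] = 1.0
print(find_marker_centroids(vol, threshold=0.5))
```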
An embodiment of the system 100 includes software having image processing programs operable to extract the locations of the ultrasonic markers 202 from the acquired generally real-time, 3D or 4D ultrasound image data (e.g., partial views 358), an electromagnetic distortion detection program using information from the 3D or 4D ultrasound image data, and a tracking program with instructions to execute steps of the hybrid tracking technique described above. According to one embodiment, the system 100 processes acquired 3D or 4D ICE image data to extract voxel positions of the ultrasonic markers 202 relative to the ablation catheter 184. The system 100 also processes the acquired 3D or 4D ultrasound image data to generate a surface model of the imaged anatomy. The system 100 is also operable to calculate the vector 181 generally representative of a central direction of a field of view of the ICE imaging system 140.
According to one embodiment, the system 100 automatically conducts a 4D scan of the anatomy of interest of the imaged subject 110. The controller 134 can calculate or estimate a number of the ICE scans needed to generate the full-view 4D model reconstruction. Based on the field of view (FOV) of the ultrasound transducer array 150 and a tracked starting position of the ICE catheter 105, the system 100 is operable to calculate a set of orientations (e.g., T(mcs.p1->scs)T(scs->wcs), T(mcs.p2->scs)T(scs->wcs), . . . , and T(mcs.pn->scs)T(scs->wcs), where p1, p2, and pn are different catheter orientations) of the ultrasound imaging plane 181 to conduct a full-view 4D scan in the dynamic reference sensor frame 340. The controller 134 can also communicate signals representative of instructions to the steering system 120 that direct automatic maneuvering and rotating of the ICE catheter 105 to a series of imaging positions, e.g., T(mcs.p1->scs)T(scs->wcs), T(mcs.p2->scs)T(scs->wcs), . . . , and T(mcs.pn->scs)T(scs->wcs).
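A simple way such a scan plan could be estimated is sketched below; the field-of-view value, the overlap factor, and the representation of each orientation pi as a rotation angle about the catheter axis are assumptions made for the example.

```python
import math

def plan_rotational_scan(fov_deg, overlap=0.2, start_deg=0.0):
    """Estimate how many rotational acquisitions are needed to cover 360 degrees
    with a transducer field of view of fov_deg, allowing fractional overlap
    between neighboring partial views, and return the planned plane angles."""
    effective = fov_deg * (1.0 - overlap)           # usable angular coverage per view
    n_scans = math.ceil(360.0 / effective)
    step = 360.0 / n_scans
    return [(start_deg + i * step) % 360.0 for i in range(n_scans)]

angles = plan_rotational_scan(fov_deg=60.0, overlap=0.2)
print(len(angles), angles)   # e.g., 8 positions p1..pn, each an imaging-plane orientation
```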
According to another embodiment, the ICE catheter 105 of the ICE imaging system 140 executes the full-view 3D or 4D ICE scan of the imaged anatomy according to received input instructions directed to manually drive the ICE catheter 105 into a series of imaging positions, as well as received input instructions directed to manually activate each event of image acquisition.
The 3D or 4D ICE image and catheter position acquisitions can be triggered at the preset cardiac and respiratory phases, e.g., t1, t2, . . . , tn. At a given catheter orientation (pi), the system 100 is operable to acquire and transform a series of ultrasound images relative to the world coordinate frame 340, represented by [T(ice.pi->scs)T(scs->wcs)].t1, [T(ice.pi->scs)T(scs->wcs)].t2, . . . , and [T(ice.pi->scs)T(scs->wcs)].tn.
Alternatively, the 3D or 4D ICE image acquisition may be conducted at a dynamic or variable rate optimized according to the imaged volume, desired ultrasound image quality, etc. With each acquired ultrasound image volume (or plane or beam), the system 100 records a current cardiac and respiratory phase (ti), and the current catheter or image position (pi).
Upon the completion of the full-view 3D or 4D scan, the system 100 can reconstruct the generated series of partial views 358 of 3D or 4D ultrasound image data acquired at different catheter orientations and different cardiac cycle times or phases. By transforming or registering the partial views 358 of the acquired 3D or 4D ICE image data relative to the world coordinate frame 340, the system 100 can merge the partial views 358 into a full-view 3D or 4D ICE model 362 of the imaged anatomy.
To generate the full-view 3D or 4D ICE model 362, an embodiment of the system 100 can group the partial views 358 of 3D or 4D ultrasound image data according to the cardiac timing sequence, e.g., [T(ice.p1->scs)T(scs->wcs)].t1, [T(ice.p2->scs)T(scs->wcs)].t1, . . . , and [T(ice.pn->scs)T(scs->wcs)].t1 at cardiac phase t1. A number of image processing techniques, such as image smoothing, filtering, or averaging, can be used to merge the series of partial views 358 into a full-view 3D or 4D ICE model 362, [T(ice.3D->wcs)].t1, for the t1 cardiac phase or respiratory phase.
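As a minimal illustration of this phase-sorted merge, the sketch below groups already co-registered partial volumes by cardiac phase index and averages overlapping voxels; the tiny synthetic grid, the weight masks, and the use of plain averaging are assumptions for the example.

```python
import numpy as np
from collections import defaultdict

def merge_by_phase(partial_views):
    """partial_views: iterable of (phase_index, volume, weight_mask) tuples, all
    already resampled onto the world (wcs) grid. Returns {phase: merged volume},
    averaging each voxel over the partial views that actually observed it."""
    sums, weights = defaultdict(float), defaultdict(float)
    for phase, volume, mask in partial_views:
        sums[phase] = sums[phase] + volume * mask
        weights[phase] = weights[phase] + mask
    return {ph: np.divide(sums[ph], weights[ph], out=np.zeros_like(sums[ph]),
                          where=weights[ph] > 0) for ph in sums}

# Two overlapping partial views at cardiac phase t1 (index 0) on a tiny 4x4x4 grid.
grid = np.ones((4, 4, 4))
mask_a = np.zeros_like(grid); mask_a[:, :, :3] = 1.0
mask_b = np.zeros_like(grid); mask_b[:, :, 2:] = 1.0
models = merge_by_phase([(0, grid * 2.0, mask_a), (0, grid * 4.0, mask_b)])
print(models[0][0, 0, :])   # averaged where the views overlap, single-view elsewhere
```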
The controller 134 is operable to repeat the above-described image reconstruction process to create a full-view 3D or 4D ICE model of the anatomic structure, denoted as [T(ice.3D->wcs)].t1, [T(ice.3D->wcs)].t2, . . . , and [T(ice.3D->wcs)].tn, for the rest of the cardiac phases or respiratory phases.
According to one embodiment of the system 100 and method 300 described herein, the controller 134 can control operation of the steering system 120, the tracking system 125, the ablation system 130, the electrophysiology monitoring system 132, the ICE imaging system 140, and/or any supplemental imaging system 142. Via the controller 134, the system 100 is operable to process the acquired image data relative to the acquired real-time tracking information from the hybrid tracking system 125 and the cardiac and respiratory cycle information from the electrophysiology system 132. The system 100 is further operable to generate a full-view 3D or 4D ICE model of the imaged anatomy, register the acquired partial views 358 of the real-time 3D or 4D ICE image data with the generated full-view 3D or 4D model 362 or other pre-operative or intra-operative real-time non-ICE images 375 (e.g., MRI, CT, PET, etc.), and control the steering system 120 in maneuvering the ICE catheter 105 or ablation catheter 184 relative to the direction of the 3D or 4D ICE imaging plane 181 (or vice versa).
A technical effect of the embodiments of the system 100 and method 300 described above is to provide an image reconstruction algorithm that provides a full-view 4D image model of an anatomic structure, fast registration of the acquired partial views 358 of the 3D or 4D ICE image data relative to other preoperative and intraoperative images 375, and a capability to create a surgical plan that comprises graphic representations of historical locations, current locations, and future locations of image acquisition 372, 373, 374.
Another technical effect of the above-described system 100 and method 300 is an ability to register the 3D or 4D ICE imaging system 140 with the tracking system 125 or another type of supplemental imaging system 142 via execution of computer-readable program instructions stored and executable at the controller 134. As described above, the controller 134 is operable to perform registration of the coordinate systems 320, 325, 330, 332, 340 relative to one another.
Another technical effect of the system 100 and method 300 described above is an ability to combine image data and models generated by the ICE imaging system 140 with a location of the ICE catheter 105 or ablation catheter 184 being tracked by tracking system 125, all in combination with imaged data or models generated by another imaging system 142, with an ability to compensate for deficiencies in the imaged data acquired with the ICE imaging system 140. Accordingly, the system 100 and method 300 enhance tracking and guidance of the position and orientation of the catheter 105 or transducer array 150 navigating through the imaged subject 110. The system 100 and method 300 also synchronize tracking and guidance of movement and orientation of the ICE catheter 105 or ablation catheter 184 associated with the ablation system 130, with each other as well as with electrophysiological signals (e.g., respiratory cycle, cardiac cycle, etc.) as tracked by the electrophysiological system(s) 132.
Technical effects of integrating the 4D ICE imaging system 140 with the tracking system 125 include, inter alia, enhancement of the field of view of the 4D ICE imaging catheter 105, acceleration of the 4D ICE registration process with other pre-operative and intra-operative images, and enhancement of pre-operative surgical planning and intra-operative instrument catheter guidance.
Embodiments of the subject matter described herein include method steps which can be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of computer program code for executing steps of the methods disclosed herein. The particular sequence of such computer- or processor-executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
Embodiments of the subject matter described herein may be practiced in a networked environment using logical connections to one or more remote computers having processors. Logical connections may include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet and may use a wide variety of different communication protocols. Those skilled in the art will appreciate that such network computing environments will typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments of the subject matter described herein may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
This written description uses examples to disclose the subject matter, including the best mode, and also to enable any person skilled in the art to make and use the subject matter described herein. Accordingly, the foregoing description has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the subject matter described herein. The patentable scope of the subject matter is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Claims
1. A system operable to display a four-dimensional (4D) model of an imaged anatomy, comprising:
- a controller;
- an imaging system including an imaging probe in communication with the controller, the imaging probe operable to acquire generally real-time, three-dimensional (3D) image data relative to a direction of image acquisition along an imaging plane; and
- a tracking system in communication with the controller, the tracking system including at least one tracking element integrated with the imaging probe,
wherein the system is operable to process the real-time, 3D image data acquired by the imaging probe relative to generally real-time tracking information acquired by the tracking system so as to display the 4D model of the imaged anatomy.
2. The system of claim 1, wherein the imaging probe includes a 4D imaging catheter operable to acquire a series of partial view, generally real-time, 3D ultrasound image data of the imaged anatomy, wherein the imaging system is further operable to generate the 4D model of the imaged anatomy from the acquired 3D ultrasound image data, and wherein the system registers newly acquired real-time, 3D ultrasonic image data so as to display in superposition relative to the generated 4D model.
3. The system of claim 2, further comprising:
- an electrophysiology system in communication with the controller, the electrophysiology system operable to acquire cardiac cycle and respiratory cycle information in synchronization with a time of acquisition of image data by the imaging system; and
- a display that simultaneously includes illustration of: the newly acquired, generally real-time, 3D ultrasound image data superimposed relative to the 4D model, the position of the 4D imaging catheter, and cardiac and respiratory cycle information illustrated at a time of acquisition in synchronization relative to the time of acquisition of the newly acquired, generally real-time, 3D ultrasonic image data.
4. The system of claim 3, wherein the tracking system includes a dynamic reference comprising a tracking element located at the imaged anatomy.
5. The system of claim 4, wherein a spatial relation and an orientation of the generally real-time, 3D ultrasound image data is defined by an image coordinate system referenced in predetermined spatial relation and orientation relative to the 4D imaging catheter, wherein the spatial relation of the at least one tracking element integrated at the 4D imaging catheter relative to the dynamic reference is defined by a tracking coordinate system.
6. The system of claim 5, wherein the generally real-time 3D ultrasound image data acquired by the 4D imaging catheter is comprised of a series of partial-view 3D ultrasound image data acquired while rotating the imaging catheter about a longitudinal axis that extends through a center of the 4D imaging catheter.
7. The system of claim 5, further comprising an ablation system having an ablation catheter in communication with the controller, the ablation catheter and the 4D imaging catheter both maneuvered by a steering system, wherein maneuvering of the 4D imaging catheter and the ablation catheter by the steering system is defined relative to a mechanical coordinate system registered in relation to the image coordinate system and the tracking coordinate system.
8. The system of claim 3, wherein the controller calculates a degree of rotation of a motor to drive movement of the 4D imaging catheter about the longitudinal axis in synchronization relative to time of image acquisition of 3D ultrasound image data, as well as relative to tracking data acquired by the tracking system and acquisition of the cardiac and respiratory cycle information acquired by the electrophysiology system.
9. The system of claim 3, wherein the controller merges the acquired, generally real-time partial 3D views of image data according to a sequence of the cardiac or respiratory cycle information to generate the 4D model.
10. The system of claim 2, wherein the controller is operable to calculate a number of image acquisition scans performed by the 4D imaging catheter about the longitudinal axis estimated to generate the 4D model of the imaged anatomy.
11. A method of image acquisition of an imaged anatomy, the method comprising the steps of:
- acquiring a series of partial view 3D image data with a 4D imaging probe, defined by an image coordinate system and a time reference;
- tracking a position of the 4D imaging probe relative to the time reference and a tracking coordinate system;
- generating a 4D model of the imaged anatomy by merging the series of partial view 3D image data defined by the position of the 4D imaging probe relative to the tracking coordinate system and relative to the time reference; and
- displaying the 4D model.
12. The method of claim 11, further comprising the step of:
- steering movement of the imaging probe and an ablation catheter relative to the tracking data acquired by the tracking system and relative to the 4D model synchronized in generally real-time relative to the time reference.
13. The method of claim 12, wherein the step of displaying includes superposing the 4D model with one of a pre-operative or intra-operative image data not acquired with the 4D imaging probe.
14. The method of claim 11, wherein the imaging probe includes a 4D imaging catheter operable to acquire a series of generally real-time, partial view, 3D ultrasound image data, wherein the step of displaying includes simultaneously illustrating the acquired generally real-time, 3D ultrasound image data superimposed relative to the 4D model and the position of the 4D imaging catheter.
15. The method of claim 14, further comprising the step of registering the acquired series of partial-view 3D ultrasonic image data, the 4D model, and the tracked position of the 4D imaging catheter relative to a dynamic reference sensor located at the imaged anatomy.
16. The method of claim 14, wherein the step of acquiring the series of generally real-time, partial-view 3D ultrasound image data includes rotating the 4D imaging catheter about a longitudinal axis that extends through a center of the 4D imaging catheter.
17. The method of claim 16, wherein the acquiring step includes calculating a degree of rotation for a motor to drive movement of the 4D imaging catheter about the longitudinal axis.
18. The method of claim 17, the method further comprising the step of calculating a number of image acquisition scans performed while rotating the 4D imaging catheter about the longitudinal axis so as to generate the 4D model of the imaged anatomy.
Type: Application
Filed: Apr 17, 2013
Publication Date: Sep 5, 2013
Applicant: GENERAL ELECTRIC COMPANY (SCHENECTADY, NY)
Inventors: DUN ALEX LI (SALEM, NH), CHRISTOPHER A. NAFIS (REXFORD, NY), DOUGLAS G. WILDES (BALLSTON LAKE, NY), VERNON T. JENSEN (DRAPER, UT)
Application Number: 13/864,482
International Classification: A61B 8/00 (20060101);