TISSUE IMAGING SYSTEM AND METHOD FOR TISSUE IMAGING
The tissue imaging system incorporates integrated X-ray and ultrasound imaging subsystems. The X-ray subsystem images the extracted tissue sample (specimen) and generates a 3D volume of it. The ultrasound subsystem images the body area of interest of the patient from which the tissue sample is to be extracted. On the display of the system, the physician draws contours on the area of interest, which are displayed as a 3D volume. A 3D volume of the extracted tissue is generated using the X-ray imaging subsystem. Quantitative analysis is performed on the two 3D volumes (the X-ray volume of the extracted tumor and the contoured ultrasound volume) to determine the difference between the contoured and surgically extracted specimens, assisting the physician in determining whether further surgical intervention is required.
This application claims priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 62/490,021, filed on Apr. 25, 2017, hereby incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
The invention generally relates to the field of medical devices, and more particularly to tissue imaging devices, systems and methods utilizing both ultrasound imaging and X-ray imaging in the removal of cancerous, necrotic or other diseased tissue from a body, such as a human, animal or reptilian body, and generating three-dimensional (3D) ultrasound images and 3D X-ray images of tissue prior to and after its removal from the body to determine whether the desired amount or volume of the cancerous, necrotic or other diseased tissue has been removed from the body.
BACKGROUND
Imaging devices, such as those using ultrasound imaging or X-ray imaging, are commonly used in diagnosing and treating various medical issues, such as injury to bones and detection of cancerous tissue in a body. One such use, for example, is the use of imaging-type diagnostic tools in the detection of breast or prostate cancer. Clearly, early detection of cancerous tissue and accurate removal of such diseased tissue from a body can increase the likelihood of preventing the cancer from spreading or metastasizing throughout the body, and can minimize the need for additional surgical procedures involving excision of cancerous, necrotic or other diseased tissue.
The excising of cancerous, necrotic or other diseased tissue can, in some instances, be relatively complex, particularly in assuring that the appropriate amount or volume of cancerous, necrotic or other diseased tissue, which typically also includes an amount of adjacent healthy tissue, is removed as an added precaution to minimize the recurrence or spreading in the body of the diseased tissue. Pre-surgical diagnostics can employ X-ray imaging, for example, to determine the size, volume and location in the body of the cancerous, necrotic or other diseased tissue. The location of the tissue to be removed is appropriately marked, such as by a suitable marker or guidewire. Once the diseased tissue is removed from the body, X-ray imaging, for example, typically can be employed to determine whether the outer margins of the removed tissue are free of cancer, necrosis or other disease. If analysis of the diagnostic imaging of the removed tissue determines that additional tissue needs to be removed, the above-described procedure of tissue removal and diagnostic examination of the removed tissue is repeated, as necessary, until it is determined that the amount and volume of removed tissue is correct.
X-ray imaging of removed tissue typically provides two-dimensional (2D) images, with the removed tissue being rotated to obtain differing images for analysis. However, traditional 2D X-ray imaging can be disadvantageous in that it typically involves a relatively long analysis time and can increase the likelihood of inaccurate measurements, such as from movement of the tissue sample during imaging, particularly since the analysis is typically performed under time constraints while the person or animal is undergoing the surgical removal procedure. Another tissue analysis technique is digital tomosynthesis, which utilizes digital image capture and processing with detector motion to provide images similar to conventional tomography; the images typically have a limited depth of field, but the technique can save time for sample analysis in that image slices at various depths and thicknesses can be provided.
Considering the advantages and limitations of X-ray and digital tomosynthesis systems, a cabinet specimen tomosynthesis system has been proposed for specimen imaging in U.S. Pat. No. 9,138,193 to Lowe et al., issued on Sep. 22, 2015, entitled “Specimen Radiography With Tomosynthesis In A Cabinet” (the “'193 Patent”), incorporated by reference herein in its entirety. The '193 Patent discloses a method and system utilizing X-ray imaging and digital tomosynthesis to produce tomosynthesis images of a breast specimen in a cabinet x-ray system.
In the '193 Patent system, it is disclosed that an X-ray source delivers X-rays through a specimen of excised tissue and forms an image at a digital X-ray detector. Multiple X-ray images are taken as the X-ray source moves relative to the stationary breast specimen. It is disclosed that, desirably, the X-ray source moves in a range from about 350 degrees to and including about 10 degrees. The X-ray source can travel substantially along a path that generally defines an arc, or linearly, while the digital X-ray detector remains stationary throughout and the source remains substantially equidistant from the specimen platform. The '193 Patent further discloses that the set of X-ray image data taken at the different points are combined to form a tomosynthesis image that can be viewed in different formats, alone or as an adjunct to conventional specimen radiography. The '193 Patent further discloses reconstructing three-dimensional tomosynthetic x-ray images from two-dimensional projection x-ray images in real-time and on-demand.
While the cabinet specimen tomosynthesis system disclosed in the '193 Patent appears to offer advantages in specimen analysis, the disclosed specimen analysis appears to rely primarily on human judgment in reading the images of the removed tissue to determine whether an appropriate amount of diseased tissue and adjacent non-diseased marginal tissue has been removed. Since the '193 Patent system appears to rely on human judgment in determining the sufficiency of tissue removed, and that judgment appears to be based only on the reconstructed images of the excised tissue, the likelihood of error can still be affected by the human reviewer and/or the quality or accuracy of the reconstructed image of the excised tissue. Further, it appears the '193 Patent system does not provide a system and method for an image comparison of the excised tissue with that of the tissue to be removed as present in the body.
Further, ultrasound imaging has been employed in various medical applications to provide ultrasound images of various body portions and bodily tissue as, for example, in U.S. Pat. No. 6,961,405 to Scherch issued on Nov. 1, 2005 entitled “Method And Apparatus For Target Position Verification” (the “'405 Patent”), incorporated by reference herein in its entirety.
In the '405 Patent, a system and method is disclosed for aligning the position of a target within a body of a patient to a predetermined position used in the development of a radiation treatment plan. The apparatus includes an ultrasound probe used for generating live ultrasound images, a position sensing system for indicating the position of the ultrasound probe with respect to the radiation therapy device, and a computer system. The '405 Patent apparatus discloses an ultrasound probe that generates two-dimensional ultrasound images of the portion of a patient's body containing the target while the patient is on a treatment table. The computer system is used to display the live ultrasound images of a target in association with representations of the radiation treatment plan, to align the displayed representations of the radiation treatment plan with the displayed live ultrasound images, to capture and store at least two two-dimensional ultrasound images of the target overlaid with the aligned representations of the treatment plan data, and to determine the difference between the location of the target in the ultrasound images and the location of the target in the representations of the radiation treatment plan.
Thus, the '405 Patent system advantageously provides two-dimensional ultrasound images of the portion of a patient's body to align the displayed representations of the radiation treatment plan with the displayed live ultrasound images, so as to determine the difference between the location of the target in the ultrasound images and the location of the target in the representations of the radiation treatment plan. However, it appears the '405 Patent system and method likewise does not specifically disclose a tissue imaging system and method for an image comparison of excised tissue from a body with that of the tissue to be removed as present in the body.
Also, various systems have been employed to track the position and orientation of an object in medical applications. In this regard, knowledge of the position of a surgical tool during neurosurgery, or of the location of a target such as a tumor while radiation therapy treatment is occurring, has always been an important consideration. The position of an object or tool is typically defined by three translation parameters (x, y, z) and three rotation parameters (pitch, roll, yaw) corresponding to six degrees of freedom. The translation parameters (x, y, z) indicate three-dimensional position, e.g. forward and back (y-axis), left and right (x-axis), up and down (z-axis), and the three rotation parameters (pitch, roll, yaw) indicate orientation of the tool or object, e.g. rotation about the x-axis (pitch), rotation about the y-axis (roll), and rotation about the z-axis (yaw).
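The six degrees of freedom described above can be sketched as a homogeneous transform. The following is a minimal illustration only; the function name and the particular rotation-composition order are assumptions for the example, not part of the original disclosure:

```python
import numpy as np

def pose_matrix(x, y, z, pitch, roll, yaw):
    """Build a 4x4 homogeneous transform from six degrees of freedom:
    translation (x, y, z) and rotation about the x-axis (pitch),
    y-axis (roll), and z-axis (yaw), composed here as Rz @ Ry @ Rx."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
    ry = np.array([[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]])   # roll
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx   # orientation
    t[:3, 3] = [x, y, z]       # position
    return t
```

A zero rotation with zero translation yields the identity transform, which is a convenient sanity check for the chosen convention.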
Various systems are known for determining the spatial position and orientation of an object. One such system includes use of a mechanical arm to track the location of a medical tool or probe which can be used to further determine the location of a target. In order to locate the target, the tool or probe can be affixed to the mechanical arm having a known reference position. A computer system tracks the tool or probe while an operator repositions the tool or probe along with the mechanical arm. The geometry of the mechanical arm is known such that movement of the tool or probe in conjunction with the mechanical arm provides the computer system continuous position information regarding the tool or probe. In an invasive procedure, the tool or probe can have a fixed length. Thus, contacting the target with the end of the tool can provide a position location of the target. Ultrasound has also been employed to track the position and orientation of an object in medical applications. In a noninvasive procedure, a probe, such as an ultrasound device, can be used to locate both the position and the orientation of the target.
Also, both active and passive emission techniques are known which operate by projecting a geometric representation or extension of the object or tool formed by the emitters onto the field of view of a pair of spaced sensors. Various implementations of sensors have been used, one being the use of two cameras positioned spaced apart a known distance and angled in the general direction of the object or tool such that the three-dimensional position of the object or tool can be obtained by triangulation from the positions of the emitters. For example, a camera or opto-electrical motion measurement system, known as the Polaris®, by Northern Digital Inc., Ontario, Canada, has been used for triangulating the position of optically trackable tools.
Specifically, a computer system, using mathematical processing, can determine the three dimensional coordinates of each one of the emitters associated with the object or tool. The position of each of the emitters can be used to determine the position of the object or tool relative to a three dimensional coordinate system centered at a preselected point in space, typically at a point fixed relative to the sensors. The positional relationship to each other of each of the emitters associated with the object or tool can be utilized to further determine the orientation in space of the object or tool. Generally, at least three of the emitters must be detected and must be unobscured by any adjacent emitters. Additionally, the sensors generally require the emitters to be a minimum distance, for example, 3-5 cm apart. In theory, such systems can provide three unobstructed emitters for most of a sphere created by the six degrees of freedom. One of the more modern types of passive emission system utilizes passive retro-reflectors which can be affixed to the object or tool and which reflect directly back to a pair of active emitter arrays adjacent a pair of optical sensors. This type of system allows the optical sensors to be positioned relatively close together.
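The triangulation from a pair of spaced sensors described above can be sketched as a generic two-ray intersection: each camera contributes a ray toward the emitter, and the emitter's 3D position is taken as the midpoint of the shortest segment joining the rays. This is a standard least-squares construction under assumed ideal geometry, not the specific algorithm of any cited system, and all names are hypothetical:

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Return the 3D point closest to two sensor rays, each given as a
    camera origin and a direction toward the emitter. The midpoint of
    the shortest segment between the rays approximates the emitter."""
    oa = np.asarray(origin_a, float); da = np.asarray(dir_a, float)
    ob = np.asarray(origin_b, float); db = np.asarray(dir_b, float)
    # Solve for ray parameters s, t minimizing |oa + s*da - (ob + t*db)|^2
    a = np.array([[da @ da, -(da @ db)],
                  [da @ db, -(db @ db)]])
    b = np.array([(ob - oa) @ da, (ob - oa) @ db])
    s, t = np.linalg.solve(a, b)
    return 0.5 * ((oa + s * da) + (ob + t * db))
```

For two cameras at (-1, 0, 0) and (1, 0, 0) both sighting a point on the midline, the recovered position lies where the rays cross; with noisy real measurements the rays are skew and the midpoint is the least-squares estimate.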
For example, a system to track the position and orientation of an object in medical applications is described in U.S. Pat. No. 7,289,227 to Smetak et al., issued on Oct. 30, 2007, entitled “System And Tracker For Tracking An Object, and Related Methods” (the “'227 Patent”), incorporated by reference herein in its entirety.
The '227 Patent discloses a system, and associated methods, to track a three-dimensional position and an orientation of a movable object. The system includes a tracker having an optically trackable body adapted to connect to the movable object. A plurality of optical indicators are connected or mounted to the optically trackable body to form a plurality of geometric figures. A plurality of obfuscating flanges optically separate the optical indicators from each other to prevent each of the optical indicators from becoming optically coincident with another optical indicator when viewed along a preselected viewing path. The system also includes an apparatus to track the tracker having an optical detector to simultaneously detect the three-dimensional position of at least three of the plurality of optical indicators and a determiner to determine the three-dimensional position and orientation of the optically trackable body from the position of the optical indicators.
However, it appears the '227 patent system and method, as well as the above referred to systems that have been employed to track the position and orientation of an object, such as in medical applications, likewise do not specifically disclose a tissue imaging system and method for an image comparison of excised tissue from a body with that of the tissue to be removed as present in the body.
It is believed that there is a need for a tissue imaging device, system and method that can enhance the accuracy and efficiency, as well as assist in reducing the possibility of human error, in determining whether an appropriate amount of cancerous, necrotic or other diseased tissue has been removed from a body, such as of a person, animal or reptile. Also, in view of the foregoing, there is believed to be a need for a compact and versatile tissue imaging system that can readily provide assistance, such as to a human or veterinary surgeon or doctor, in real time, such as during a surgical procedure for tissue removal, to enhance the confidence level that a correct amount of the cancerous, necrotic or other diseased tissue has been removed from a body.
Further, there is believed to be a need for, and it would be advantageous to have, an integrated cabinet-type tissue imaging system for specimen imaging that generates 3D images of the removed and to-be-removed tissue and compares those generated 3D images, to facilitate accurately indicating, such as to a surgeon in an operating room, that the cancerous, necrotic or other diseased tissue, as well as an appropriate margin of healthy tissue around it, has been excised in an expedient manner, enhancing the probability of successful removal while reducing the likelihood of human error.
Further, a tissue imaging device, system and method is believed to be needed that utilizes and combines the benefits of ultrasound imaging of bodily tissue and of X-ray imaging of specimens in the removal of cancerous, necrotic or other diseased tissue from a body, such as a human, animal or reptilian body. Such a system would provide generated three-dimensional (3D) ultrasound images of the tissue to be removed, as present in the body, for comparison with generated 3D X-ray images of the removed tissue, increasing the accuracy of the representations of the imaged tissue and facilitating an accurate and straightforward comparison in determining whether the desired amount or volume of the cancerous, necrotic or other diseased tissue has been removed from the body.
Thus, a compact, efficient, accurate and integrated tissue imaging device, system and method utilizing both ultrasound imaging and X-ray imaging in the removal of cancerous, necrotic or other diseased tissue from a body, to facilitate determining whether an appropriate or correct amount or volume of such tissue has been removed from the body, addressing the aforementioned problems or shortcomings, is desired.
SUMMARY OF INVENTION
An embodiment of a tissue imaging system includes an x-ray imaging subsystem and an ultrasound imaging subsystem. The x-ray imaging subsystem includes an x-ray tube, a digital flat panel, a specimen tray, an x-ray beam generator and controller, an x-ray angle controller, and an x-ray analysis processor and memory to store x-ray images. The ultrasound imaging subsystem includes an ultrasound probe assembly (an ultrasound probe attached to a holder with 3D infrared markers), a 3D infrared imaging camera, a probe 3D position and orientation detector, an ultrasound beamformer, and an ultrasound analysis processor and memory to store ultrasound images. The system also includes a computer having an image fusion processor and memory to perform ultrasound and x-ray 3D volume fusion and store the results of the fusion analysis. The computer also has a user interface device (monitor) to display results and control the system.
The method includes having the patient lie on the operating room (treatment) table, providing an ultrasound probe to perform ultrasound imaging of the area of interest and displaying the live ultrasound image. More specifically, as the ultrasound probe is moved and rotated, tracking the 3D position and orientation of the ultrasound probe and displaying the information on the user interface.
The method also includes the drawing of contours on the ultrasound images to mark the tissue structure to be extracted. These contours are drawn on the ultrasound images acquired at different probe orientations and positions. Advantageously, the ultrasound analysis processor in the computer generates a 3D volume of the ultrasound and the contoured tissue structure and provides measurement information that includes the height, width, length, and volume of the contoured tissue.
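The measurement information for the contoured tissue (height, width, length and volume) could, for example, be approximated from the drawn contours by summing slice areas. The following sketch assumes contours are given as polygon vertices per slice with uniform slice spacing; the function name and data layout are hypothetical illustrations, not the system's actual implementation:

```python
def contour_measurements(contours, slice_spacing):
    """Estimate width, length, height and volume of a contoured structure.
    `contours` maps slice index -> list of (x, y) vertices; the volume is
    the sum of polygon areas (shoelace formula) times the slice spacing."""
    def area(pts):
        # Shoelace formula for the area of a simple polygon.
        n = len(pts)
        s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
                for i in range(n))
        return abs(s) / 2.0

    xs = [p[0] for c in contours.values() for p in c]
    ys = [p[1] for c in contours.values() for p in c]
    width = max(xs) - min(xs)                      # in-plane extent (x)
    length = max(ys) - min(ys)                     # in-plane extent (y)
    height = len(contours) * slice_spacing         # stack extent
    volume = sum(area(c) for c in contours.values()) * slice_spacing
    return width, length, height, volume
```

Two stacked 2x2 square contours at unit spacing, for instance, give a 2x2x2 bounding extent and a volume of 8.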
The method also includes extracting the tissue structure/specimen, placing it onto the specimen tray of the x-ray subsystem, and performing x-ray imaging. More specifically, performing the x-ray imaging includes rotating the x-ray tube and the flat-panel detector around the stationary specimen in the specimen tray, acquiring multiple x-ray images, and generating a 3D volume of the specimen from the x-ray images. The x-ray analysis processor in the computer system provides measurement information that includes the height, width, length, and volume of the extracted specimen.
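Once a 3D volume of the specimen has been reconstructed, the reported measurements can be derived from a segmented voxel representation. This minimal sketch (function and parameter names are assumptions for illustration) takes the bounding-box extents and the voxel count of a binary mask:

```python
import numpy as np

def specimen_measurements(mask, voxel_size):
    """Given a binary 3D voxel mask of the reconstructed specimen and the
    voxel edge length, return (length, width, height, volume) from the
    occupied-voxel bounding box and the total occupied-voxel count."""
    idx = np.argwhere(mask)                                   # occupied voxel indices
    extents = (idx.max(axis=0) - idx.min(axis=0) + 1) * voxel_size
    volume = mask.sum() * voxel_size ** 3                     # voxel count * voxel volume
    return (*extents, volume)
```

A 3x2x1-voxel block at unit voxel size, for example, reports extents of 3, 2 and 1 and a volume of 6.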
The method also includes fusion, by the fusion processor in the computer system, of the 3D volumes of the tissue contoured on the ultrasound image and of the extracted tissue imaged via the x-ray subsystem. More specifically, one volume is overlaid over the other for visual representation of the difference, and quantitative analysis is performed by determining/calculating the difference in volume, length, width and height of the two volumes. The visualization of the 3D volumes overlaid on each other, together with the determined differences, provides quantitative feedback on the error in specimen extraction and helps prevent subjectivity in deciding on the amount of tissue to be extracted.
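The quantitative comparison of the two 3D volumes might be sketched as follows, assuming both have been aligned and rasterized as binary voxel masks on a common grid. The Dice overlap shown here is one common similarity measure added for illustration; it is not stated in the source as the system's metric, and all names are hypothetical:

```python
import numpy as np

def compare_volumes(planned, excised, voxel_size=1.0):
    """Compare the contoured (planned) 3D volume with the excised-specimen
    3D volume, both as aligned binary voxel masks on the same grid.
    Returns (volume difference, Dice overlap coefficient)."""
    v_planned = planned.sum() * voxel_size ** 3
    v_excised = excised.sum() * voxel_size ** 3
    overlap = np.logical_and(planned, excised).sum() * voxel_size ** 3
    dice = 2 * overlap / (v_planned + v_excised)   # 1.0 = identical masks
    return v_excised - v_planned, dice
```

A positive volume difference suggests more tissue was excised than contoured; a Dice value well below 1.0 flags a shape mismatch even when the total volumes agree.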
The method also includes performing biopsies under ultrasound guidance. The method enables marking of biopsy needles visualized in a 2D live ultrasound image and visualizing in a 3D ultrasound volume to provide an accurate representation of the angle and the position of the biopsy needle with respect to the structure being biopsied. In addition, the method provides the capability of visualizing all the biopsies performed on the tissue by displaying the needle paths of all biopsies in 2D and 3D ultrasound visualized on the user interface.
These and other features of the present invention will become readily apparent upon further review of the following specification and drawings.
Unless otherwise indicated, similar reference characters denote corresponding features consistently throughout the attached drawings.
Referring to the system components 300.
The controller/processor 312 and other processors of the system components 300 can be associated with, or incorporated into, any suitable type of computing device, for example, a personal computer or a programmable logic controller (PLC) or an application specific integrated circuit (ASIC). The interface display 152, the processor components of the system components 300 and the memories of the system components 300, and any associated computer readable media are in communication with one another by any suitable type of data bus or other wired or wireless communication, as is well known in the art.
Examples of computer readable media include a magnetic recording apparatus, non-transitory computer readable storage memory, an optical disk, a magneto-optical disk, and/or a semiconductor memory (for example, RAM, ROM, etc.). Examples of magnetic recording apparatus that may be used in addition to memories of the system components 300, or in place of memories of the system components 300, include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape (MT). Examples of the optical disk include a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc-Read Only Memory), and a CD-R (Recordable)/RW.
Also, the ultrasound probe 142 can include an ultrasonic sensor adapted to locate a three-dimensional position of an area of interest of a body, such as tumor tissue, with respect to a position of the ultrasound probe 142. Desirably, the ultrasound probe 142 includes an ultrasound image generator, such as typically available in a commercially available ultrasound probe, for example, the PR14 Linear Probe 10-17 MHz, Sonoscanner, Paris, France. The ultrasound probe 142 can generate two-dimensional ultrasound images of the portion of the patient P containing the area of interest within the body while the patient P, such as a person or animal, is on the treatment table 243.
The infrared 3D Imaging Camera 130 typically includes a body 132 having a wired or wireless communication with the ultrasound sub-system 340 in the main body 110 of the imaging system 100. The infrared 3D Imaging Camera 130 has an optical detector body 132 positioned separate and spaced apart from the optically trackable body 523 at a predetermined three-dimensional sensor reference location. The infrared 3D Imaging Camera 130 desirably includes a pair of separate and spaced apart optical receivers 133, 135, connected to the optical detector body 132, each having a field of view and being adapted to receive optical energy emitted or reflected by each of the plurality of optical retro-reflective spheres 525 when positioned in the field of view (typically centered about the optical receiver pointing angle). The optical receivers 133, 135 detect the three-dimensional sphere position of each of the plurality of retro-reflective spheres 525 when positioned simultaneously within the field of view of both of the optical receivers 133, 135 to produce a plurality of position signals representing the position of such three-dimensional retro-reflective spheres 525. Each of the optical receivers 133, 135 can include a photo-sensitive array (not shown), such as a two-dimensional charge coupled device (CCD) array sensor or other similar device, defining a photosensor, to detect optical energy radiated from the retro-reflective spheres 525 when positioned in the field of view of the optical receiver 133, 135. The photosensor provides electrical signals representative of positional information of the retro-reflective spheres 525. Each of the optical receivers 133, 135 also generally includes a lens for focusing the optical energy from the retro-reflective spheres 525 on the photosensor.
Also, the infrared 3D Imaging Camera 130 can be any of various suitable cameras, as are well known to those skilled in the art, for example, a camera or opto-electrical motion measurement system known as the Polaris®, by Northern Digital Inc., Ontario, Canada.
Where the plurality of indicators are in the form of the optical retro-reflective spheres 525, the Infrared 3D Imaging Camera 130 can include a pair of infrared illuminators, desirably in the form of a pair of directional infrared illuminator arrays 137, 139. The first illuminator 137 is positioned in a surrounding relationship adjacent optical receiver 133 and the second illuminator 139 is positioned adjacent the other optical receiver 135 to selectively illuminate each of the plurality of optical retro-reflective spheres 525 when positioned in the field of view of the respective adjacent optical receiver 133, 135. This provides the requisite optical energy necessary to view the optical retro-reflective spheres 525 within the field of view of the respective adjacent optical receiver 133, 135.
The ultrasound probe 3D position and orientation detector 341 of the ultrasound sub-system 340 is in communication with the infrared 3D Imaging Camera 130 and is responsive to the plurality of position signals produced by the infrared 3D Imaging Camera 130 to determine the three-dimensional position of each of the plurality of retro-reflective spheres 525 when positioned simultaneously within the field of view of both of the optical receivers 133, 135 of the infrared 3D Imaging Camera 130. The ultrasound sub-system 340, including the ultrasound probe 3D position and orientation detector 341 in conjunction with the controller processor 312, analyzes the two-dimensional position of each sphere 525 in the field of view of both receivers 133, 135, with respect to the position on the photosensor, to determine the three-dimensional location of each sphere 525 simultaneously in the field of view of the receivers 133, 135. The main memory 314 is accessible by the controller processor 312 to store a table of definitions containing the segment lengths “S” between the retro-reflective spheres 525 of each of the geometric figures F.
The Infrared 3D Imaging Camera 130 can be used to view the reflective spheres 525 positioned in its field of view, and thus, view at least one of the plurality of geometric figures F formed by the spheres 525.
Responsive to the segment lengths S stored in the table of definitions, the ultrasound sub-system 340, including the ultrasound probe 3D position and orientation detector 341, can identify which of the plurality of geometric figures F is positioned in the field of view of the Infrared 3D Imaging Camera 130.
The ultrasound sub-system 340, including the ultrasound probe 3D position and orientation detector 341, can analyze the position and orientation of the identified geometric figure F in the field of view of the Infrared 3D Imaging Camera 130, which can then be used to determine the position and orientation of the ultrasound probe 142. Specifically, responsive to position signals produced by the Infrared 3D Imaging Camera 130 regarding the retro-reflective spheres 525 in its field of view, and to the segment lengths S previously stored in a table of definitions in the main memory 314, or other associated memory in the components 300, the ultrasound sub-system 340, including the ultrasound probe 3D position and orientation detector 341, can determine the three-dimensional position and the orientation (viewing angle) of the ultrasound probe 142.
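Identifying a geometric figure F from the segment lengths measured between detected spheres, against the stored table of definitions, could be sketched as a tolerance-based match. This is a simplified illustration with hypothetical names and a hypothetical tolerance, not the disclosed implementation:

```python
def identify_figure(measured, table, tol=0.5):
    """Match segment lengths measured between detected spheres against a
    stored table of figure definitions. Returns the name of the figure
    whose sorted segment lengths all agree within `tol`, else None."""
    m = sorted(measured)
    for name, lengths in table.items():
        s = sorted(lengths)
        # Figures are desirably dissimilar, so at most one entry matches.
        if len(s) == len(m) and all(abs(a - b) <= tol for a, b in zip(m, s)):
            return name
    return None
```

Because the figures are designed to be dissimilar, a small measurement tolerance suffices to distinguish them; an unmatched set of lengths signals a spurious or partially occluded detection.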
By continuously analyzing the position and orientation of the geometric figures F, the position and orientation of the ultrasound probe 142 can be continuously re-determined while the geometric figures F remain in the field of view of the Infrared 3D Imaging Camera 130. The position and orientation of the ultrasound probe 142 can be continuously tracked through various rotations of the ultrasound probe 142 by obfuscating a first of the plurality of retro-reflective spheres 525 as it leaves the field of view of the Infrared 3D Imaging Camera 130 to prevent the first of the plurality of retro-reflective spheres 525 from becoming optically coincident with a second of the plurality of retro-reflective spheres 525. This can allow the ultrasound sub-system 340, including the ultrasound probe 3D position and orientation detector 341, upon determining that one of the plurality of geometric figures F is exiting view, to replace that geometric figure F in the field of view of the Infrared 3D Imaging Camera 130 with another one of the plurality of geometric figures F positioned in the field of view. This second of the plurality of figures F can then be continuously tracked until replaced with a third of the plurality of figures F, to provide continuous tracking, for example.
As mentioned, a plurality of optical indicators, such as optically retro-reflective spheres 525, are connected or mounted to the optically trackable body 523 to form a plurality of desirably dissimilar geometric figures F.
The optically trackable body 523 correspondingly includes a plurality of separate and spaced apart indicator mounts 729 to which the retro-reflective spheres 525 are mounted.
The three-dimensional location of the retro-reflective spheres 525 and the orientation of each of the geometric figures can provide three-dimensional positional information and orientation information of the optically trackable body 523, and thus, the ultrasound probe 142. Desirably, the geometric figures F are readily distinguishable by the Infrared 3D imaging Camera 130. The plurality of retro-reflective spheres 525 can be positioned such that by the time one of the geometric figures F is no longer visible to the Infrared 3D Imaging Camera 130, another of the plurality of geometric figures F becomes visible to the Infrared 3D Imaging Camera 130. The position and orientation of each identified geometric figure F directly translates to that of the optically trackable body 523, and thus, the ultrasound probe 142.
Although the plurality of indicators can take the form of other locatable indicators, the optical retro-reflective spheres 525 are desirable as they advantageously can negate the requirement for supplying the ultrasound probe with tracker system 140 with electric power or illumination such as that required by indicators in the form of light emitting diodes or fiber optics, for example. Advantageously, this can also reduce the weight and complication of the ultrasound probe with tracker system 140 and can help prevent the ultrasound probe with tracker system 140 from interfering with an operator or the patient P during use of the imaging system 100. Further, the optical retro-reflective spheres 525 are desirable due to their wide field of view, which allows detection at a wide range of viewing angles, as can exceed 180 degrees, for example. This can allow for a smaller trackable body 523 with fewer required spheres 525. Also, associated with the ultrasound probe is a movable object mount 565 that can assist in positioning the ultrasound probe 142. A movable object mounting connector 573 is communicatively associated with the movable object mount 565 for engaging the ultrasound probe 142 with the trackable body 523, for example.
Referring now to
As illustrated in
The optically trackable body 523 also includes separating means for optically separating each of the plurality of optical retro-reflective spheres 525 from each other to prevent each of the plurality of retro-reflective spheres 525 from becoming optically coincident with another one of the plurality of retro-reflective spheres 525 when viewed along a viewing path extending directly through any adjacent pair of the plurality of retro-reflective spheres 525. This separating means can serve to enhance optical detection of the plurality of retro-reflective spheres 525 to thereby further enhance determination of the positional location and orientation of the optically trackable body 523, and thus, the ultrasound probe 142.
The separating means or separating members can include various forms known and understood by those skilled in the art, but are desirably in the form of a plurality of variously shaped and positioned obfuscators, including various forms of flanges, projections, separators, attachments, or other types of obstacles that can be positioned between a pair of retro-reflective spheres 525. For example, the optically trackable body 523 can include a plurality of longitudinal medial body portion obfuscating flanges 651 sized and positioned substantially parallel to and spaced apart from the longitudinal axis of the medial body portion of the optically trackable body 523. The plurality of medial body portion obfuscating flanges 651 are of sufficient longitudinal length and radial width to optically separate each retro-reflective sphere 525 of the plurality of retro-reflective spheres 525 mounted to the medial body portion of the optically trackable body 523 from each adjacent retro-reflective sphere 525 of the plurality of retro-reflective spheres 525 also mounted to the medial body portion of the optically trackable body 523. This can significantly reduce the possibility of a retro-reflective sphere 525 of the plurality of retro-reflective spheres 525 mounted to the medial body portion of the optically trackable body 523 becoming optically coincident with an adjacent retro-reflective sphere 525 of the plurality of retro-reflective spheres 525 also mounted to the medial body portion of the optically trackable body 523, when viewed along a preselected (collinear) viewing path, for example.
Also, the medial body portion obfuscating flanges 651 can be of various geometric designs as long as they are radially short enough so that when observed or viewed such that a reference retro-reflective sphere 525 on the medial body portion of the optically trackable body 523 is “pointing” directly at an observer (e.g. the Infrared 3D Imaging Camera 130;
Also, the optically trackable body 523 can also include a desirably annular medial body portion obfuscating flange 755 positioned substantially axially parallel with the longitudinal axis of the medial body portion of the optically trackable body 523, such as illustrated in
Because the adjacent spheres 525 are effectively prevented from becoming optically coincident with any other of the spheres 525, and thus, prevented from visually interacting with each other with respect to an outside observer, the spheres 525 forming the various different geometric figures are viewable by the Infrared 3D Imaging Camera 130 such that the Infrared 3D Imaging Camera 130 should generally not find any of the spheres 525 unusable due to coincidence with any of the other spheres 525 in the determination of which of the various different geometric figures F is in the field of view of the Infrared 3D Imaging Camera 130. However, more than one different geometric figure F can be in the field of view, although normally only one would be selected, for example.
Also, the optically trackable body 523 can also include an interior mount recess inwardly extending from the proximal body end portion 641 into the medial body portion, which can be used to connect the optically trackable body 523 to the ultrasound probe 142, as described (
The interface display 150 and its display components during ultrasound imaging of the tissue of interest, such as an organ of interest, is shown in
Once the tissue specimen SP (for example: a tumor) is extracted from the patient P, and imaged in the x-ray chamber 120, the comparison of the 3D volume image of the extracted specimen SP and the 3D ultrasound based contoured structure 1113 are displayed for comparison as shown in
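The quantitative comparison of the two 3D volumes can be sketched as differencing a few simple measurements of each volume. The sketch below assumes the contoured structure and the extracted specimen are available as binary voxel masks with a known per-axis voxel spacing; the function names and data layout are illustrative assumptions, not from the patent:

```python
import numpy as np

def volume_metrics(mask, spacing=(1.0, 1.0, 1.0)):
    """Volume and bounding-box height/length/width of a binary voxel
    mask, scaled by the per-axis voxel spacing (e.g. in mm)."""
    idx = np.argwhere(mask)
    extent = idx.max(axis=0) - idx.min(axis=0) + 1  # voxels per axis
    return {
        "volume": mask.sum() * np.prod(spacing),
        "height": extent[0] * spacing[0],
        "length": extent[1] * spacing[1],
        "width": extent[2] * spacing[2],
    }

def compare(contour_mask, specimen_mask, spacing=(1.0, 1.0, 1.0)):
    """Difference between the contoured structure and the extracted
    specimen for each measurement (positive: contour is larger)."""
    c = volume_metrics(contour_mask, spacing)
    s = volume_metrics(specimen_mask, spacing)
    return {k: c[k] - s[k] for k in c}
```

A positive volume difference would indicate that the contoured structure is larger than the extracted specimen, which is the condition that may prompt the physician to consider further surgical intervention.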
Referencing
As an example, an embodiment of a process for imaging a specimen SP using the imaging system 100 is illustrated by an exemplary process workflow 1600, such as for performing a lumpectomy, and its analysis is schematically represented in the flowchart of the process in
Referring to
As evident from the foregoing, the imaging system 100 can be used for a plurality of various treatments, procedures, or other therapeutic purposes, such as, for example, prostate imaging and biopsy, as well as for tissue extraction from a body. Also, the imaging system 100, in addition to use with humans, can also be used in various research and medical applications for animals, reptiles and other organisms, for example.
It is to be understood that the present invention is not limited to the embodiments described above, but encompasses any and all embodiments within the scope of the following claims.
Claims
1. A tissue imaging system, comprising:
- an x-ray subsystem comprising an x-ray chamber, an x-ray beam generator and controller, an x-ray angle controller and an x-ray analysis processor and memory, the x-ray chamber including an x-ray tube, a digital flat panel detector and a specimen tray, wherein the x-ray analysis processor and memory includes a two dimensional (2D) and a three dimensional (3D) x-ray image generator and a processor for measurement and analysis of the x-ray images;
- an ultrasound subsystem including an ultrasound probe with tracker system, an infrared 3D imaging camera, an ultrasound probe 3D position and orientation detector, an ultrasound beamformer, and an ultrasound analysis processor and memory, the ultrasound analysis processor and memory including an ultrasound 2D image generator, an ultrasound position and angle calculator, and an ultrasound 3D volume generator including a processor for measurement and analysis; and
- a processor controller system including an image fusion generator including a fusion processor and memory to fuse a 3D ultrasound volume image and a 3D x-ray volume image of a specimen for measurement and analysis.
2. The tissue imaging system according to claim 1, wherein:
- the x-ray chamber rotates 360° around its axis, and
- the x-ray tube and the digital flat panel detector rotate 360° around the stationary specimen tray inside the x-ray chamber.
3. The tissue imaging system according to claim 1, wherein:
- the ultrasound subsystem creates a 3D volume of the ultrasound image and contoured structures of an area of interest based on the ultrasound image, ultrasound probe orientation and the ultrasound probe position.
4. The tissue imaging system according to claim 3, wherein:
- the Image Fusion Generator fuses the 3D volume of the contoured structure of the area of interest and the 3D volume of the specimen.
5. The tissue imaging system according to claim 4, wherein:
- the x-ray subsystem generates the 3D volume of the specimen placed in the specimen tray of the x-ray chamber.
6. The tissue imaging system according to claim 5, wherein:
- the Image Fusion Generator fuses the 3D volume of the contoured structure of the area of interest and the 3D volume of the specimen and generates a 3D volume illustrating a combined 3D volume of the contoured structure of the area of interest and the 3D volume of the specimen to provide measurements relating to a volume, a height, a length and a width of the contoured structure and the extracted specimen.
7. The tissue imaging system according to claim 6, wherein:
- a difference in a 3D volume, height, length and width between the generated combined 3D volume of the specimen and the 3D volume of the contoured structure, from the generated combined 3D volume formed by the fused 3D ultrasound volume image and the 3D x-ray volume image, is determined for quantitative analysis by the same tissue imaging system.
8. A tissue imaging system, comprising:
- an x-ray subsystem to generate a two dimensional (2D) x-ray image and a three dimensional (3D) x-ray volume image of a specimen for measurement and analysis of the x-ray images;
- an ultrasound subsystem including an ultrasound probe and an imaging camera to determine a position and orientation of a contoured structure in an area of interest that includes the specimen and to generate an ultrasound 2D image of the contoured structure in an area of interest and an ultrasound 3D volume image of the contoured structure for measurement and analysis; and
- an image fusion generator including a fusion processor to fuse the ultrasound 3D volume image of the contoured structure and the 3D x-ray volume image of the specimen for measurement and analysis.
9. The tissue imaging system according to claim 8, wherein the x-ray subsystem comprises:
- an x-ray chamber, an x-ray tube and a digital flat panel detector, wherein the x-ray chamber rotates 360° around its axis, and the x-ray tube and the digital flat panel detector rotate 360° around a stationary specimen tray inside the x-ray chamber.
10. The tissue imaging system according to claim 9, wherein:
- the x-ray subsystem generates the 3D x-ray volume image of the specimen placed in the specimen tray of the x-ray chamber.
11. The tissue imaging system according to claim 8, wherein:
- the ultrasound subsystem creates the ultrasound 3D volume image of the contoured structure based on the generated ultrasound 2D image.
12. The tissue imaging system according to claim 8, wherein:
- the image fusion generator fuses the ultrasound 3D volume image of the contoured structure with the 3D x-ray volume image of the specimen to create a combined 3D volume image.
13. The tissue imaging system according to claim 12, wherein:
- the combined 3D volume image generated by the image fusion generator provides measurements relating to a volume, a height, a length and a width of the contoured structure and the specimen.
14. The tissue imaging system according to claim 13, wherein:
- a difference in 3D volume, the height, the length, and the width between the ultrasound 3D volume image of the contoured structure and the 3D x-ray volume image of the specimen are determined from the combined 3D volume image for quantitative analysis by the tissue imaging system.
15. A method for tissue imaging in a tissue imaging system, comprising the steps of:
- generating a two dimensional (2D) x-ray image and a three dimensional (3D) x-ray volume image of a specimen for measurement and analysis of the x-ray images;
- generating an ultrasound 2D image of a contoured structure in an area of interest that includes the specimen and an ultrasound 3D volume image of the contoured structure for measurement and analysis; and
- combining the ultrasound 3D volume image of the contoured structure and the 3D x-ray volume image of the specimen to provide a combined 3D volume image for measurement and analysis.
16. The method for tissue imaging in a tissue imaging system according to claim 15, further comprising the step of:
- determining a volume, a height, a length and a width of the contoured structure and the specimen from the combined 3D volume image.
17. The method for tissue imaging in a tissue imaging system according to claim 16, further comprising the step of:
- determining a difference in 3D volume, the height, the length, and the width between the ultrasound 3D volume image of the contoured structure and the 3D x-ray volume image of the specimen for quantitative analysis by the tissue imaging system.
18. A method for ultrasound imaging in a biopsy or a seed localization procedure, comprising the steps of:
- generating a two dimensional (2D) live ultrasound image of a tissue of interest;
- generating a three dimensional (3D) ultrasound volume image from the live 2D ultrasound image of the tissue of interest; and
- displaying with the generated two dimensional (2D) live ultrasound image and with the generated three dimensional (3D) ultrasound volume image an image of at least one needle to provide a representation of an angle and a position of the at least one needle and at least one needle path in relation to the tissue of interest for the biopsy or the seed localization.
19. The method for ultrasound imaging in a biopsy or a seed localization procedure according to claim 18, further comprising the steps of:
- positioning a seed marker in a location in the tissue of interest using ultrasound guidance based on the generated 2D live ultrasound image; and
- marking the positioned seed marker in the generated 2D ultrasound image in relation to the tissue of interest.
20. The method for ultrasound imaging in a biopsy or a seed localization procedure according to claim 19, further comprising the step of:
- displaying the positioned seed marker in the generated 3D ultrasound volume image of the tissue of interest.
21. The method for ultrasound imaging in a biopsy or a seed localization procedure according to claim 18, further comprising the step of:
- displaying in the generated 2D live ultrasound image and in the generated 3D ultrasound volume image a plurality of needle paths in relation to the tissue of interest for each of a plurality of biopsies.
Type: Application
Filed: Apr 23, 2018
Publication Date: Oct 25, 2018
Applicant: BEST MEDICAL INTERNATIONAL, INC. (Springfield, VA)
Inventor: Vineet Gupta (Wexford, PA)
Application Number: 15/960,249