System, method and device for positioning a target located within soft tissue in a path of an instrument
A system, method and device for positioning a target located within soft tissue in a path of an instrument inserted into the soft tissue includes: a manipulation system including a plurality of force applicators positioned around the soft tissue containing the target; an image acquisition system including an imaging probe for obtaining data for generating an image of the soft tissue containing the target; a detection means for detecting deflection of the target using the data from the imaging probe; and a control means for actuating the plurality of force applicators to apply forces on the soft tissue in response to a detected deflection of the target to move the target back in line with the path of the instrument. In an exemplary embodiment, the soft tissue is a breast, the imaging probe is an ultrasound imaging probe, and the instrument is a biopsy needle.
This application claims the benefit of U.S. Provisional Patent Application No. 60/938,605, filed May 17, 2007, and U.S. Provisional Patent Application No. 60/971,745, filed Sep. 12, 2007, the entire disclosures of which are hereby incorporated by reference.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
Not applicable.
THE NAMES OF THE PARTIES TO A JOINT RESEARCH AGREEMENT
Not applicable.
INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISC
Not applicable.
TECHNICAL FIELD
The presently-disclosed subject matter relates to systems, methods, and devices for positioning a target located within soft tissue in a path of an instrument, including systems, methods, and devices for performing breast tissue biopsies.
BACKGROUND
Breast cancer is the most common cancer among American women and the second leading cause of cancer death in women. American Cancer Society, 2007 [1]. However, early detection of breast cancer has been proven to reduce mortality by about 20% to 35%. Elmore et al., 2005 [2]. Histopathological examination is considered to be the “gold standard” for definitive diagnosis of cancer but requires tissue samples that are collected through biopsy. Of the two major approaches for breast biopsy, needle biopsy and open excisional biopsy, needle biopsy is the most commonly practiced because it is less traumatic, produces little or no scar, allows quicker recovery, and is less expensive.
Despite many benefits of needle biopsy, there are significant technical challenges concerning accurate steering and precise placement of a biopsy needle at the target in breast tissue. To successfully remove a suspicious small targeted lump (e.g., less than 5 mm in diameter) various issues must be addressed, such as architectural distortion and target deflection during needle insertion and poor maneuverability of the biopsy needle. These issues are even more important when the collection of a large and intact core becomes necessary for histopathological diagnosis. Currently, large core samples are collected using large needles such as a 14-gauge (2.1 mm in diameter) true cutting needle, a 10-gauge (3.4 mm in diameter) vacuum-assisted needle, and other radiofrequency (RF) cutting instruments (EN-BLOC® and RUBICOR®) that increase insertion force significantly.
Although mammography, sonography, and magnetic resonance imaging (MRI) techniques have significantly improved early detection of breast cancer, accurate placement of a biopsy needle at the target location and reliable collection of target tissue remain challenging tasks.
Needle biopsies are guided by stereotactic mammography, magnetic resonance imaging (MRI) or ultrasound (US) imaging. Sonography is the most widely used imaging technique because of its real-time capability and cost-effectiveness. Three dimensional (3D) US systems have been available for several years, but they are not as widely used as two dimensional (2D) US systems because of their high cost. 3D reconstruction algorithms have also been developed for real-time 3D rendering of a volume with image data acquired using a 2D US probe. Solberg et al., 2007 [3]. Real-time 3D reconstruction uses pixel data from the current US image slice to update the 3D volume; hence the entire 3D volume cannot be reconstructed in real-time. To overcome this, researchers have developed techniques to extrapolate 3D volume data using 2D image slices. This technique is not applicable to breast biopsies due to the deformation of the breast during needle insertion.
Despite the numerous benefits of needle biopsy, however, there are significant challenges concerning accurate steering and precise placement of a biopsy needle at the target (the word target as used herein refers to a tumor, a lesion or just a suspected region of tissue) in breast tissue. First, as the needle is inserted, large tissue deformation causes the target to move away from the line of insertion of the needle. DiMaio et al., 2003 [4]. This may necessitate multiple insertions at the same biopsy site to successfully sample the target. Second, the current state-of-the-art US guided biopsy technique is highly dependent on the skill of the surgeon. The surgeon performs this procedure by holding the US probe with two or three fingers of one hand while using the remaining fingers of that hand to stabilize the breast, and inserts the needle with the other hand. Since 2D sonography only provides an image of a planar cross-section, if the target moves out of the plane of the US probe, the surgeon has to continuously reorient the US probe to keep the needle and the target in the imaging plane while inserting the needle. It is critical to orient the imaging plane parallel to the needle; otherwise a false impression of the needle tip causes sampling errors. Since stabilization of the breast is problematic and steering of the needle inside the breast is extremely difficult, many insertion attempts may be required to successfully sample the target. This may cause architectural damage to the tissue, excessive bleeding obscuring the guiding images, surgeon fatigue and patient discomfort.
Currently available commercial biopsy instruments (Mammotome®, Vacora®, etc.) do not compensate for target movement during needle insertion. Robotic systems to improve the accuracy of needle insertions (see: Stoianovici et al., 1998 [5] and Cleary et al., 2001 [6]) do not provide real-time trajectory correction to overcome error due to target movement. In Okazawa et al., 2005 [7], DiMaio et al. [8] and Glozman et al., 2007 [9], steerable needle techniques are presented that allow steering the tip of the needle towards the target during insertion. Steerable devices can only be used with small caliber needles and hence are unsuitable for core needle biopsies. A visually controlled needle-guiding system was developed in Loser et al., 2000 [10], for automatic or remote controlled percutaneous interventions. Though this system potentially reduces the number of insertions required to sample the target, maneuvering a needle inside the breast causes tissue damage. In Azar et al., 2002 [11] and Alterovitz et al., 2003 [12], a finite element model of the breast is used to predict the movement of the target. The needle path is planned based on this prediction to accurately sample the target. To get an accurate prediction of the movement of the target, finite element analysis requires the geometric model and mechanical properties of the breast. In [11], the average time for computation is 29 minutes.
Researchers have developed robotic systems to alleviate the difficulty associated with acquiring US images during medical procedures. A force controlled robotic manipulator for performing cardiovascular 3D US image acquisition has been presented in Pierrot et al., 1999 [14]. Teleoperated master/slave robotic systems have been developed that enable remote acquisition of US images. See: Masuda et al., 2001 [15] and Vischis et al., 2003 [16]. A needle driver robot is presented in Hong et al., 2004 [17] where two degrees of freedom (DOF) in the US image plane are controlled through visual servoing. In this approach the needle is constrained to lie in the US image plane for visual feedback. This idea is extended in Vitrani et al., 2005 [18], where the controlled instrument is not constrained to lie in a plane but has to intersect with the US image plane. An image guided robot for positioning the US probe and tracking a target in real-time has been developed for diagnostic US (Abolmaesumi et al., 2002 [19]). The robot controller, US image processor and the operator have shared control over the robot for guiding the US probe.
Even though these systems greatly reduce the difficulty of acquiring US images, the target cannot be tracked in real-time if it moves out of the imaging plane of the probe. In Krupa et al., 2007 [20], a speckle decorrelation technique is presented for estimating out-of-plane motion of a target. Simulation results presented assume rigid motion of internal tissue to preserve correlation between successive image planes. Due to needle insertion and target manipulation, large tissue deformation occurs inside the breast which prohibits application of this technique.
Force sensors are typically used to ensure contact between the surface and the US probe [14][16][19].
A hybrid controller for coordinating multiple systems for robotic biopsy is needed. There has been some work on developing such hybrid controllers in other, non-analogous fields, such as industrial robotics, medicine and manufacturing (see: Antsaklis, 2000 [23]); however, no work has been done with respect to such a system for biopsy purposes.
BRIEF SUMMARY OF THE INVENTION
The system, method and device described herein guide a target towards a line of insertion of an instrument, such as a needle, instead of steering the instrument towards the target during insertion. The system, method and device use a target manipulation system that positions the target inline with the instrument during insertion. The real-time target manipulation system presented is a set of position controlled force applicators. These applicators are placed around the soft tissue, such as a breast, during the instrument insertion procedure. They control the position of the target, by applying forces on the surface of the soft tissue, such that the target is always placed inline with the instrument. A PI or PID controller minimizes the tracking error in the position of the target. In this approach, instrument insertion force is treated as a disturbance to the system.
The system, method and device utilize a robust image acquisition system that automatically searches and recovers the target should it go out of a field of view of the imaging probe. A sensorless contact detection technique reliably detects contact transitions based on image data.
To coordinate the target manipulation system and the image acquisition system, a hybrid supervisory controller provides comprehensive assistance during procedures.
The robotic device has the following advantages: (a) the success rate of the procedure (as defined by the number of insertions required at a particular site to successfully sample the target) will be increased since the target is accurately positioned inline with the instrument; (b) since instrument alignment, imaging probe positioning, target localization and soft tissue stabilization are automated, fatigue of the surgeon will likely be minimized; (c) since the instrument is not steered inside the soft tissue and the number of insertions is reduced, tissue damage is also minimized (patient discomfort is reduced) and the structural integrity of the tissue specimen is preserved; and (d) by improving accuracy of the procedure, it will potentially enhance the diagnostic outcome by reducing false negatives.
Advantageously, in addition to the needle biopsy procedure described herein, the device, system and method of the invention has utility with respect to at least the following additional procedures: ablation; cryotherapy; radioactive seed placement; targeted drug injection; and minimally invasive surgery.
This Summary describes several aspects of the presently-disclosed subject matter, and in many cases lists variations and permutations of these aspects. This Summary is merely exemplary of numerous and varied embodiments. Mention of one or more representative features of a given embodiment is likewise exemplary. Such an embodiment can typically exist with or without the feature(s) mentioned; likewise, those features can be applied to other embodiments of the presently-disclosed subject matter, whether listed in this Summary or not. To avoid excessive repetition, this Summary does not list or suggest all possible combinations of such features.
Generally, a system, method and device for positioning a target located within soft tissue in a path of an instrument inserted into the soft tissue includes: a manipulation system including a plurality of force applicators positioned around the soft tissue containing the target; an image acquisition system including an imaging probe for obtaining data for generating an image of the soft tissue containing the target; a detection means for detecting deflection of the target using the data from the imaging probe; and a control means for actuating the plurality of force applicators to apply forces on the soft tissue in response to a detected deflection of the target to move the target back in line with the path of the instrument.
In one aspect of the invention, the imaging probe is mounted on a drive mechanism for moving the imaging probe in at least two dimensions with respect to the soft tissue, and the control means is further for actuating the drive mechanism to move the imaging probe to maintain contact with the soft tissue when a loss of contact between the imaging probe and the soft tissue is detected. Also, the control means actuates the drive mechanism to move the imaging probe to track the target if the target moves out of a field of view of the imaging probe.
In another aspect, the system further includes an instrument guidance device for fixing the path of the instrument. The instrument guidance device may further have an instrument path sensor for measuring a characteristic of the path of the instrument. The control means may utilize the characteristic of the path of the instrument in actuating the plurality of force applicators to place the target in line with the path for insertion of the instrument.
According to yet another aspect, the control means includes a target manipulation system controller for controlling the manipulation system, an image acquisition system controller for controlling the image acquisition system, and a high level supervisory controller for coordinating the target manipulation system controller and the image acquisition system controller based on events. The control means may further include an interface between the high level supervisory controller, and the target manipulation system controller and the image acquisition system controller. The interface is for converting between continuous-time signals and sequences of symbols representing tasks. The high level supervisory controller may further have a process monitoring module for monitoring event information received from the target manipulation system controller and the image acquisition system controller via the interface, and a decision making module for determining tasks to implement in response to the events. The decision making module may further send control commands to the target manipulation system controller and the image acquisition system controller via the interface.
In accord with an important implementation, each of the plurality of force applicators includes an end effector positioned for engaging a surface of the soft tissue, an end effector actuator for effecting movement to the end effector, and an end effector position sensor for sensing a position of the end effector for use by the control means. The drive mechanism for moving the imaging probe may have an imaging probe position sensor for sensing a position of the imaging probe for use by the control means. Further, the instrument guidance device may have an instrument path sensor for sensing a characteristic of the path of the instrument for use by the control means.
In an exemplary embodiment, the soft tissue is a breast, the target is a suspected tumor, the imaging probe is an ultrasound imaging probe, and the instrument is a biopsy needle.
Other features and advantages of the invention will be set forth in, or apparent from, the detailed description of exemplary embodiments of the invention found below.
The details of one or more embodiments of the presently-disclosed subject matter are set forth in this document. Modifications to embodiments described in this document, and other embodiments, will be evident to those of ordinary skill in the art after a study of the information provided in this document. The information provided in this document, and particularly the specific details of the described exemplary embodiments, is provided primarily for clearness of understanding and no unnecessary limitations are to be understood therefrom. In case of conflict, the specification of this document, including definitions, will control.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the presently-disclosed subject matter belongs. Although any methods, devices, and materials similar or equivalent to those described herein can be used in the practice or testing of the presently-disclosed subject matter, representative methods, devices, and materials are now described.
Following long-standing patent law convention, the terms “a”, “an”, and “the” refer to “one or more” when used in this application, including the claims.
Unless otherwise indicated, all numbers expressing quantities of ingredients, properties such as reaction conditions, and so forth used in the specification and claims are to be understood as being modified in all instances by the term “about”. Accordingly, unless indicated to the contrary, the numerical parameters set forth in this specification and claims are approximations that can vary depending upon the desired properties sought to be obtained by the presently-disclosed subject matter.
As used herein, the term “about,” when referring to a value or to an amount of mass, weight, time, volume, concentration or percentage is meant to encompass variations of in some embodiments ±20%, in some embodiments ±10%, in some embodiments ±5%, in some embodiments ±1%, in some embodiments ±0.5%, and in some embodiments ±0.1% from the specified amount, as such variations are appropriate to perform the disclosed method.
The presently-disclosed subject matter includes a system, method and device for positioning a target located within soft tissue in a path of an instrument.
In some embodiments, the target can be a suspected tumor and the instrument can be a biopsy needle, e.g., in a breast biopsy. For example, during an image-guided breast biopsy the subject can lie prone with the breast placed in an opening on a biopsy table. The clinician then inserts a biopsy needle either directly into the biopsy site or through an incision to remove a target sample (e.g., a suspected tumor or portion thereof). With reference to
For convenience, the description below will discuss exemplary systems, methods and devices for positioning a target located within soft tissue primarily in terms of an exemplary breast biopsy application, but it is understood that no unnecessary limitations are to be read into the scope of the claimed invention because of this exemplary application.
As described below, in use a soft tissue organ, such as a breast, is received within the manipulation system 12 and imaged by the image acquisition system 14. An instrument, such as a biopsy needle, on the instrument guidance device 16 is inserted into the breast along an insertion path fixed by the instrument guidance device 16. Deflection of the target is detected by a detection means (shown and described below) using data from the image acquisition system 14, and the control means causes the manipulation system 12 to manipulate the soft tissue organ to move the target back in line with the insertion path for interception by the instrument. The image acquisition system may use stereotactic sensors, a magnetic resonance imager (MRI), an ultrasound (US) sensor, or other known imaging devices, although, as discussed below, aspects of the exemplary system are particularly advantageous for an imaging probe, such as a US sensor, that generates a two-dimensional image corresponding to an imaging plane within the soft tissue organ.
The rotary base 20 is rotatable about the central axis 21 and lockable in place.
The plurality of telescoping columns 22a, 22b, 22c are extendable in a vertical direction and lockable in place.
The sliders 28a, 28b, 28c are preferably linked for coordinated radial movement along the slots 26a, 26b, 26c with respect to the central axis 21. The sliders 28a, 28b, 28c are also lockable in place.
Each of the force applicators 32a, 32b, 32c has a respective end effector (i.e. contoured link) 34a, 34b, 34c positioned for engaging a surface of the breast and an end effector actuator (i.e. drive mechanism) 36a, 36b, 36c for effecting linear movement to the respective end effector 34a, 34b, 34c. Further, each of the force applicators 32a, 32b, 32c includes an integral end effector (contoured link) position sensor for measuring or sensing the position of the respective end effector 34a, 34b, 34c.
A handle assembly 38 connects to the sliders 28a, 28b, 28c.
The exemplary manipulation system 12 shown in
In the exemplary manipulation system 12, the force applicators 32a, 32b, 32c weigh 37 g, have dimensions of 94 mm×18 mm×18 mm, have a stroke of 50 mm and have a force capability of 45 N with maximum speed of 2.5 mm per second. The end effector position sensors are on-board servo controllers for precise positioning of the end effectors 34a, 34b, 34c. The end effectors 34a, 34b, 34c are contoured for maximum contact with the surface of the breast. Each end effector 34a, 34b, 34c is made of semi-hard plastic to avoid rupturing the skin. The force applicators 32a, 32b, 32c are mounted on the support links 30a, 30b, 30c which are driven using the sliders 28a, 28b, 28c for coordinated movement of the force applicators 32a, 32b, 32c along the radial direction (P3, P4, P5). The slots 26a, 26b, 26c of the actuator base 24 are 120° apart for accommodating radial movement of the support links 30a, 30b, 30c. The actuator base 24 can be moved in the vertical direction (P2) using the telescoping columns 22a, 22b, 22c. The exemplary manipulation system 12 is placed on the rotary base 20 that can be rotated (P1) about the central axis 21 and locked in place.
The exemplary manipulation system 12 can be setup in two easy steps: (1) by locking the sliders 28a, 28b, 28c in place using locating pins (not shown) and moving the handle assembly 38 up/down, the actuator base 24 can be positioned such that the force applicators 32a, 32b, 32c are in the plane of the target; and (2) by locking the telescoping columns 22a, 22b, 22c and moving the handle assembly 38 up/down the sliders 28a, 28b, 28c can be moved radially in a coordinated manner until the contoured links 34a, 34b, 34c make contact with the surface of the breast. The manipulation system 12 can also be optionally rotated for convenient access based on the desired location of the instrument insertion site.
The position of the force applicators 32a, 32b, 32c is adjusted to accommodate different breast sizes. Setup time for the manipulation system 12 is minimal since only one control knob needs to be adjusted for positioning the force applicators 32a, 32b, 32c. The open structure of the manipulation system 12 facilitates easy positioning and access for the instrument guidance device 16 and image acquisition system 14.
In the exemplary image acquisition system 14, the horizontal drive mechanism 40 is an elongated slide 46 movably mounted on the actuator base 24.
The vertical drive mechanism 42 of the exemplary image acquisition system 14 is a telescoping finger 54 extending upward from the pinion mount 50. The telescoping finger 54 has a telescoping finger linear actuator 56 for effecting vertical movement of an upper end 58 of the telescoping finger 54. The linear actuator 56 also has an embedded sensor for measuring the vertical position of the upper end 58 of the telescoping finger 54.
The image acquisition system two-axis coupling 43 is supported by the upper end 58 of the telescoping finger 54. The two-axis coupling 43 has a first pivot joint 60 for allowing rotation about a first horizontal radial axis (the u-axis, as shown) and a second pivot joint 62 for allowing rotation about a second horizontal axis orthogonal to the first horizontal radial axis (the v-axis, as shown). Embedded sensors measure the position of the first pivot joint 60 and the second pivot joint 62. The two-axis coupling 43 supports the imaging probe 44.
The exemplary image acquisition system 14 has two DOA (A4, A5) and three DOF (P6, P7, P8). The two-axis coupling 43 has two DOF (P7 and P8). The imaging plane of the imaging probe 44 is the u-v plane. The two-axis coupling 43 can rotate about the u-axis (roll, P7) and the v-axis (pitch, P8) for orienting the imaging probe 44 such that the imaging plane is parallel to the surgical instrument insertion path. Vertical motion (A5) is achieved using the telescoping finger linear actuator 56 mounted inside the telescoping finger 54. The telescoping finger 54 is mounted on the elongated slide 46 that is moved using the lead screw 48 driven by the motor 52 for horizontal motion (A4). The exemplary image acquisition system 14 moves the imaging probe 44 along A4 to make and maintain contact with the breast, and also slides the imaging probe 44 along A5 for tracking the location of the target if the target moves out of the imaging plane (u-v). Collectively, the image acquisition system sensors make up an imaging probe position sensor for sensing a position of the imaging probe 44.
The guide rail 64 is movably mounted on the actuator base 24.
The telescoping finger 66 is mounted on and slidable along the guide rail 64. The telescoping finger 66 extends upward from the guide rail 64, and has an upper end 74 that is extendable in a vertical direction. The telescoping finger 66 defines a telescoping finger vertical axis that is parallel to the central axis. Embedded sensors provide horizontal and vertical positional information of the telescoping finger upper end 74.
The two-axis coupling 68 is supported on the telescoping finger upper end 74, and includes a third pivot joint 78 for allowing rotation about the telescoping finger vertical axis 76, and a fourth pivot joint 80 for allowing rotation about a horizontal axis 80. Embedded sensors provide positional information of the two-axis coupling 68 with respect to the telescoping finger vertical axis 76 and the horizontal axis 80.
The instrument slide 70 is supported by the two-axis coupling 68 and slidably receives the instrument 72. Another embedded sensor provides positional information for the instrument 72 with respect to the instrument slide 70.
The instrument guidance device embedded sensors, collectively, make up an instrument path sensor for sensing a characteristic of the path of the instrument 72.
The instrument guidance device 16 is a five DOF (P9-P13) platform which facilitates positioning and orienting of the surgical instrument insertion direction based on feedback from the imaging probe 44 of the target location and other critical anatomical structures. The instrument guidance device 16 is sufficient for positioning and orienting any object in 3D Euclidean space. Thus, the surgical instrument 72 is capable of reaching any target in the breast.
The guide rail 64 allows adjustment of the distance (P9) of the instrument guidance device 16 from the breast. The telescoping finger 66 facilitates height adjustment (P10). The two-axis coupling 68 provides yaw and pitch motion (P12 and P13) for orientation. The surgical instrument slide 70 is used for insertion of the surgical instrument into the breast along the insertion path direction (P11).
Returning now to
The exemplary system has five actuators, as follows: three actuators (A1, A2, A3) for the manipulation system 12 and two actuators (A4, A5) for the image acquisition system 14. The exemplary system also has 10 potentiometers for measuring joint angles of the various links: three potentiometers measure the extension of the manipulation system linear actuators 32 (A1, A2, A3), five potentiometers measure joint positions of the image acquisition system 14 (A4, A5, P6, P7, P8), and two potentiometers measure joint positions of the surgical instrument guidance device (P12-P13). In one embodiment, a 16 channel, 16-bit analog input board (PCI-DAS6034, Measurement Computing) is used for reading the potentiometer data for input to a control algorithm, and an eight channel, 16-bit analog output board (PCI-DDA08/16, Measurement Computing) is used for controlling the actuators (A1-A5). The exemplary system is preferably mounted on a mobile platform for ease of transportation.
Advantageously, the exemplary system aids the clinician in biopsy and other surgical procedures. The first step is identifying the target through the image from the imaging probe 44. The clinician operates the imaging probe 44 to identify the target on an imaging screen. Once the target is identified and located, the image acquisition system 14 tracks the target. Next, the insertion path of the surgical instrument is manually set by the clinician using the instrument guidance device 16. Once the surgical instrument guidance device is set, the insertion path is automatically determined from the sensor readings. The manipulation system 12 is automatically updated with the target location information as obtained from the image and the insertion path information. Now as the clinician begins to insert the surgical instrument, if the target deflects away from the needle path the manipulation system generates appropriate controlled force to move the target back to the insertion path (by computing the distance between the target and the surgical instrument path, as discussed below).
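The distance between the target and the surgical instrument path mentioned above reduces to the perpendicular distance from a point to a line in 3D. The following sketch is illustrative only; the function name, argument conventions, and coordinate frame are assumptions for this example, not part of the disclosed system:

```python
import numpy as np

def target_to_path_distance(target, entry_point, direction):
    """Perpendicular distance from the target location to the
    instrument's insertion line, given as an entry point and a
    direction vector (need not be normalized)."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)  # unit vector along the insertion path
    v = np.asarray(target, dtype=float) - np.asarray(entry_point, dtype=float)
    # Remove the component of v along the line; what remains is the
    # offset vector from the line to the target.
    perp = v - np.dot(v, d) * d
    return float(np.linalg.norm(perp))
```

A nonzero result indicates the target has deflected off the insertion path, and its magnitude can drive the force applicators' corrective action.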
II. CONTROL FRAMEWORK
During the biopsy procedure, the needle 96 is inserted into the breast 98 at a shallow angle (away from the chest wall) to the horizontal plane containing the target 100. The desired target position is the point where the line of insertion (of the needle 96) intersects the plane containing the target 100. While one can choose any plane that contains the target 100 and has an intersection with the line of needle insertion, the horizontal plane is chosen for simplicity. The desired target position is determined by a planner based on the actual target 100 location and needle 96 direction. The target manipulation system controller 90 minimizes the error between the actual and desired positions of the target 100.
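The planner's desired target position described above is a line-plane intersection. A minimal sketch follows, assuming (purely for illustration) that the horizontal target plane is z = plane_z and that the insertion line is given by an entry point and a direction vector:

```python
import numpy as np

def desired_target_position(entry_point, direction, plane_z):
    """Point where the needle's line of insertion intersects the
    horizontal plane z = plane_z containing the target."""
    p = np.asarray(entry_point, dtype=float)
    d = np.asarray(direction, dtype=float)
    if abs(d[2]) < 1e-12:
        # A shallow but nonzero insertion angle guarantees an intersection.
        raise ValueError("insertion line is parallel to the target plane")
    t = (plane_z - p[2]) / d[2]  # line parameter at the plane crossing
    return p + t * d
```

The manipulation system then applies surface forces so that the actual target position tracks this point.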
Through extensive simulations and experimental testing, it has been determined that PI (proportional-integral) and PID (proportional-integral-derivative) controllers are well suited to the application. The PI control law does not require knowledge of any geometric or mechanical properties of the breast.
The target manipulation system controller 90 may also include derivative feedback (PID control), in which filtered target position data is differentiated to provide the derivative control action. This reduces the settling time for the target 100 position.
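As a concrete illustration, the PI/PID law described above can be sketched as follows. The gains, sample time, and filter coefficient are illustrative assumptions rather than values from the described system, and the one-dimensional integrator plant in the usage portion merely stands in for the tissue dynamics.

```python
class PIDController:
    """Discrete PI/PID law for one axis of target manipulation.

    With kd = 0 this reduces to the PI law, which needs no knowledge of
    the geometric or mechanical properties of the tissue. The derivative
    term differentiates low-pass-filtered position error, as described.
    """

    def __init__(self, kp, ki, kd, dt, alpha=0.8):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.alpha = alpha        # first-order filter pole (assumed value)
        self.integral = 0.0
        self.filtered = None      # filtered error, used for the derivative
        self.prev_filtered = None

    def update(self, error):
        # Low-pass filter the noisy target position error before
        # differentiating, as the text describes for the derivative term.
        self.prev_filtered = self.filtered
        if self.filtered is None:
            self.filtered = error
        else:
            self.filtered = self.alpha * self.filtered + (1.0 - self.alpha) * error
        self.integral += error * self.dt
        if self.prev_filtered is None:
            derivative = 0.0
        else:
            derivative = (self.filtered - self.prev_filtered) / self.dt
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Usage sketch: steer a toy "target position" x toward a 5.0 mm setpoint.
pid = PIDController(kp=2.0, ki=0.5, kd=0.1, dt=0.05)
x, setpoint = 0.0, 5.0
for _ in range(2000):
    u = pid.update(setpoint - x)
    x += 0.05 * u                 # toy integrator plant, not tissue dynamics
```

In the actual system the controller output would command the force applicators rather than update a simulated state.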
B. Low Level Controller for US Imaging
Returning to
The direction of maximum likelihood determined above provides an intelligent starting point for the search for the target 100. It is not possible to accurately predict the direction of target movement due to the complex nature of the interaction dynamics and tissue inhomogeneity within the breast 98. Once the direction of maximum likelihood is determined, the imaging probe vertical drive mechanism 42 (
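The search suggested by this passage might be sketched as an expanding sweep that is biased toward the direction of maximum likelihood. The step size, search range, and the `target_visible` callback below are hypothetical stand-ins for the drive-mechanism commands and the image-processing check.

```python
def expanding_search(target_visible, start, likely_direction, step=1.0, max_range=20.0):
    """Search along the probe travel axis for a target that left the image.

    The probe first moves in the direction of maximum likelihood, then
    alternates sides with growing amplitude until the target is
    re-acquired. Returns the probe position at which the target was
    found, or None if the search range is exhausted.
    """
    offset = step
    while offset <= max_range:
        # Try the likely side first, then the opposite side.
        for side in (likely_direction, -likely_direction):
            position = start + side * offset
            if target_visible(position):
                return position
        offset += step
    return None


# Usage sketch: the target re-enters the image when the probe is within
# 0.5 mm of position 7.0 (hypothetical numbers).
found = expanding_search(lambda p: abs(p - 7.0) <= 0.5,
                         start=0.0, likely_direction=+1)
```

A brute-force scan, as noted below, would simply ignore the likelihood bias and sweep the full range.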
It should be noted that other search algorithms, such as a brute force search algorithm, may be utilized to locate the target 100. As such, other search algorithms would be considered to be within the scope of the invention as claimed.
Due to the high acoustic impedance of air, the imaging probe 44 must be in contact with the breast 98 to image the target 100. Surface deformation due to target manipulation and needle insertion may cause the imaging probe 44 to lose contact with the breast 98. A loss of contact must be detected, and the imaging probe 44 must be moved to reestablish contact with the breast 98. The region of the image close to the edge (where the imaging probe 44 makes contact with the surface) is used to detect the contact state of the imaging probe 44. This region is extracted from the US image, and Otsu's method is used to estimate the grayscale threshold for the region. When the imaging probe 44 is in contact with the surface the threshold is small, and when the imaging probe 44 breaks contact with the surface the threshold increases. This change in the threshold is used to infer the contact state of the imaging probe 44.
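A minimal version of this contact check might look like the following. The region extraction, the 8-bit grayscale assumption, and the decision cutoff are illustrative choices; only the Otsu threshold computation itself follows a fixed definition.

```python
import numpy as np

def otsu_threshold(region):
    """Otsu's method: pick the gray level maximizing between-class variance."""
    hist, _ = np.histogram(region, bins=256, range=(0, 256))
    p = hist.astype(float) / region.size
    omega = np.cumsum(p)                   # class-0 (background) probability
    mu = np.cumsum(p * np.arange(256))     # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def probe_in_contact(edge_region, cutoff=100):
    """Infer the contact state from the threshold of the near-edge region.

    In contact, the region shows low-level tissue speckle and the Otsu
    threshold is small; on loss of contact the threshold increases. The
    cutoff is an assumed tuning parameter, not a value from the text.
    """
    return otsu_threshold(edge_region) < cutoff


# Usage sketch with synthetic 8-bit pixel data (hypothetical):
contact_region = np.tile(np.arange(5, 31), 50)     # dim speckle-like values
air_region = np.tile(np.arange(150, 251), 20)      # bright band after contact loss
```

In the real system `edge_region` would be cropped from each incoming US frame near the transducer face.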
B. High Level Supervisory Controller
Returning to
The high level supervisory controller 94 includes a discrete decision process that is typically a discrete-event system (DES) described by a finite state automaton. The proposed high level supervisory controller 94 makes decisions about the task that is required for robotic assistance during biopsy. The high level supervisory controller 94 and a plant 108 (an emergency stop 110, as well as the target manipulation system controller 90 and the image acquisition system controller 92) cannot communicate directly in a hybrid control system because each utilizes different types of signals. Thus an interface 112 is required which can convert continuous-time signals to sequences of symbols and vice versa. In order for the high level supervisory controller 94 to decide the necessary control actions, the state information from the plant 108 is observed by a process monitoring module through the interface 112. The interface 112 triggers events pertinent to the task and communicates to a process monitoring block of the high level supervisory controller 94. Once these events are triggered, a decision making module of the high level supervisory controller 94 determines what actions need to be taken in response to these events. The high level supervisory controller 94 instructions are sent to the target manipulation system controller 90 and the image acquisition system controller 92 through the interface 112, which then execute the actions. This architecture is flexible and extendable in the sense that new events can be included and detected by simply monitoring the additional state information, and accommodated by introducing new low level controllers.
The high level supervisory controller 94 is a discrete-event system (DES) deterministic finite automaton, which is specified by:
D = (P̃, X̃, R̃, ψ, λ) [22]. (1)
Here P̃ is the set of discrete states. Each event is represented as a plant symbol, where X̃ is the set of such symbols for each discrete state. The next discrete state is activated based on the current discrete state and the associated plant symbol using the transition function ψ: P̃ × X̃ → P̃. To notify the low level controller of the next course of action in the new discrete state, the controller generates a set of symbols, called control symbols and denoted by R̃, using the output function λ: P̃ → R̃. The action of the high level control is described by the following equations:
p̃j[n] = ψ(p̃i[n−1], x̃k[n]), (2)
r̃c[n] = λ(p̃j[n]), (3)
where p̃i, p̃j ∈ P̃, x̃k ∈ X̃, and r̃c ∈ R̃; i and j represent the indices of the discrete states, k and c represent the indices of the plant symbols and control symbols respectively, and n is the time index that specifies the order of the symbols in the sequence.
The high level supervisory controller 94 first detects state information from the plant 108 through the interface 112, and then determines the actions to be taken in response to this information. The state information from the plant 108 can be a continuous signal or a discrete value. Let SMn, SIn and SEn represent the sets of manipulation, imaging and emergency state information respectively. One signal is detected from the manipulation system 12: a) actuator overload (SM1, 0—actuators not overloaded; 1—actuators overloaded). Three signals from the US imaging state information are detected: a) contact condition of the US probe (SI1, 0—contact; 1—no contact); b) presence of the target in the image plane (SI2, 0—target in image plane; 1—target not in image plane); and c) US probe position (SI3). An emergency signal is also monitored to detect when the stop button is pressed (SE1, 0—no emergency; 1—emergency stop). The position of the imaging probe 44 is the only continuous signal; all other signals are discrete values. In this procedure, the following plant states P̃ are defined: manipulation, contact initiation, target tracking and stop. Manipulation (p̃1) implies that target manipulation is activated. Contact initiation (p̃2) means that the US probe is moved to make contact with the surface of the breast. Target tracking (p̃3) means that the search algorithm is activated to locate the target. Stop (p̃4) means that the entire system is disabled.
State information from the plant is monitored to trigger relevant events to modify the task. When these events are triggered, the interface 112 provides the necessary plant symbol (x̃) to the high level supervisory controller 94. Six events are defined for the high level supervisory controller 94. However, the number of events can be easily extended. Events are reset at the beginning of task execution. Additionally, the triggered event is reset when a new event occurs.
The symbol “Limits” is used to represent the motion limits of the range of movement of the imaging probe 44. When the target manipulation force applicators 32a, 32b, 32c are not overloaded, the imaging probe position is within limits and the emergency stop is not detected, event E1 is triggered. When any one of the above conditions is not satisfied, event E2 is triggered. When the US probe breaks/makes contact with the surface, events E3/E6 are triggered. When the target moves out of/into the imaging plane, events E4/E5 are triggered. Plant symbols are designed based on the events shown in Table 1.
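The event-triggering rules in this paragraph can be expressed as a small interface function. The signal names follow the text; the priority given to the fault condition over the contact and tracking events is an assumption, since Table 1 is not reproduced in this excerpt.

```python
def triggered_event(sm1, si1, si2, se1, probe_in_limits, prev_si1, prev_si2):
    """Map monitored plant signals to the triggered event E1-E6.

    sm1: actuator overload (0/1); si1: probe contact (0 contact, 1 none);
    si2: target in image plane (0 in, 1 out); se1: emergency stop (0/1).
    prev_si1/prev_si2 are the previous samples, used to detect the
    break/make and out-of/into-plane transitions.
    """
    if sm1 == 1 or not probe_in_limits or se1 == 1:
        return "E2"        # a nominal-operation condition is not satisfied
    if si1 == 1 and prev_si1 == 0:
        return "E3"        # US probe broke contact with the surface
    if si1 == 0 and prev_si1 == 1:
        return "E6"        # US probe made contact with the surface
    if si2 == 1 and prev_si2 == 0:
        return "E4"        # target moved out of the imaging plane
    if si2 == 0 and prev_si2 == 1:
        return "E5"        # target moved into the imaging plane
    return "E1"            # not overloaded, within limits, no emergency
```

The interface 112 would evaluate such a rule on every sample and hand the resulting plant symbol to the supervisory controller.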
When an event is triggered, the corresponding plant symbol (x̃) is generated by the interface 112. The current state (p̃) and the plant symbol (x̃) are used by the high level supervisory controller 94 to determine the next state. The control mechanism of the proposed high level supervisory controller 94 is shown in
λ(p̃i) = r̃i for i = 1, 2, 3, 4 (4)
Only one state is active at a time and, therefore, a control symbol is uniquely assigned to each state. Any event that generates a plant symbol, along with the current state information, determines the next state and, as a result, the corresponding control symbol. Since the low level assistive controller cannot interpret the control symbols, the interface 112 converts them into enable/disable signals based on the control policy. The control symbols and their corresponding control policies are given in Table 2.
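Equations (2)-(4) can be realized with a table-driven automaton like the sketch below. The state and control-symbol names follow the text, but the specific transition entries are plausible assumptions consistent with the event meanings, since Tables 1 and 2 are not reproduced in this excerpt.

```python
STATES = ("manipulation", "contact_initiation", "target_tracking", "stop")

# Transition function psi: (current state, plant symbol) -> next state.
# These entries are assumptions inferred from the event descriptions.
PSI = {
    ("manipulation", "E3"): "contact_initiation",   # probe broke contact
    ("manipulation", "E4"): "target_tracking",      # target left image plane
    ("contact_initiation", "E6"): "manipulation",   # contact reestablished
    ("target_tracking", "E5"): "manipulation",      # target reacquired
}

# Output function lambda: one control symbol uniquely per state (eq. 4).
LAMBDA = {state: "r%d" % (i + 1) for i, state in enumerate(STATES)}

def step(state, plant_symbol):
    """One supervisory step, per equations (2) and (3)."""
    if plant_symbol == "E2":                  # fault/emergency disables all
        next_state = "stop"
    else:
        # Unlisted (state, symbol) pairs leave the state unchanged.
        next_state = PSI.get((state, plant_symbol), state)
    return next_state, LAMBDA[next_state]
```

The returned control symbol would then be converted by the interface into enable/disable signals for the low level controllers.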
The presently-disclosed subject matter is further illustrated by the following specific but non-limiting examples. The following examples may include compilations of information that are representative of information gathered at various times during the course of development and experimentation related to the presently-disclosed subject matter.
III. EXPERIMENTAL RESULTS
A. Phantom Preparation
A deformable plastic phantom with a stiff inclusion (a plastic insert placed in the phantom to simulate a tumor) is created to test the efficacy of the controller in positioning the inclusion at a desired location. The phantom is a cylindrical structure (radius 60 mm, height 35 mm) made of deformable thermoplastic. The inclusion is a plastic sphere (diameter 14 mm) that is much stiffer than the phantom. The inclusion is the target in the following experiments. The material used to make the phantom consists of two parts: super soft thermoplastic (Plastisol) and softener. The ratio of the two parts determines the elastic properties of the phantom. Increasing the amount of softener decreases the stiffness of the phantom.
where σn is the nominal stress and εn is the nominal strain, and b* and m* are the exponential fit parameters. The Young's modulus is given by
E = b*e^(m*εn).
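The modulus expression follows from the exponential stress-strain fit. Assuming the fit equation (not reproduced in this excerpt) takes the common form shown on the left, differentiating with respect to the nominal strain gives a modulus of the stated form:

```latex
\sigma_n = \frac{b^{*}}{m^{*}}\left(e^{m^{*}\varepsilon_n} - 1\right)
\qquad\Longrightarrow\qquad
E = \frac{d\sigma_n}{d\varepsilon_n} = b^{*}\,e^{m^{*}\varepsilon_n}
```

The assumed fit form is an inference made here from the modulus expression, not a statement from the source.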
The ability to control the position of a target inside the breast, and to dynamically track the location of a target in real time using a 2D US probe, has been investigated using phantoms. An experimental testbed is constructed to test the efficacy of the proposed technique in positioning the target at a desired location by applying forces on the surface of the phantom.
Position feedback of the target is obtained using a 2D US imaging system, a Toshiba ECCO-CEE (model SSA-340A). Image frames from the US system are sent to a computer (1.6 GHz, 2 GB RAM) running an image processing algorithm in Matlab. The image frames are processed to extract position data of the inclusion. The target position data is communicated serially to a microcontroller (Freescale 68HC912B32, 8 MHz clock frequency). The microcontroller outputs this data in a 16-bit parallel format. This data is read by another computer (1.6 GHz, 1 GB RAM) using a data acquisition card (Measurement Computing PCIM-DDA06/16). This computer runs the control algorithm and outputs control signals to power amplifiers that drive the linear actuators. The linear actuators are lead-screw driven with built-in potentiometers. They have a no-load speed of 50 mm/s and a load capacity of 88 N at a 25% duty cycle.
C. Experiments
Experiment 1: To Demonstrate Accuracy of Target Positioning Using External Force on the Surface of the Phantom During Needle Insertion
For the first experiment, the target is initially located at the origin and the needle is inserted along the Y axis. Due to inhomogeneity in the phantom, the target moves away from the needle path during insertion. Linear actuators are positioned along the X axis to steer the target towards the needle path (the Y axis). During this experiment, the force applied by the needle on the phantom is treated as a disturbance to the system. The task of the controller is to position the target on the Y axis. The position of the target along the Y axis is not controlled, since the needle will intersect the target no matter where it is located on the Y axis. In this experiment, a PI controller is used for controlling the position of the target.
Since the movement of the target along the Y direction is small (˜2 mm), the target remains in the US image plane during the entire experiment. The US probe did not break contact with the surface of the phantom, and the emergency stop is not activated during the experiment. Hence, the discrete event plant (
For the second experiment, linear actuators are used to position the target during a needle insertion task. The target is initially located at the origin and the needle is inserted along the Y axis. The US probe is placed over the phantom such that the imaging plane (XZ plane) is orthogonal to the needle insertion direction. During a biopsy procedure, the imaging plane has to be oriented along the needle insertion direction. In this experiment the US probe is placed orthogonally to demonstrate the ability of the system to track the target when it moves out of the imaging plane during a real-time needle insertion procedure.
In this experiment, during needle insertion, the target moves away from the needle path and out of the imaging plane of the US probe. Two linear actuators positioned along the X axis are used to steer the target towards the needle path (the Y axis), similar to Experiment 1. One linear actuator is also used to move the US probe along the Y axis. The task of the controller is to position the target on the Y axis. The position of the target along the Y axis is not controlled, since the needle will intersect the target no matter where it is located on the Y axis. In this experiment, a PID controller is used for controlling the position of the target.
Two observations are made from the results of the above experiment: 1) the PID controller is more effective than the PI controller for target manipulation (20 seconds (
The experimental testbed used in Experiment 1 and Experiment 2 is designed to demonstrate the efficacy of target manipulation and tracking. During Experiments 1 and 2, the US probe did not break contact with the surface of the phantom. During a breast biopsy procedure, however, it is possible that the US probe may break contact due to surface deformation. Hence, the contact state has to be monitored to maintain contact with the breast.
Experiment 3: To Demonstrate Reliability of the Contact Detection Algorithm
In this experiment, target manipulation and tracking are disabled. The position of the phantom is fixed, and the US probe is moved normal to the surface of the phantom.
The system and method described herein have several potential advantages, including: (1) the success rate of the procedure is increased, since the target can be accurately positioned in line with the needle; (2) the entire procedure is fast, making it clinically viable; (3) since the needle is not steered inside the breast, tissue damage is minimized; (4) since multiple insertions are not required, the proposed technique will likely reduce surgeon fatigue and patient discomfort and preserve the structural integrity of the tissue specimen; (5) by improving the accuracy of the procedure, it will potentially enhance the diagnostic outcome by reducing false negatives; and (6) geometric and mechanical properties of the breast are not required for precise positioning of the target.
Throughout this document, various references are mentioned. All such references are incorporated herein by reference, including the references set forth in the following list:
- [1] American Cancer Society, “Cancer Facts & Figures-2007,” http://www.cancer.org/downloads/STT/CAFF2007PWSecured.pdf, accessed on 19 May, 2007.
- [2] J. G. Elmore, K. Armstrong, C. D. Lehman, S. W. Fletcher, "Screening for Breast Cancer," Journal of the American Medical Association, Vol. 293, No. 10, pp. 1245-1256, March 2005.
- [3] O. V. Solberg, F. Lindseth, H. Torp, R. E. Blake, T. A. N. Hernes, “Freehand 3D ultrasound reconstruction algorithms—A review,” Ultrasound in Med. and Biol., Vol. 33, No. 7, pp. 991-1009, 2007.
- [4] S. P. DiMaio, S. E. Salcudean, “Needle insertion modeling and simulation,” IEEE Transactions on Robotics and Automation, Vol. 19, No. 4, October 2003.
- [5] D. Stoianovici, L. Whitcomb, J. Anderson, R. Taylor, and L. Kavoussi, “A modular surgical robotic system for image guided percutaneous procedures,” in Proc. MICCAI, Vol. 1496, LNCS, pp. 404-410, 1998.
- [6] K. Cleary, M. Freedman, M. Clifford, D. Lindisch, S. Onda, and L. Jiang, "Image-guided robotic delivery system for precise placement of therapeutic agents," J. Cont. Release, Vol. 74, No. 1, pp. 363-368, 2001.
- [7] S. Okazawa, R. Ebrahimi, J. Chuang, S. E. Salcudean, R. Rohling, “Hand-held steerable needle device,” IEEE/ASME Transactions on Mechatronics, Vol. 10, pp 285-296, June 2005.
- [8] S. P. DiMaio, S. E. Salcudean, “Needle steering and model-based trajectory planning,” in Proc. MICCAI, Vol. 2878, LNCS, pp. 33-40, 2003.
- [9] D. Glozman, M. Shoham, “Image-guided robotic flexible needle steering,” IEEE Transactions on Robotics, Vol. 23, No. 3, pp 459-467, June 2007.
- [10] M. H. Loser, N. Navab, “A new robotic system for visually controlled percutaneous interventions under CT fluoroscopy”, in Proc. MICCAI, Vol. 1935, LNCS, pp. 887-896, 2000.
- [11] F. S. Azar, D. N. Metaxas, M. D. Schnall, "Methods for modeling and predicting mechanical deformations of the breast under external perturbations," Medical Image Analysis, Vol. 6, pp. 1-27, 2002.
- [12] R. Alterovitz, K. Goldberg, J. Pouliot, R. Taschereau, I-C. Hsu, “Sensorless planning for medical needle insertion procedures,” in Proc. IEEE International Conference on Intelligent Robots and Systems, pp. 3337-3343, October 2003.
- [13] V. Mallapragada, N. Sarkar, T. Podder, “Robot assisted real-time tumor manipulation for breast biopsy,” IEEE International Conference on Robotics and Automation, May 2008 (accepted).
- [14] F. Pierrot, E. Dombre, E. Degoulange, L. Urbain, P. Caron, S. Boudet, J. Gariepy, J. Megnien, “Hippocrate: A safe robot arm for medical applications with force feedback,” Medical Image Analysis, Vol. 3, No. 3, pp. 285-300, 1999.
- [15] K. Masuda, E. Kimura, N. Tateishi, K. Ishihara, “Three-dimensional motion mechanism of ultrasound probe and its application for tele-echography system,” in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1112-1116, 2001.
- [16] A. Vilchis, J. Troccaz, P. Cinquin, K. Masuda, F. Pellisier, “A new robot architecture for tele-echography,” in IEEE Transactions on Robotics and Automation, Vol. 19, No. 5, pp. 922-926, 2003.
- [17] J. Hong, T. Dohi, M. Hashizume, K. Konishi, N. Hata, “An ultrasound driven needle insertion robot for percutaneous cholecystostomy,” Physics in Med. and Biol., Vol. 39, No. 3, 441-455, 2004.
- [18] M. A. Vitrani, G. Morel, T. Ortmaier, “Automatic Guidance of a surgical instrument with ultrasound based visual servoing,” in Proc. IEEE International Conference on Robotics and Automation, pp. 510-515, 2005.
- [19] P. Abolmaesumi, S. E. Salcudean, W. Zhu, M. R. Sirouspour, S. P. DiMaio, “Image-guided control of a robot for medical ultrasound,” IEEE Transactions on Robotics and Automation, Vol. 18, No. 1, February 2002.
- [20] A. Krupa, G. Fichtinger, G. D. Hager, “Full motion tracking in ultrasound using image speckle information and visual servoing,” in Proc. IEEE International Conference on Robotics and Automation, pp. 2458-2464, 2007.
- [21] V. Mallapragada, N. Sarkar, T. Podder, “A Robotic System for Real-time Tumor Manipulation During Image Guided Breast Biopsy,” in Proc. IEEE International Conference on Bioinformatics and Bioengineering, pp. 204-210, October 2007.
- [22] P. J. Antsaklis, X. D. Koutsoukos, “Hybrid systems: review and recent progress,” In Samad, T., editor. Software Enabled Control: Information Technology for Dynamical Systems. NY: Wiley-IEEE; 2003.
- [23] P. J. Antsaklis, “A brief introduction to the theory and applications of hybrid systems,” in Proc. IEEE Special Issue on Hybrid Systems: Theory and Applications, Vol. 88, No. 7, pp. 879-887, 2000.
- [24] P. Wellman, “Tactile imaging”, PhD dissertation, Division of Engineering and Applied Sciences, Harvard University, MA, 1999.
It will be understood that various details of the presently disclosed subject matter can be changed without departing from the scope of the subject matter disclosed herein. Furthermore, the foregoing description is for the purpose of illustration only, and not for the purpose of limitation.
Claims
1. A system for positioning a target located within soft tissue, comprising:
- an instrument configured to be inserted into the soft tissue along a path;
- a manipulation system including a plurality of force applicators configured to be positioned around the soft tissue containing the target;
- an image acquisition system including an imaging probe for obtaining data for generating an image of the soft tissue containing the target;
- detection means for detecting deflection of the target using the data from the imaging probe; and
- a control means for actuating the plurality of force applicators to apply forces on the soft tissue in response to a detected deflection of the target to move the target back in line with the path of the instrument.
2. The system of claim 1,
- wherein the imaging probe is mounted on a drive mechanism for moving the imaging probe in at least two dimensions with respect to the soft tissue; and
- wherein the control means is further for actuating the drive mechanism to move the imaging probe to maintain contact with the soft tissue and to track the target if the target moves out of a field of view of the imaging probe.
3. The system of claim 1, further comprising an instrument guidance device for fixing the path of the instrument.
4. The system of claim 3, wherein the instrument guidance device further has an instrument path sensor for measuring a characteristic of the path of the instrument; and wherein the control means utilizes the characteristic of the path of the instrument in actuating the plurality of force applicators to place the target in line with the path for insertion of the instrument.
5. The system of claim 2, wherein the control means includes a target manipulation system controller for controlling the manipulation system, an image acquisition system controller for controlling the image acquisition system, and a high level supervisory controller for coordinating the target manipulation system controller and the image acquisition system controller based on events.
6. The system of claim 5, wherein the control means further includes an interface between the high level supervisory controller, and the target manipulation system controller and the image acquisition system controller, the interface being for converting between continuous-time signals and sequences of symbols representing tasks.
7. The system of claim 6, wherein the high level supervisory controller further has a process monitoring module for monitoring event information received from the target manipulation system controller and the image acquisition system controller via the interface, and a decision making module for determining tasks to implement in response to the events, and sending control commands to the target manipulation system controller and the image acquisition system controller via the interface.
8. The system of claim 1, wherein the soft tissue is a breast.
9. A method for positioning a target located within soft tissue in a path of an instrument inserted into the soft tissue, comprising:
- identifying the target using an image acquisition system including an imaging probe for obtaining data for generating an image of the soft tissue containing the target;
- positioning the soft tissue containing the target in a manipulation system including a plurality of force applicators positioned around the soft tissue;
- inserting the instrument into the soft tissue;
- detecting deflection of the target using the data from the imaging probe; and
- actuating the plurality of force applicators to apply forces on the soft tissue in response to a detected deflection of the target to move the target back in line with the path of the instrument.
10. The method of claim 9, wherein the imaging probe is mounted on a drive mechanism for moving the imaging probe to maintain contact with the soft tissue and to track the target if the target moves out of a field of view of the imaging probe.
11. The method of claim 10, wherein moving the imaging probe to track the target further includes locating the target by actuating the drive mechanism.
12. The method of claim 11, further comprising:
- detecting a loss of contact between the imaging probe and the soft tissue; and
- actuating the drive mechanism to reestablish contact between the imaging probe and the soft tissue.
13. The method of claim 9, wherein the soft tissue is a breast.
14. A device for positioning a target located within soft tissue, comprising:
- an instrument configured to be inserted into the soft tissue along a path;
- a manipulation system including a plurality of force applicators configured to be positioned around the soft tissue containing the target;
- an image acquisition system including an imaging probe for obtaining data for generating an image of the soft tissue containing the target;
- an instrument guidance device for fixing the path of the instrument;
- detection means for detecting deflection of the target using the data from the imaging probe; and
- control means for actuating the plurality of force applicators to apply forces on the soft tissue in response to detecting a deflection of the target to move the target back in line with the path of the instrument.
15. The device of claim 14,
- wherein the image acquisition system further includes a drive mechanism for moving the imaging probe; and
- wherein the control means is further for actuating the drive mechanism to move the imaging probe to maintain contact with the soft tissue and to track the target if the target moves out of a field of view of the imaging probe.
16. The device of claim 15, wherein the control means includes a manipulation system controller for controlling the manipulation system, an image acquisition system controller for controlling the image acquisition system, and a high level supervisory controller for coordinating the manipulation system controller and the image acquisition system controller based on events.
17. The device of claim 16, wherein each of the plurality of force applicators includes an end effector positioned for engaging a surface of the soft tissue, an end effector actuator for effecting movement to the end effector, and an end effector position sensor for sensing a position of the end effector for use by the control means.
18. The device of claim 17, wherein the drive mechanism for moving the imaging probe moves the imaging probe in at least two dimensions and has an imaging probe position sensor for sensing a position of the imaging probe for use by the control means.
19. The device of claim 18, wherein the instrument guidance device has an instrument path sensor for sensing a characteristic of the path of the instrument for use by the control means.
20. The device of claim 19, wherein the soft tissue is a breast, wherein the imaging probe is an ultrasound imaging probe, and wherein the instrument is a biopsy needle.
- Abolmaesumi, P., S. E. Salcudean, W. Zhu, M. R. Sirouspour, S. P. DiMaio, “Image-guided control of a robot for medical ultrasound,” IEEE Transactions on Robotics and Automation, vol. 18, No. 1, Feb. 2002.
- Alterovitz, R., K. Goldberg, J. Pouliot, R. Taschereau, I-C. Hsu, “Sensorless planning for medical needle insertion procedures,” in Proc. IEEE International Conference on Intelligent Robots and Systems, pp. 3337-3343, Oct. 2003.
- Antsaklis, P. J., “A brief introduction to the theory and applications of hybrid systems,” in Proc. IEEE Special Issue on Hybrid Systems: Theory and Applications, vol. 88, No. 7, pp. 879-887, 2000.
- Antsaklis, P. J., X. D. Koutsoukos, “Hybrid systems: review and recent progress,” In Samad, T., editor. Software Enabled Control: Information Technology for Dynamical Systems. NY: Wiley-IEEE; 2003.
- Azar, F. S., D. N. Metaxas, M. D. Schnall, “Methods for modeling and predicting mechanical deformations of the breast under external perturbations,” Medical image Analysis, vol. 6, pp. 1-27, 2002.
- Azar, F. S., D. N. Metaxas, M. D. Schnall, “A finite model of the breast for predicting mechanical deformations during biopsy procedure,” in IEEE Workshop on Mathematical Methods in Biomedical Image Analysis, 2000, pp. 38-45.
- Cleary, K., M. Freedman, M. Clifford, D. Lindisch, S. Onda, and L. Jiang, “Image-guided robotic delivery system for precise placement of theraputic agents,” J. Cont. Release, vol. 74, No. 1, pp. 363-368,2001.
- DiMaio, S. P., S. E. Salcudean, “Needle insertion modeling and simulation,” IEEE Transactions on Robotics and Automation, vol. 19, No. 4, Oct. 2003.
- DiMaio, S. P., S. E. Salcudean, “Needle steering and model-based trajectory planning,” in Proc. MICCAI, vol. 2878, LNCS, pp. 33-40, 2003.
- Elmore, J. G., K. Armstrong, C. D. Lehman, S. W. Fletcher, “Screening for Breast Cancer,” Journal of American Medical Association, vol. 293, No. 10, pp. 1245-1256, Mar. 2005.
- Glozman, D., M. Shoham, “Image-guided robotic flexible needle steering,” IEEE Transactions on Robotics, vol. 23, No. 3, pp. 459-467, Jun. 2007.
- Hong, J., T. Dohi, M. Hashizume, K. Konishi, N. Hata, “An ultrasound driven needle insertion robot for percutaneous cholecystostomy,” Physics in Med. and Biol., vol. 39, No. 3, 441-455, 2004.
- Krupa, A., G. Fichtinger, G. D. Hager, “Full motion tracking in ultrasound using image speckle information and visual servoing,” in Proc. IEEE International Conference on Robotics and Automation, pp. 2458-2464, 2007.
- Loser, M. H., N. Navab, “A new robotic system for visually controlled percutaneous interventions under CT fluoroscopy”, in Proc. MICCAI, vol. 1935, LNCS, pp. 887-896, 2000.
- Mallapragada, V., N. Sarkar, T. Podder, “A Robotic System for Real-time Tumor Manipulation During Image Guided Breast Biopsy,” in Proc. IEEE International Conference on Bioinformatics and Bioengineering, pp. 204-210, Oct. 2007.
- Mallapragada, V., N. Sarkar, T. Podder, “Robot assisted real-time tumor manipulation for breast biopsy,” IEEE Transactions on Robotics; Apr. 2009; 25:2; pp. 316-324.
- Mallapragada, V., N. Sarkar, T. Podder, “Autonomous Coordination of imaging and Tumor manipulation for Robot Assisted Breast Biopsy” Proc of 2nd Biennial IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics Scottsdale AZ; Oct. 19-22, 2008.
- McClamroch, N. H., “Displacement control of flexible structures using electrohydraulic servo-actuators,” Journal of Dynamic Systems, Measurement and Control, vol. 107, pp. 34-39, Mar. 1985.
- Okazawa, S., R. Ebrahimi, J. Chuang, S. E. Salcudean, R. Rohling, “Hand-held steerable needle device,” IEEE/ASME Transactions on Mechatronics, vol. 10, pp. 285-296, Jun. 2005.
- Olsson, H., K. J. Astrom, “Friction generated limit cycles”, IEEE Transactions on Control Systems Technology, vol. 9, pp. 629-636, Jul. 2001.
- Patriciu, A., S. Solomon, L. Kavoussi, and D. Stoianovici, “Robotic kidney and spine percutaneous procedures using a new laser-based CT registration method,” in Proc. MICCAI, vol. 2208, LNCS, 2001, pp. 249-257.
- Pierrot, F., E. Dombre, E. Degoulange, L. Urbain, P. Caron, S. Boudet, J. Gariepy, J. Megnien, “Hippocrate: A safe robot arm for medical applications with force feedback,” Medical Image Analysis, vol. 3, No. 3, pp. 285-300, 1999.
- Smith, W. L., K. J. M. Surry, G. R. Mills, D. B. Downy, and A. Fenster, “Three-dimensional ultrasound-guided core needle breast biopsy,” Ultrasound Med. Biol., vol. 27, No. 8, pp. 1025-1034, 2001.
- Solberg, O. V., F. Lindseth, H. Torp, R. E. Blake, T. A. N. Hernes, “Freehand 3D ultrasound reconstruction algorithms—A review,” Ultrasound in Med. and Biol., vol. 33, No. 7, pp. 991-1009, 2007.
- Stoianovici, D., L. Whitcomb, J. Anderson, R. Taylor, and L. Kavoussi, “A modular surgical robotic system for image guided percutaneous procedures,” in Proc. MICCAI, vol. 1496, LNCS, pp. 404-410, 1998.
- Stoianovici, D. et al., “A novel mechanical transmission applied to percutaneous renal access,” in Proc. ASME Dynamic Systems Control Division, DSC, vol. 61, 2001, pp. 401-406.
- Surampudi, B., G. Langari, “A fuzzy cancellation of stick-slip in DC motors”, Third International Conference on Industrial Fuzzy Control and Intelligent Systems, pp. 24-20, Dec. 1993.
- Tsekos, N. V., J. Shudy, E. Yacoub, P. V. Tsekos, I. G. Koutlas, “Development of a robotic device for MRI-guided interventions in the breast,” in Proc. Bioinformatics and Bioengineering Conference, 2001, pp. 201-208.
- Vilchis, A., J. Troccaz, P. Cinquin, K. Masuda, F. Pellisier, “A new robot architecture for tele-echography,” in IEEE Transactions on Robotics and Automation, vol. 19, No. 5, pp. 922-926, 2003.
- Vitrani, M. A., G. Morel, T. Ortmaier, “Automatic Guidance of a surgical instrument with ultrasound based visual servoing,” in Proc. IEEE International Conference on Robotics and Automation, pp. 510-515, 2005.
- Wada, T., S. Hirai, S. Kawamura, N. Kamiji, “Robust manipulation of deformable objects by a simple PID feedback,” in Proc. International Conference on Robotics and Automation, 2001, pp. 85-90.
- Wellman, P., “Tactile imaging”, PhD dissertation, Division of Engineering and Applied Sciences, Harvard University, MA, 1999.
- Wunderbaldinger, P., T. H. Helbich, B. Partik, K. Turetschek, and G. Wolf, “First experience with a new dedicated ultrasound system for computer-guided large-core breast biopsy,” Eur. Radiol., vol. 11, No. 12, pp. 2460-2464, Dec. 2001.
- Masuda, K., E. Kimura, N. Tateishi, K. Ishihara, “Three-dimensional motion mechanism of ultrasound probe and its application for tele-echography system,” in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1112-1116, 2001.
- Mallapragada, V., N. Sarkar, T. Podder, “Robot-Assisted Real-time Tumor Manipulation for Breast Biopsy,” in Proc. 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, May 19-23, 2008.
- Mallapragada, V., N. Sarkar, T. Podder, “Robot System for Tumor Manipulation and Ultrasound Image Guidance During Breast Biopsy,” in Proc. 30th Annual International IEEE EMBS Conference, Vancouver, BC, Canada, Aug. 20-24, 2008.
- Mallapragada, V., “Towards a Robot Assisted Breast Biopsy System,” PhD dissertation.
Type: Grant
Filed: May 19, 2008
Date of Patent: Nov 29, 2011
Patent Publication Number: 20080287827
Assignee: Vanderbilt University (Nashville, TN)
Inventors: Nilanjan Sarkar (Brentwood, TN), Tarun Kanti Podder (Rochester, NY), Vishnu Mallapragada (Nashville, TN)
Primary Examiner: Long V. Le
Assistant Examiner: Nicholas Evoy
Attorney: Stites & Harbison PLLC
Application Number: 12/123,247
International Classification: A61B 8/00 (20060101);