Patient positioning assembly for therapeutic radiation system
An apparatus and method for moving a support device with respect to a radiation source in at least three degrees of freedom to align a treatment target with respect to the radiation source.
This is a continuation application of U.S. application Ser. No. 10/687,860, filed on Oct. 17, 2003, which is hereby incorporated by reference.
TECHNICAL FIELD

The present invention relates to a patient positioning assembly for therapeutic radiation systems.
BACKGROUND

The term radiosurgery refers to a procedure in which intense and precisely directed doses of radiation are delivered to a target region in a patient, in order to destroy tumorous cells or otherwise treat the target region. The term radiotherapy refers to a procedure in which radiation is applied to a target region for therapeutic, rather than necrotic, purposes. The amount of radiation utilized in radiotherapy is typically about an order of magnitude smaller, as compared to the amount used in radiosurgery. Radiotherapy is frequently used to treat early stage, curable cancers. For convenience, the term “radiosurgery” in this application shall henceforth mean “radiosurgery and/or radiotherapy.”
In radiosurgery, it is necessary to determine with precision the location of the target region (and surrounding critical structures) relative to the reference frame of the treatment device. It is also necessary to control the position of the radiation source so that its beam can be precisely directed to the target tissue while avoiding surrounding healthy tissue, with control of propagation in and through other body structures.
To effect such beam position control, a frameless stereotactic radiosurgery system has been developed, which implements image-guided radiosurgery using a robot. An image-guided robotic system provides the requisite beam position control for accurate delivery of therapeutic radiation, while eliminating the need for rigid stereotactic frames. Such image-guided robotic systems typically include a treatment beam generator, for example an x-ray source, mounted onto a robot, and a controller. The x-ray source provides precisely shaped and timed radiation beams. Using pre-treatment scan data, as well as treatment planning and delivery software, the controller acquires information regarding the pre-treatment position and orientation of the treatment target region. The patient is usually placed on a support device, such as a couch or a table. During treatment, an imaging system repeatedly measures the position and orientation of the target relative to the x-ray source. Prior to the delivery of radiation at each delivery site, the controller directs the robot to adjust the position and orientation of the x-ray source, in accordance with the measurements made by imaging system, so that the requisite dose of the treatment beam can be applied to the treatment target within the patient.
While operating these image-guided robotic systems, it is necessary to adjust the position and orientation of the patient in order to ensure that the target within the patient remains properly aligned with respect to the treatment beam. The position and orientation of the patient must be corrected, for example, in order to compensate for any motion (such as respiratory motion, sneezing, or shifting) that the patient may undergo during treatment.
Accordingly, it is desirable to provide a patient positioning assembly that includes a dynamic motion control mechanism for controlling the motion of the support device, so that the position and orientation of the support device can be adjusted as necessary.
SUMMARY OF INVENTION

A method is provided for moving a support device with respect to a radiation source in at least three degrees of freedom to align a treatment target with respect to the radiation source. The support device may be used in a patient positioning assembly used in connection with therapeutic radiation systems that include the radiation source. The patient positioning assembly may make an initial correction of the support device to align the treatment target with respect to the radiation source, and then take one or more additional images of the treatment target. The patient positioning assembly performs one or more additional corrections based on the one or more additional images to align the treatment target with respect to the radiation source. Taking additional images and performing one or more additional corrections may be repeated until a specified number of images has been taken or the residual corrections to align the treatment target with respect to the radiation source fall below a specified limit.
BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
A patient positioning assembly is provided for adjusting patient position and orientation during therapeutic radiation treatment. In one embodiment, the patient positioning assembly is used with a frameless, image-guided radiosurgery system, such as the CyberKnife system developed by Accuray, Inc., although in other embodiments the patient positioning assembly can be used with other types of therapeutic radiation systems.
The imaging system generates, in near real time, x-ray images showing the position and orientation of the target in a treatment coordinate frame. The controller 18 contains treatment planning and delivery software, which is responsive to pre-treatment scan data and user input, to generate a treatment plan consisting of a succession of desired beam paths, each having an associated dose rate and duration at each of a fixed set of nodes. In response to the controller's directions, the robot 12 moves and orients the x-ray linac 14, successively and sequentially through each of the nodes, while the x-ray linac 14 delivers the required dose as directed by the controller 18. The pre-treatment scan data may include, for example, CT scan data, MRI scan data, PET scan data, and ultrasound scan data.
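For illustration only, the node-based plan described above can be represented with a simple data structure. The following Python sketch assumes hypothetical field names and units; the actual plan format used by the treatment planning and delivery software is not specified here.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TreatmentNode:
    """One node of the plan: a fixed x-ray source position with an associated beam."""
    position: Tuple[float, float, float]    # source location in treatment coordinates (mm)
    direction: Tuple[float, float, float]   # unit vector of the beam toward the target
    dose_rate: float                        # dose rate while the beam is on (illustrative units)
    duration: float                         # beam-on time at this node (s)

@dataclass
class TreatmentPlan:
    """Succession of desired beam paths, one entry per node, visited in order."""
    nodes: List[TreatmentNode]

# A two-node plan, for illustration only.
plan = TreatmentPlan(nodes=[
    TreatmentNode((800.0, 0.0, 300.0), (-0.94, 0.0, -0.34), dose_rate=600.0, duration=12.5),
    TreatmentNode((600.0, 400.0, 500.0), (-0.63, -0.42, -0.65), dose_rate=600.0, duration=9.0),
])
```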
Prior to performing a treatment on a patient with the CyberKnife, the patient's position and orientation within the frame of reference established by the CyberKnife's x-ray imaging system 16 must be adjusted to match the position and orientation that the patient had within the frame of reference of the CT (or MRI or PET) scanner that provided the images used for planning the treatment. It is desirable that this alignment be performed to within tenths of a millimeter and tenths of a degree for all six degrees of freedom.
In one embodiment, the controller 130 controls the motion of the x-ray source, in addition to the motion of the support device, as described further below.
In one embodiment, the patient positioning assembly 100 further includes at least one user interface 140, including one or more user interface units that enable a user or operator to interactively participate in controlling the motion of the support device.
In the illustrated embodiment, the support device 110 is a treatment table, although in other embodiments other types of support devices (such as a chair or bench) may be used. A communications link (not illustrated) between the controller 130 and the table 110 enables communications between the table 110 and the controller 130. The communications link can be a wired link or a wireless link, with a bandwidth necessary for maintaining reliable and timely communications.
One or more table position sensors 150 are provided to sense the position of the table 110. One or more table motion actuators 160 are provided for moving the table, in accordance with directions from the controller 130. A table interface module 120 allows the table to interface with the sensors 150, the actuators 160, the controller 130, and the user interface 140. In the illustrated embodiment, the table interface module 120 is an electronics module embedded within the table. The table interface module 120 manages communications between the table 110, the user interface 140, and the controller 130, accepting motion commands and providing position feedback and other status messages. The electronics module 120 can independently check table positions against a model of surrounding obstructions, to ensure that the table does not collide with any obstacles during table motion. The module 120 could be a retrofit item or be a functionality designed as part of the table's original design requirements.
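As a minimal sketch of the collision check described above, the following Python fragment models obstructions as axis-aligned boxes and rejects a commanded move whose table envelope would intersect one. The box model, corner list, and numeric values are hypothetical; the actual obstruction model used by the electronics module 120 is not described here.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class Box:
    """Axis-aligned box used as a simple stand-in for an obstruction model."""
    min_corner: Point
    max_corner: Point

    def contains(self, p: Point) -> bool:
        return all(lo <= c <= hi for c, lo, hi in zip(p, self.min_corner, self.max_corner))

def motion_is_clear(table_corners: List[Point], obstructions: List[Box]) -> bool:
    """Return False if any corner of the table envelope would lie inside an obstruction."""
    return not any(box.contains(corner) for corner in table_corners for box in obstructions)

# Hypothetical check before a move is commanded: reject the command if it would collide.
obstructions = [Box((1200.0, -500.0, 0.0), (1500.0, 500.0, 2000.0))]   # e.g. an imaging stand
candidate_corners = [(1250.0, 0.0, 900.0), (250.0, 0.0, 900.0)]        # table envelope corners
if not motion_is_clear(candidate_corners, obstructions):
    print("Motion rejected: table envelope would intersect an obstruction")
```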
The controller 130 includes an input module 231 for receiving 1) pre-treatment scan data representative of pre-treatment scans of the target, and 2) near real time image data representative of near real time images of the target. The pre-treatment scans show the position and orientation of the target with respect to the pre-treatment coordinate system. The near real-time images, taken by the imaging system under the command of the controller, show the position and orientation of the target with respect to the treatment coordinate system. The treatment coordinate system and the pre-treatment coordinate system are related by known transformation parameters. The controller includes a TLS (target location system) processing unit that computes the position and orientation of the target in the treatment coordinate system, using the pre-treatment scan data, the near real time image data, and the transformation parameters between the pre-treatment coordinate system and the treatment coordinate system.
The controller 130 includes a comparator 232, or other software for comparing the position and orientation of the target, as shown in the near real-time image data, with the position and orientation of the target as shown in the pre-treatment scan data. The controller computes the amount of translations (in three degrees of freedom) and rotations (in three degrees of freedom) that are required in order for the position and orientation of the target, as shown in the near real time images, to substantially match the position and orientation of the target, as shown in the pre-treatment scans. The controller 130 includes software for converting this information into one or more units of motion of the table, in at least three degrees of freedom, and preferably in five or six degrees of freedom. The controller 130 includes a signal generator 233, or other software for generating at least one motion command signal for implementing corrective motions of the table, which align the treatment target within the patient with respect to the radiosurgery system in such a way that the position and orientation of the target, as shown in the near real time images generated by the imaging system, substantially match the position and orientation of the target as shown in the pre-treatment scans.
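The comparison and conversion steps described above can be illustrated with a small sketch. Assuming, for illustration, that the target pose is expressed as three translations and three small rotations in a common frame, the per-axis corrective motion is simply the difference between the planned (pre-treatment) pose and the measured (near real-time) pose.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Target position (mm) and orientation (degrees) in a common coordinate frame."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

def corrective_motion(planned: Pose, measured: Pose) -> Pose:
    """Per-axis correction that moves the measured target pose back onto the planned pose.

    A small-angle, per-axis approximation, matching the translation and rotation
    fields that are displayed to the operator.
    """
    return Pose(
        x=planned.x - measured.x,
        y=planned.y - measured.y,
        z=planned.z - measured.z,
        roll=planned.roll - measured.roll,
        pitch=planned.pitch - measured.pitch,
        yaw=planned.yaw - measured.yaw,
    )

# Example: the target sits 2.3 mm to one side and 0.4 degrees off in pitch.
delta = corrective_motion(Pose(0, 0, 0, 0, 0, 0), Pose(-2.3, 0.1, 0.0, 0.0, 0.4, 0.0))
print(delta)  # prints the per-axis table correction
```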
In one embodiment, the controller 130 controls the motion of the x-ray source 14, as well as the motion of the table 110. In other words, the controller 130 controls the relative motion of the table 110, with respect to the robot-implemented motion of the x-ray source 14. In this way, the corrective motions of the table, implemented by the motion command signal from the controller 130, compensate for one or more motions of the x-ray source implemented by the robot.
This feature is useful, for instance, when there are practical limits on where the robot can place the x-ray source, and how the robot can implement the requisite movements of the x-ray linac, in order to deliver a desired radiation pattern to the target. There may be restrictions on where the x-ray source can move to, because of the configuration of the radiosurgery system. In such cases, the patient positioning assembly 100 is capable of dynamically controlling the motion of the support device 110, so as to implement any trade-off motions that are necessary for correctly aligning the patient relative to the treatment beam, and for delivering the correct radiation pattern to the target. In one embodiment, the combination of the motions of the table 110 and the motions of the x-ray linac 14, are dynamically coordinated and controlled, so as to maximize the workspace available to the radiosurgery system.
In one embodiment, the corrective motions of the table 110, implemented by the motion command signals generated by the controller 130, compensate for various patient motions that the patient may undergo, during treatment. These patient motions may include, but are not limited to, the following: respiratory motion; cardiac pumping motion of the patient's heart; sneezing, coughing, or hiccuping; and muscular shifting of one or more anatomical members of the patient.
The table 110 is capable of motion in at least three degrees of freedom, namely three translational degrees of freedom (x-, y-, and z-). Preferably, the table is capable of motion in all six degrees of freedom, namely three translational degrees of freedom plus three rotational degrees of freedom (roll-, pitch-, and yaw-rotations). The motion command signal, generated by the controller, thus controls corrective motions of the table in at least three, and preferably five or six, degrees of freedom.
In one embodiment, the table 110 is capable of motion in five degrees of freedom (three translational, plus roll- and pitch-rotations), and an external device such as a robot is used for correcting for yaw-rotations. In this embodiment (not illustrated), the controller includes software for performing a number of functionalities for correcting yaw rotation, using the robot. These functionalities include TLS functionality to verify if any workspace (reachability, patient proximity, or image blocking) errors are encountered when attempting to correct for yaw over the maximum range of yaw to be corrected. The software zeroes out any node for which the yaw correction can lead to a workspace error. Also included is robot functionality to utilize the rotations reported by the target locating system and to correct for them.
The software includes a functionality for standardizing the robot motion from node to node by moving with no corrections, such that robot motion errors are minimized during treatment delivery. The correction of yaw error can be extended to the other two rotations, roll and pitch, with greater reduction in workspace. In one embodiment, the external device (robot) may correct for any of the 5 degrees of freedom, as well as for the sixth degree of freedom.
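A minimal sketch of the node-zeroing step described above is given below. The predicate `has_workspace_error` is a hypothetical stand-in for the TLS workspace check (reachability, patient proximity, or image blocking); sampling the adjustable yaw range at a few points is an illustrative choice, not the actual algorithm.

```python
from typing import Callable, List

def zero_unreachable_nodes(
    node_doses: List[float],
    yaw_range_deg: float,
    has_workspace_error: Callable[[int, float], bool],
    samples: int = 5,
) -> List[float]:
    """Zero the dose at any node for which correcting yaw somewhere in the adjustable
    range would cause a workspace error (reachability, patient proximity, or image
    blocking). `has_workspace_error(node_index, yaw_deg)` is a hypothetical predicate
    standing in for the workspace check described above."""
    doses = list(node_doses)
    step = (2 * yaw_range_deg) / (samples - 1)
    for i in range(len(doses)):
        test_yaws = [-yaw_range_deg + k * step for k in range(samples)]
        if any(has_workspace_error(i, yaw) for yaw in test_yaws):
            doses[i] = 0.0  # this node cannot safely deliver dose over the full yaw range
    return doses
```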
The controller 130 includes software for establishing and maintaining a reliable communication interface with the table over a bus interface, for example an Ethernet-to-Serial-to-CAN (Controller Area Network) bus interface. The software uses the interface specifications developed for the table 110. The controller 130 further includes software for converting the patient position and orientation information from the imaging system into appropriate units of movement within the five degrees of freedom of motion capability of the table. The algorithms are scalable to use all six degrees of freedom. The controller further includes software for providing a user interface on the CyberKnife user control console, to inspect and initiate the table motion to position the patient. The controller further includes software for detecting, reporting and handling errors in communication or software control of the table.
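The unit-conversion step can be sketched as follows. The scale factors, field names, and the dropping of yaw (which, in the five-degree-of-freedom embodiment, is corrected by the robot rather than by the table) are illustrative assumptions; the framing of the resulting values into CAN messages over the Ethernet-to-Serial-to-CAN bridge is not shown.

```python
from dataclasses import dataclass

@dataclass
class Correction:
    x_mm: float
    y_mm: float
    z_mm: float
    roll_deg: float
    pitch_deg: float
    yaw_deg: float

# Hypothetical scale factors from physical units to table actuator counts.
COUNTS_PER_MM = 100
COUNTS_PER_DEG = 500

def to_table_units(c: Correction) -> dict:
    """Convert a six-degree-of-freedom correction into the five degrees of freedom
    supported by the table; yaw is omitted because it is corrected by the robot."""
    return {
        "x": round(c.x_mm * COUNTS_PER_MM),
        "y": round(c.y_mm * COUNTS_PER_MM),
        "z": round(c.z_mm * COUNTS_PER_MM),
        "roll": round(c.roll_deg * COUNTS_PER_DEG),
        "pitch": round(c.pitch_deg * COUNTS_PER_DEG),
    }

# The resulting values would then be framed into CAN messages and sent over the
# Ethernet-to-Serial-to-CAN bridge; message framing is outside this sketch.
print(to_table_units(Correction(2.3, -0.1, 0.0, 0.0, -0.4, 1.2)))
```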
The user interface 140 effects computer control of the three, five or six degrees of freedom of the table 110. In a particular embodiment, the user interface 140 includes: a bus interface for connecting the table to the CyberKnife primary workstation; at least one user interface unit for allowing the user to interface with the controller to interactively control the table motion; and a hardware interface to the CyberKnife E-stop (emergency stop) circuitry. The bus interface may be a CAN bus interface that can be connected to the CyberKnife primary workstation using Ethernet-to-Serial-to-CAN bus converters. The user interface unit can be a secondary unit on such a CAN bus. The hardware interface to the E-stop circuitry disables the computer-controlled table motions when any E-stop is engaged.
The E-stop mechanism is operable to stop computer-controlled motion of the table 110. In one embodiment, the “System E-stop” is an emergency lockout mechanism, capable of shutting down any and all radiation, and any and all motion. In other words, the “System E-stop” shuts down at least the following: 1) generation of therapeutic x-ray beams by the x-ray source; 2) any motion of the x-ray source and/or the robot; 3) any motion of the table; and 4) the imaging system.
The user interface allows the user or operator to interactively participate in controlling the motion of the table, by implementing one or more user-selectable functions. These user-selectable functions include, but are not limited to, the following:
1) a function that allows the user to activate the x-ray imaging system, so that the acquisition of near real time images of the target can be initiated;
2) a function for allowing the user to move the table to a pre-programmed “HOME” position, which corresponds to a mounting position that facilitates the mounting of the patient onto the table;
3) a function for allowing the user to move the table to a pre-programmed “TREAT” position, which is the default treatment position;
4) a function for displaying to the user the three translations and two rotations corresponding to the table corrective motions needed to adjust the target position, in accordance with the information from the near real time images;
5) a function for allowing the user to compare the translations and rotations with respective pre-specified limits for each translation and rotation;
6) a function for allowing the user to modify one or more of the pre-specified limits; and
7) a function for allowing the user to verify that the translations and rotations fall below the pre-specified limits, and thereupon activate the x-ray source to initiate treatment delivery (a sketch of this limit check follows the list).
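As referenced in item 7) above, the limit check that gates treatment delivery can be sketched as follows; the limit values and axis names are hypothetical placeholders for the pre-specified limits loaded from the system configuration.

```python
from typing import Dict

# Hypothetical per-axis limits (mm for translations, degrees for rotations).
DEFAULT_LIMITS = {"x": 1.0, "y": 1.0, "z": 1.0, "roll": 1.0, "pitch": 1.0}

def within_limits(correction: Dict[str, float],
                  limits: Dict[str, float] = DEFAULT_LIMITS) -> bool:
    """True when every residual translation and rotation is below its limit, which
    is the condition for allowing treatment delivery to be initiated."""
    return all(abs(correction[axis]) < limits[axis] for axis in limits)

residual = {"x": 0.3, "y": 0.2, "z": 0.1, "roll": 0.2, "pitch": 0.1}
if within_limits(residual):
    print("Residual corrections below limits: treatment delivery may be initiated")
else:
    print("Residual corrections exceed limits: further table motion is required")
```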
In a particular embodiment, the user interface unit is a remote control unit that provides a user with remote control capabilities for remote control of the motion of the support device 110.
In the illustrated embodiment, the handheld remote control unit 200 includes seven motion switches: five sets of axes motion control switches 210A-210E, a home switch 220, and a treat switch 230. The axes motion control switches provide bi-directional manual control of each degree of freedom via a pushbutton. The axes motion control switches cause movement of the desired axes (three translational axes: left/right (210A), posterior/anterior (210B), inferior (towards the feet)/superior (towards the head) (210C); two rotational axes: roll left/right (210D); head down/up (210E)) in the desired direction, as long as the switch is held down and motion is enabled. The home switch 220 initiates a programmed motion, if motion is enabled, that causes the table to automatically move to the fully retracted, fully lowered, and centered position without any further operator action. The treat switch 230 initiates a programmed motion, if motion is enabled, that causes the table to move to a position defined by the treatment computer and previously downloaded to the table.
The remote control unit 200 also includes a pair of status indicators 240 and 242, which are LEDs (light emitting diodes) that provide an indication of whether motions are enabled and being accepted. In the illustrated embodiment, the E-stop LED 240 is yellow when System E-stop is asserted, green when overridden by bypass switches, and off when no System E-stop is asserted. The MOVE LED 242 is green whenever a switch is pushed and motion is enabled, flashing green when a programmed movement is occurring, and yellow when the table E-stop is engaged.
The remote control unit 200 also includes a pair of motion enable switches 250. Depressing both switches enables all motion switches (axes motion control, home and treat), and overrides the System E-stop, if present, although it does not override table E-stop switches. Releasing one or both of the switches while a programmed motion is occurring will cause that motion to stop.
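The switch logic described above can be summarized in a small sketch: motion is permitted only while both enable switches are held, holding them overrides a System E-stop, and a table E-stop always blocks motion. This is an illustrative reading of the behavior described above, not the actual remote-control firmware.

```python
def motion_permitted(enable_a: bool, enable_b: bool,
                     system_estop: bool, table_estop: bool) -> bool:
    """Motion is permitted only while both enable switches are held; holding them
    overrides a System E-stop but never a table E-stop."""
    if table_estop:
        return False            # a table E-stop always blocks motion
    if not (enable_a and enable_b):
        return False            # both enable switches must be held
    return True                 # System E-stop, if asserted, is bypassed while both are held

# System E-stop asserted, operator holding both enable switches: jogging is allowed.
print(motion_permitted(True, True, system_estop=True, table_estop=False))   # True
# Table E-stop engaged: no motion regardless of the enable switches.
print(motion_permitted(True, True, system_estop=False, table_estop=True))   # False
```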
The remote control unit 200 may also include a Goto switch (not shown), allowing the user to access stored locations. The remote control unit 200 may also include display capabilities (not shown), for example to display to the user the three translations and two rotations, or to display informational messages to the user. The remote control unit 200 may also include absolute and relative position display/input modes (not shown).
One or more user interface screens on the user control console of the primary CyberKnife workstation allow the user to inspect, initiate, and interactively control the table motion to position the patient.
In the illustrated embodiment, an ALIGN COUCH button in the treatment delivery screen 400 launches the user interface screen 300. The user interface screen 300 includes a number of fields, with different functions. These fields include translation and rotation fields, which are initially filled with the table corrective motions returned by the TLS unit of the controller. If no valid table corrective motions are available, these fields are left blank. The translation and rotation fields are editable.
In the illustrated embodiment, the user interface screen 300 includes a “MOVE” button 310, an “AUTO ALIGN” button 320, and a “CANCEL” button 330. The “MOVE” button 310 moves the table by the amount of translations and rotations indicated. If the “Apply rotation” field is unchecked, the table is moved only in translational units. The “AUTO ALIGN” button 320 initially moves the table by the amount of translations and rotations indicated, and then proceeds to acquire images and correct table positions automatically until pre-specified “Auto align limits” are satisfied, that is, until the translations and rotations fall below the pre-specified limits, or until the indicated number of images has been taken. The “Auto align limits” fields are filled in from a system configuration file, but can be edited. The “CANCEL” button 330 returns the user to the Patient Alignment interface.
In one embodiment, the user interface screen 300 includes button icons that allow the user to adjust imaging parameters, such as the intensity, energy, and spectral distribution of the x-rays in the imaging beams generated by the imaging system; the number of near real time images to be acquired; the selection and de-selection of fiducials; and rigid body parameters.
In operation, an approximate treatment location for the patient is computed, as part of the treatment planning process. When the treatment plan is loaded into the controller, the approximate treatment location is downloaded into the treatment table. The operator positions the patient on the table, and applies any restraining devices. The operator then presses the “Treat” button on the handheld user interface unit 200, which initiates a programmed motion that moves the table to the previously downloaded treatment position.
The operator then exits the treatment room and, using the user interface screen 300 on the user control console, initiates image acquisition and alignment of the treatment target, as described in the stages below.
After obtaining a satisfactory alignment, the radiosurgery system is commanded to begin treatment. As part of the treatment, near real time images are obtained periodically by the imaging system, to ensure that the patient doesn't move during the treatment. If the patient does move, the operator can cause treatment delivery to be paused, and the patient to be realigned, by effecting appropriate corrective motions of the table. At the conclusion of the treatment, the operator reenters the treatment room and uses the “Home” button on the handheld user interface unit to return the table to the position for patient unloading. Alternatively, the system could issue the Home command from the computer screen.
Following is a more detailed description of the operation of the patient positioning assembly described above. During the initial treatment planning phase, the treatment planning system checks for the workspace-related issues that result from the attempt to correct for the patient yaw over the adjustable range. The adjustable range of yaw is specified in a data file. Any radiation nodes that encounter a workspace-related issue as a result of attempting to correct patient yaw over the adjustable range have the dose set to zero.
The next stage is the initial patient set-up stage. During this stage, the treatment planning files are downloaded, prior to patient entry into the treatment room. During the download of treatment files, the treatment position of the table is downloaded into the controller and then to the table interface module. The treatment position of the table is one of: a) a default table position for the beam path set selected; and b) a treatment position for the patient, the last time the same plan was used. Before the patient walks into the treatment room, the “HOME” button on the handheld remote control unit is pressed, so as to position the table in a pre-defined comfortable position for the patient to get onto the table. The patient is then immobilized, for example using aqua masks and/or other immobilization devices.
The “TREAT” key on the handheld remote control unit is used to position the table to the nominal treatment position. For head treatments, or if this is a second or subsequent treatment for the patient with the same plan, the nominal treatment position is adequate for further automatic positioning, and the operator can proceed to the user control console for automatic positioning of the patient. Otherwise, the table is further manually adjusted, using the handheld remote control unit, so that the anatomical target region of interest is within the imaging field of view. The operator then proceeds to the user control console, for automatic positioning of the patient.
The next stage is the initial image acquisition stage. During this stage, the operator acquires images using the ACQUIRE button on the patient alignment screen of the user interface screen 300.
The next stage is the one-time table alignment stage. The user selects the “AUTO COUCH” button on the patient alignment screen. This brings up a Couch Adjustment interface screen, which contains the initial corrections obtained from the TLS unit of the controller. The initial corrections from TLS are editable. The “MOVE” button moves the table by the amount of corrections indicated in the window. The option to disable rotation corrections is available. The “AUTO ALIGN” button performs the first correction, and proceeds to complete the automatic alignment.
The next stage is the automatic table alignment stage. The “AUTO ALIGN” button in the Couch Adjustment interface screen performs the automatic alignment. Auto Align starts by making the initial correction in the Couch Adjustment interface, and proceeds to take additional images and perform the correction from each image, until at least one of the following conditions is met: the desired number of images in the Auto Alignment phase has been acquired, or the residual corrections fall below the limits specified in the Auto Alignment interface.
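The auto-alignment loop can be sketched as follows, assuming the initial correction from the Couch Adjustment interface has already been applied. The three callables are hypothetical stand-ins for the imaging/TLS measurement, the table motion command, and the limit comparison described above.

```python
from typing import Callable, Dict

def auto_align(
    acquire_and_measure: Callable[[], Dict[str, float]],
    move_table: Callable[[Dict[str, float]], None],
    within_limits: Callable[[Dict[str, float]], bool],
    max_images: int,
) -> bool:
    """Repeat image acquisition and table correction until either the residual
    corrections fall below the pre-specified limits or the allowed number of
    images has been acquired."""
    for _ in range(max_images):
        correction = acquire_and_measure()   # acquire an image and compute residual correction
        if within_limits(correction):
            return True                      # aligned: treatment delivery may proceed
        move_table(correction)               # apply the correction, then image again
    return False                             # image budget exhausted without convergence
```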
The next stage is the patient re-alignment stage. Patient re-alignment is entered whenever the system encounters a recoverable error (including operator pause), and the system is resumed from this state. Patient re-alignment is handled the same way as patient alignment. In other words, after the initial acquisition, further adjustments can be done automatically using the “AUTO ALIGN” button in the Couch Adjustment interface.
The final stage is the treatment delivery stage. Treatment delivery is initiated when the corrective motions for the table fall below pre-specified limits for translations and rotations. The limit for patient yaw is larger than those for the other rotations, and residual yaw is corrected by the robot. The corrective motions downloaded to the robot include translations and the specified set of rotations. The robot moves to the nominal position for the node, corrects by the specified translation and rotation, and then enables the x-ray beam generator. At the end of dose delivery for the node, the robot returns to the nominal position (i.e., zero rotations and translations), and proceeds to the next node in this nominal position.
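The per-node delivery sequence described above can be sketched as follows; `move_robot`, `enable_beam`, and `disable_beam` are hypothetical stand-ins for the robot and x-ray beam generator interfaces, and the correction is assumed constant over the nodes for simplicity.

```python
from typing import Iterable, Tuple

Correction = Tuple[float, float, float, float, float, float]  # x, y, z, roll, pitch, yaw

def deliver(nodes: Iterable[int], correction: Correction,
            move_robot, enable_beam, disable_beam) -> None:
    """Per-node delivery: go to the nominal node position, apply the downloaded
    correction, turn the beam on for that node's dose, then return to nominal
    before proceeding to the next node."""
    zero: Correction = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
    for node in nodes:
        move_robot(node, zero)          # nominal position for this node
        move_robot(node, correction)    # correct by the specified translation and rotation
        enable_beam(node)               # deliver the dose planned for this node
        disable_beam(node)
        move_robot(node, zero)          # return to nominal (zero rotations and translations)
```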
The controller includes software for error detection, reporting, and correction. In one embodiment, the error handling software includes “operator pause” functionality. This functionality allows the user to stop image acquisition, if one is in progress, and return to a target alignment or realignment mode. The user can also stop the table motion, if one is in progress, and return to the target alignment/realignment mode. The user can also stop subsequent image acquisitions and table motions, if the “auto alignment” mode is in progress.
In one embodiment, the error handling software also includes a functionality for handling TLS (target locating system) errors. Appropriate TLS errors, such as soft algorithm errors, and/or E-stop for hardware errors, are reported. Upon acknowledgement of the error, the controller can return to the alignment or re-alignment state. The user can stop subsequent image acquisitions and table motions, if “auto alignment” is in progress. During the initial alignment, the “patient out of bounds” error is disabled, but the “TREAT” button is disabled until the patient is within bounds.
In one embodiment, the error handling software includes a functionality for handling table interface errors. Table interface errors such as communication errors are handled as soft errors, which require user acknowledgment, but do not engage an E-stop. In one embodiment, the error handling software includes a functionality for handling E-stop. In this embodiment, an E-stop stops computer-controlled table motion, using a dual redundant mechanism: the controller software stops generating any further motion command signals, and the table controller hardware is independently disabled from moving the table when an E-stop is engaged. Even when the E-stop is engaged, the table is capable of moving using the handheld user interface unit. On resumption from pause or a recoverable E-stop, the E-stop is cleared by a system reset from the operator console, after which the system goes into a patient re-alignment state. At this stage, the user can use auto-align to refine the patient position. The RESUME button on the patient re-alignment screen enables resumption of treatment delivery.
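The software half of the dual redundant E-stop mechanism can be sketched as a simple gate on outgoing motion commands; the hardware disable in the table controller is independent of this and is not modeled here. The class and method names are illustrative only.

```python
class MotionCommandGate:
    """Software half of the dual redundant E-stop: once an E-stop is engaged, the
    controller stops emitting motion command signals. The table controller hardware
    is disabled independently and is not modeled here."""

    def __init__(self) -> None:
        self.estop_engaged = False

    def engage_estop(self) -> None:
        self.estop_engaged = True

    def clear_estop(self) -> None:
        # Cleared only by a system reset from the operator console, after which the
        # system enters the patient re-alignment state.
        self.estop_engaged = False

    def send_motion_command(self, command: dict, transmit) -> bool:
        """Transmit a motion command only while no E-stop is engaged."""
        if self.estop_engaged:
            return False
        transmit(command)
        return True
```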
Many other embodiments are possible. For example, the patient positioning assembly described above can be used with therapeutic radiation systems other than the CyberKnife system. The controller software may include functionalities other than those described above.
Other embodiments are within the following claims.
Claims
1. A method, comprising:
- providing a support device;
- providing a radiation source; and
- moving the support device with respect to the radiation source in at least three degrees of freedom to align a treatment target with respect to the radiation source.
2. The method of claim 1, wherein moving the support device comprises:
- determining a near real-time position of the treatment target with respect to a treatment coordinate system; and
- determining one or more corrective motions of the support device to move the support device from the near real-time position with respect to the radiation source to align the treatment target with respect to the radiation source.
3. The method of claim 2, wherein moving the support device comprises determining a position of the treatment target with respect to a pre-treatment coordinate system, the treatment coordinate system having a predetermined relationship to the pre-treatment coordinate system, wherein determining the one or more corrective motions comprises moving the support device with respect to the radiation source to substantially match the position of the treatment target with respect to the pre-treatment coordinate system with the determined near real-time position with respect to the treatment coordinate system.
4. The method of claim 3, wherein moving the support device comprises:
- determining an orientation of the treatment target with respect to the pre-treatment coordinate system; and
- determining an orientation of the treatment target with respect to the treatment coordinate system, and wherein determining the one or more corrective motions comprises moving the support device with respect to the radiation source to substantially match the position and orientation of the treatment target with respect to the pre-treatment coordinate system with the determined near real-time position and orientation of the treatment target with respect to the treatment coordinate system.
5. The method of claim 1, wherein moving the support device comprises:
- determining an orientation of the treatment target with respect to a treatment coordinate system; and
- determining one or more corrective motions of the support device to move the support device with respect to the radiation source using the orientation of the treatment target to align the treatment target with respect to the radiation source.
6. The method of claim 5, wherein moving the support device comprises determining an orientation of the treatment target with respect to a pre-treatment coordinate system, the treatment coordinate system having a predetermined relationship to the pre-treatment coordinate system, wherein determining the one or more corrective motions comprises moving the support device with respect to the radiation source to substantially match the orientation of the treatment target with respect to the pre-treatment coordinate system with the determined orientation with respect to the treatment coordinate system.
7. The method of claim 3, further comprising:
- receiving pre-treatment scan data representative of one or more pre-treatment scans of the treatment target within a patient on the support device, the one or more pre-treatment scans showing the position of the treatment target with respect to the pre-treatment coordinate system;
- receiving near real-time image data representative of one or more near real-time images including the position of the treatment target with respect to the treatment coordinate system; and
- generating at least one motion command signal for implementing the one or more corrective motions of the support device to move the support device to substantially match the position of the treatment target as shown in the pre-treatment scan data of the treatment target with the position of the treatment target of the near real-time image data.
8. The method of claim 7, wherein generating the at least one motion command signal comprises comparing the position of the treatment target, as shown in the near real-time image data, with the position of the treatment target as shown in the pre-treatment scan data.
9. The method of claim 7, wherein the one or more pre-treatment scans shows the position and an orientation of the treatment target with respect to the pre-treatment coordinate system, wherein the one or more near real-time images show the position and an orientation of the treatment target with respect to the treatment coordinate system, and wherein generating the at least one motion command signal comprises moving the support device to substantially match the position and orientation of the treatment target as shown in the pre-treatment scan data with the position and orientation of the treatment target of the near real-time image data.
10. The method of claim 9, wherein generating the at least one motion command signal comprises comparing the position and orientation of the treatment target, as shown in the near real-time image data, with the position and orientation of the treatment target as shown in the pre-treatment scan data.
11. The method of claim 1, wherein moving the support device comprises positioning the support device to a treatment position.
12. The method of claim 11, wherein the treatment position is where the treatment target is within an imaging field of view of an imaging system.
13. The method of claim 11, further comprising manually adjusting the support device so that the treatment target is within an imaging field of view of an imaging system.
14. The method of claim 13, wherein manually adjusting the support device comprises using a handheld remote control unit.
15. The method of claim 1, wherein moving the support device comprises:
- making an initial correction of the support device to align the treatment target with respect to the radiation source;
- taking one or more additional images of the treatment target; and
- performing one or more additional corrections from the one or more additional images to align the treatment target with respect to the radiation source.
16. The method of claim 15, wherein taking one or more additional images and performing one or more additional corrections are repeated until at least one of the following conditions is met:
- a specific amount of images have been taken; or
- residual corrections to align the treatment target with respect to the radiation source fall below a specified limit.
17. The method of claim 15, further comprising positioning the support device so that the treatment target is within an imaging field of view of an imaging system before imaging the treatment target at the initial treatment position.
18. The method of claim 7, wherein generating the at least one motion command signal further comprises:
- calculating an amount of translations and rotations required in order for the position of the treatment target, as shown in the near real-time images, to substantially match the position of the treatment target, as shown in the pre-treatment scan data; and
- converting the amount of translations and rotations into one or more units of motion of the support device.
19. The method of claim 7, wherein moving the support device comprises moving the support device with respect to the radiation source using at least one actuator.
20. The method of claim 7, wherein moving the support device comprises moving the support device with respect to the radiation source using an external device, wherein the external device is a robot, and wherein the robot includes an articulated arm assembly.
21. The method of claim 7, further comprising providing a user interface coupled to the support device, the user interface including one or more user-selectable functions.
22. The method of claim 21, wherein the one or more user-selectable functions comprise at least one of:
- activating an imaging system so as to initiate the acquisition of the one or more near real-time images of the treatment target;
- adjusting one or more imaging parameters of the imaging system;
- moving the support device to at least one of: a first pre-programmed position corresponding to a loading position for mounting the patient onto the support device; and a second pre-programmed position corresponding to a treatment position in which the patient was treated at a time period prior to a current treatment;
- displaying to the user a sequence of translations and rotations corresponding to the one or more corrective motions of the support device for moving the support device with respect to the radiation source;
- modifying the sequence of translations and rotations;
- comparing the translations and rotations with respective pre-specified limits for each translation and rotation;
- modifying one or more of the pre-specified limits; or
- activating a treatment beam generator of a treatment apparatus, having the radiation source, to initiate treatment delivery, upon verification that the translations and rotations identified by the motion command signal fall below the pre-specified limits.
23. The method of claim 22, wherein the imaging parameters comprise at least one of:
- an intensity of x-rays in one or more imaging beams generated by the imaging system;
- a spectral distribution of the x-rays in the imaging beams;
- energy of x-rays in imaging beam;
- selection and de-selection of fiducials;
- one or more rigid body parameters; or
- a number of near real-time images to be acquired.
24. The method of claim 21, wherein the user interface interactively controls the position of the support device for substantially aligning the position and orientation of the treatment target as shown in the pre-treatment scan data of the treatment target.
25. An apparatus, comprising:
- a support device; and
- means for moving the support device with respect to a radiation source in at least three degrees of freedom to align a treatment target with respect to the radiation source.
26. The apparatus of claim 25, further comprising means for generating one or more corrective motions of the support device to move the support device to substantially match a position and orientation of the treatment target as shown in a pre-treatment scan of the treatment target with a position and orientation of the treatment target in a near real-time image data.
Type: Application
Filed: Jun 29, 2006
Publication Date: Nov 2, 2006
Inventors: Eric Earnst (Saratoga, CA), Gopinath Kuduvalli (San Jose, CA), Vladimir Mitrovic (Foster City, CA), Matthew Core (Mountain View, CA)
Application Number: 11/478,753
International Classification: A61N 5/10 (20060101);