SYSTEM AND METHOD FOR DISPLAYING ANATOMY AND DEVICES ON A MOVABLE DISPLAY

An image display system is provided comprising a virtual window system that creates visual coherency between the patient's anatomical images and the actual patient by aligning the image on the display to the patient and then presenting the image to the user in a way that feels as if the user is looking directly into the patient through the display. The image shown within the image display system depends upon the position of the image display apparatus and the position of the user, so that the display orientation of the image may be biased slightly toward the user to improve ergonomics and usability.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/286,793, filed May 23, 2014, entitled “System and Method for Displaying Anatomy and Devices on a Movable Display,” which claims the benefit of U.S. Provisional Application No. 61/829,078, filed May 30, 2013, entitled “System and Method for Displaying Anatomy and Devices on a Movable Display,” which is hereby incorporated by reference in its entirety.

BACKGROUND

1. Field of the Invention

The invention relates generally to the diagnosis and treatment of disorders using minimally invasive techniques. In many minimally invasive procedures very small devices are manipulated within the patient's body under visualization from a live imaging source like ultrasound, fluoroscopy, or endoscopy. Live imaging in a minimally invasive procedure may be supplemented or replaced by displaying the position of a sensored medical device within a stored image of the patient anatomy.

Many minimally invasive procedures are conducted in expensive settings by specialized physicians. Often small, percutaneous medical devices are visualized during the procedure by using live fluoroscopic or ultrasonic imaging. While the live imaging provides a real-time image of anatomy, it has many drawbacks:

Time spent in an imaging suite is expensive and raises the cost of many minimally invasive medical procedures.

Ionizing radiation used to create the fluoroscopic image is dangerous to the patient, physician, and assistants.

Needles, guidewires, and other small devices may be difficult to locate within the live two-dimensional image. These devices may be too small to see clearly in fluoroscopic images. In ultrasound images, these devices may be difficult to locate when they are outside of the ultrasonic imaging plane, or they may reflect a diffuse, ambiguous image when they are within the ultrasonic imaging plane.

The fluoroscopic and ultrasonic images are two-dimensional and do not provide determinate information about the motion of the medical device and of three-dimensional anatomical structures.

During a typical minimally invasive procedure the physician must look away from the patient and his or her hands to see the display showing the live image. Additionally, the frame of reference for the live image is typically misaligned from the frames of reference of the physician, the tool, and the patient. This presents a challenging situation for the physician, who must compensate for the differences in these frames of reference. For instance, when the physician inserts a device into the patient by moving his hands from left to right, the fluoroscopic image of the device moves toward the top of the display. Ultrasonic images can be even more confounding in that the frame of reference for the ultrasound image is based on the position and orientation of the ultrasound probe, which is frequently moving during imaging. The physician must compensate for the misalignment of the coordinate systems for the respective frames of reference while also concentrating on achieving the goals of the minimally invasive procedure. The physician's need to look away from the patient and his or her instrument creates an ergonomic challenge in addition to this mental challenge. As a result, completion of minimally invasive procedures is delayed, increasing the procedure cost.

Prior to a minimally invasive catheter procedure, patients often have an anatomical image created using CT or MR imaging systems commercially provided by companies like Philips, Siemens, General Electric, and Toshiba. The anatomical images can be processed, or “segmented,” into three-dimensional representations of the anatomy of interest. Individual organs, muscles, and vasculature can be visually separated from other anatomy for even clearer viewing of regions of interest. In this invention the three-dimensional pre-procedure images may be used instead of or in addition to live imaging for navigation during the procedure because the position and orientation of the medical device can be sensed in real-time. For example, navigation systems provided by Medtronic, GE, and Stryker sense the positions of medical devices within the patient's body and present the sensed position data in a pre-procedural image of the patient's anatomy. These navigation systems provide a supplement or replacement to fluoroscopic imaging so that the physician may conduct a minimally invasive procedure within the patient's body using little or no X-ray. However, the navigation systems do not provide a means for making the physician's hand motions on the medical device match the motions of the device displayed in the image of the anatomy on the display. In order to make minimally invasive procedures easy and intuitive, the coordinate systems of the patient, the device, the display, and the physician's hands must be unified.

Minimally invasive procedures where a medical device is inserted into the body are especially well suited for a system that provides navigation assistance by unifying the physician, patient, display, and device coordinate systems. These procedures usually employ devices that are navigated through the body to small anatomical targets. For example, to obtain a tissue biopsy of a prostate, a physician may insert a small catheter through the urethra into the bladder. The urethral catheter provides an ideal location for the placement of sensors that can be used by software to match the live three-dimensional shape of the urethra to the stored three-dimensional shape of the urethra in the pre-operative image set. This “registration” of the real-time position of the patient's soft tissue to the pre-operative image of the same tissue allows the tissue and adjacent tissue structures to be accessed using the pre-operative images. Then a biopsy needle may be inserted into biopsy targets within the prostate by a physician who is navigating the needle using a three-dimensional image of the prostate. Once target tissue is reached with a needle, it may be treated directly with therapies like RF ablation, cryo-therapy, brachy-therapy, or chemo-embolization. Similar use of the invention may be made for other tissues like the breast, liver, and lung.

Endoscopic device use may similarly be improved by displaying an anatomical image that is aligned to the patient. Prior to inserting the endoscope, it is difficult to know the exact locations of anatomical structures within the body. After the endoscope is inserted, the external references of the patient's body are lost. Displaying an anatomical image that is aligned to the patient's body provides context by unifying the external view of the patient with the internal view of the anatomy, allowing the physician to choose optimal placement of access ports and improving the ability to access desired anatomy quickly and directly.

Robotic surgical procedures may be improved by displaying the projected workspaces of robotic devices on an anatomical image that is aligned to the patient. The projected path, workspace, and collision space of robotic devices may be overlaid on the anatomical image and viewed from different perspectives by moving the display, allowing the user to optimize the placement of the devices in the patient's body for reaching specific target anatomies.

The present invention improves the ease and reliability of visualizing anatomy within a patient by providing a system for displaying the device and patient anatomy in a substantially aligned manner.

2. Description of Background Art

Relevant references include US2010/295931; US2010/05315; US2010/039506; US2009/322671; U.S. Pat. Nos. 7,880,739; 7,203,277; 5,808,665; 7,774,044; 5,134,390; 6,038,467; and Nikou C, DiGioia AM, Blackwell M, et al. Augmented reality imaging technology for orthopaedic surgery. Operative Techniques in Orthopaedics. 2000;10:82-86.

SUMMARY

The invention comprises a virtual window system that creates a visual coherency between the patient's anatomical images and the actual patient by aligning the image on the display to the patient and then presenting the image to the user in a way that feels as if the user is looking directly into the patient through the display. The invention is designed to also display medical devices, such as a biopsy needle. The invention makes the anatomy and the motion of the minimally invasive medical device in the display match the motion of the physician's hands by substantially unifying the coordinate systems of the patient, the medical device, the display, and the physician's hands. The invention creates a visual coherency between the motion of the medical device in the image and the motion of the physician's hands manipulating the device. This invention also creates a visual coherency between the motion of the image in the display and the motion of the display. For example, the invention shows the image of the anatomy, the projected path of the biopsy needle, and the actual location of the tip of the biopsy needle in a single image that is shown on a display over the patient in substantial alignment to the patient's actual anatomy.

Embodiments of the invention possess inventive design elements that provide excellent user ergonomics and increase the functional anatomical workspace of the virtual window surgical system. Coupling the position and orientation of the display to the image allows the image to remain aligned to the patient for various positions and orientations of the display. To improve the ergonomics and workspace of the system, knowledge of the general position of the user relative to the patient is leveraged to slightly bias the image position to an optimized position. For example, if the user is on the left side of the patient, the image may be angled fifteen degrees away from the user so that when the display is angled fifteen degrees toward the user, the image will appear flat relative to the patient. Practice has shown that the intuitive benefits to the user of an aligned image may still be captured when small angular offsets are in place, with offsets of 30 degrees being the well-tolerated limit in many procedures. The system uses the knowledge of the user's position to bias the display toward more comfortable positions. The knowledge of the user's position may be input to the system by the user, inferred by the system using the position of the display, or sensed by the system using position or contact devices on the system. To further increase the workspace of the system, this invention allows for decoupling the relationship to reposition the display independently of the image. For instance, an aligned display may interfere with other equipment during some portion of the procedure and it may be desirable to reposition the display slightly to relieve the interference. Additionally this invention allows for a scaled coupling for improved ergonomics. For instance, moving the display with a unity ratio may cause the display to interfere with other equipment during some portion of the procedure or may make the screen difficult to view. Up to a 1.5:1 scale may be used to optimize the ergonomics of the system while maintaining the visual coherency between the patient and the image. It should be noted that the display may be repositioned along multiple axes and in multiple directions and that the scaling may be different for different axes and directions. For example, the scaling may be unity in the translational axes and 1.3:1 in the rotational axes.
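As an illustration of the biased, scaled coupling described above, the following minimal sketch maps a display pose change to an image pose change with a user-side rotational bias and per-axis scale factors. The function names and numeric choices are assumptions for illustration only, not the disclosed system's actual implementation:

```python
import numpy as np

USER_BIAS_DEG = 15.0   # rotational bias toward the user (15-30 degrees typical)
ROT_SCALE = 1.3        # 1.3:1 coupling on the rotational axes
TRANS_SCALE = 1.0      # unity coupling on the translational axes

def image_pose_from_display(display_rot_deg, display_translation_mm):
    """Map a display pose change to an image pose change.

    Rotations are scaled and offset away from the user so the image
    appears nearly flat over the patient when the display is tilted
    toward the user; translations pass through at unity.
    """
    image_rot_deg = ROT_SCALE * display_rot_deg - USER_BIAS_DEG
    image_translation_mm = TRANS_SCALE * np.asarray(display_translation_mm, float)
    return image_rot_deg, image_translation_mm

# Display tilted 15 degrees toward the user: the bias cancels most of the
# tilt, keeping the rendered anatomy visually coherent with the patient.
rot, trans = image_pose_from_display(15.0, [0.0, 0.0, 0.0])
```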

Additionally this invention provides a movable support structure to hold a display directly in front of the physician, between the physician and the patient. Ideally the images are presented in a fashion such that the images are substantially aligned with the patient. This invention details the methods and techniques needed to align the images to the patient. Many embodiments utilize a display that is mounted on a movable support structure that allows for the display to be positioned between the patient and the physician. The range of motion of the support structure and the degrees of freedom enable a wide range of display positions and orientations. In one embodiment, the patient is lying on an exam table with the physician standing by the patient's side. The support structure allows the display to be brought over the patient. The physician can move and orient the display so the display is located roughly between the physician and the patient. Providing a display over the operative area of the patient allows the physician to perform minimally invasive procedures with needles, guidewires, and catheters as if the physician were performing open surgery by looking directly into the patient.

Techniques are also disclosed to track the position of the display, the imaging source, the patient, and the medical device. Tracking individual elements of the system allows the image to be aligned with the patient and constantly updated to accommodate a moving patient, moving medical device, or moving display.

A live image of the patient anatomy may also be shown on a display located over the patient. Sensors track the position and orientation of the display screen and the imaging source so that the position and orientation of the display screen may control the position and orientation of the imaging source, keeping the anatomical image, the medical device image, and the patient substantially co-aligned. Alternatively, sensors track the position and orientation of the display screen and the imaging source so that the position and orientation of the imaging source may control the position and orientation of the display screen, to keep the anatomical image, the display screen, the medical device image, and the patient substantially co-aligned. The live image may be supplemented with other anatomical images from live or static sources that are sensored, registered, and displayed in the same substantially co-aligned manner on the display screen. For example, a live endoscopic image may be superimposed over a three-dimensional image of the prostate derived from a pre-operative MR scan. As the physician moves the display to view the three-dimensional image from different angles, the endoscope may be automatically re-positioned remotely so that the live image viewing position matches the viewing position of the three-dimensional image.
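A minimal sketch of this display-to-imaging-source coupling follows. It assumes a hypothetical imaging-source controller interface (`set_view_orientation`) and rotation matrices expressed in a shared world frame, purely for illustration:

```python
import numpy as np

class ImagingSourceController:
    """Hypothetical stand-in for a remotely positionable imaging source."""
    def set_view_orientation(self, rotation_world):
        # A real controller would servo the endoscope or probe here.
        print("commanding imaging source orientation:\n", rotation_world)

def couple_display_to_source(display_rotation_world, source):
    # Re-aim the live imaging source so its viewpoint matches the
    # viewing orientation of the tracked display screen.
    source.set_view_orientation(np.asarray(display_rotation_world, float))

# The symmetric mode drives the rendered view from the tracked imaging
# source instead, keeping image, device, display, and patient co-aligned.
couple_display_to_source(np.eye(3), ImagingSourceController())
```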

All embodiments create a coupling between the image position and orientation and the position and orientation of a secondary system component. This invention improves the workspace of the system by providing an input device to temporarily decouple the relationship to reposition the display or secondary system component for improved workspace. Additionally, this invention improves the ergonomics by allowing for a scaling factor between the coupled display and secondary system component.

In another embodiment the system comprises a processor further adapted to receive image data for the patient's anatomy. Such image data may be a static image obtained by MRI, ultrasound, X-ray, computed tomography, or fluoroscopic imaging modalities. The image data can also be a live fluoroscopic image collected in real-time. The system can further track patient position by one or more of the following: fiducial markers, live imaging data, optical sensors, or electromagnetic sensors. The processor is also further adapted to receive position data from a tool, which is tracked by electromagnetic sensors. The display is held by a support arm having at least one degree of freedom, wherein the members and joints of the support arm may be operatively coupled to counterbalance springs or weights. The processor is further adapted to receive position data of the display, which is tracked by one or more of the following: optical tracking, electromagnetic sensors, or encoded joints of the support arm. The processor processes the various position data and image data to display an image of the patient's anatomy substantially aligned with the patient's actual anatomy, superimposed with the position of any device being tracked. The processor is also adapted to direct any live imaging equipment to ensure proper functioning of the system. When used in a surgical setting the invention may be located in the surgical field and may also comprise a sterile drape for the display to protect the integrity of the surgical field.

In one embodiment, a live image of the patient anatomy is shown on a repositionable display screen located over the patient. The physician can move the display over the patient while sensors track the motion of the display so that the image shown on the display screen may be periodically or constantly updated to show the medical device and the patient anatomy substantially aligned with the patient from the perspective of the user, with a slight angular bias toward the user. The position of the user relative to the patient may be entered by the user at the start of the procedure by touching a button on the display labeled “patient left,” “patient right,” “patient head,” or “patient feet.” In this manner, the image shown on the display provides a view of the medical device and patient anatomy that is intuitive, ergonomic, and allows for easy navigation of the medical device within the patient anatomy shown on the display screen. While the image of the anatomy is frequently based on a pre-operative image, a live image may be supplemented with other anatomical images from live or static sources that are sensored, registered, and displayed in the same substantially co-aligned manner on the display screen.

In additional embodiments, a sensor on the medical device provides position and orientation data of the device to a data processor. A sensor on the patient provides position and orientation data of the patient to the processor, and sensors on the display screen provide the viewing position and orientation of the display screen to the processor. With data from the medical device, the patient, and the display, the processor unifies the three coordinate systems so that the image shown on the display screen substantially matches the position of the patient anatomy. Adjustments to the display position over the patient result in similar changes to the position of the image in the display: changing the position of the display changes the view of the image on the display screen. For example, the user may change the angle of the display to change the angle of the apparent image on the display screen or may translate the display to pan the image in the display along the patient to show different anatomy. Aligning the positions of the shown image and the patient anatomy helps coordinate the physician's control of the medical device.
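For concreteness, a minimal sketch of how the processor can unify the device, patient, and display coordinate systems using 4x4 homogeneous transforms is shown below; the variable names and the rendering step are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np

def to_display_frame(T_world_device, T_world_display):
    """Express the tracked device pose in the display's coordinate frame.

    T_world_device:  device sensor pose from the electromagnetic tracker
    T_world_display: display pose from encoded support-arm joints
    Both are 4x4 homogeneous transforms in a single world frame.
    """
    return np.linalg.inv(T_world_display) @ T_world_device

# The device icon is drawn at the translation component of the result,
# so moving or tilting the display re-poses the rendered anatomy and
# device together, preserving alignment with the patient.
T_device_in_display = to_display_frame(np.eye(4), np.eye(4))
icon_position = T_device_in_display[:3, 3]
```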

Elements of both embodiments may be combined to display pre-operative and intra-operative anatomical images within the same procedure. In both embodiments, the invention provides a virtual window into the patient where the physician may view the anatomy and navigate the surgical device in substantial alignment with the patient. For example, a sensored endoscope may be shown relative to the aligned anatomical image. An anatomical target may be chosen and marked on the image. As sensored medical devices are moved to different potential access points on the body, the ability to reach the anatomical target may be shown by projecting the path of the device to the target and presenting a positive indication when the path to the anatomical target is uninterrupted. Similar real-time updates may be used to assist in quickly choosing access points for minimally invasive devices by showing whether adjacent medical devices will collide with each other, external anatomy, or internal anatomy as different potential access points on the body are selected by moving the medical device to those access points.

BRIEF DESCRIPTION OF THE DRAWINGS

The novel features of the invention are set forth with particularity in the claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:

FIG. 1 is a side diagrammatic view of a system for displaying a substantially co-aligned anatomical image with a sensored medical device over a patient's anatomy.

FIG. 2 is a block diagram showing data flow for the system in FIG. 1.

FIG. 3 is an isometric view of an embodiment of the display and support arm positioned next to the patient table with the projected workspace of a robotic surgical device overlaid on the anatomy in the display.

FIG. 4 is an isometric view of an embodiment of the display and support arm positioned next to the patient table.

FIG. 5 is a flow chart describing the basic steps for a minimally invasive procedure using a sensored medical device and the system for displaying a co-aligned image.

DETAILED DESCRIPTION

While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

FIGS. 1-2 describe an embodiment for navigating a minimally invasive medical device within the patient using an acquired three-dimensional anatomical image shown in a display 7 that is substantially aligned to the patient anatomy. A sterile cover may be used to separate the display from the sterile operating field, and the sterile cover may incorporate a conductive film to provide a sterile touch interface for a capacitive touch screen display. The sterile display cover may be a flexible, clear drape made of plastic like polyethylene or polyurethane film, a rigid plate made of clear plastic like polycarbonate or acrylic, or a combination of both flexible and rigid plastics. The display is preferably a light-weight, flat LCD display provided by manufacturers like LG Display, Philips, and Innolux or a light-weight, flat OLED display provided by manufacturers like Samsung and Sony. A prime example of such a display would be the NEC TFT color LCD module, which provides a usable viewing angle of 85° in all directions.

In FIG. 1, the position of the medical device within the patient 5 is provided by an electromagnetic coil sensor located on the distal elongated section of the medical device 1. The position of the sensor is derived through an electromagnetic transmitter 2 similar to those transmitters supplied commercially by NDI and Ascension Technology Corporation. Alternatively, the position of the medical device may be derived from an optical fiber position sensor like that supplied by Luna Innovations. A similar patient reference sensor 3 is placed on the patient in a reliably stable position like the outcropping of the pelvic bone, sternum, or clavicle. The reference sensor or sensors provide frequently updated data describing the position of the patient anatomy in the same coordinate system as the medical device sensor. The patch holding the patient sensor may be placed on the patient before the patient's anatomy of interest is imaged, and the patch may contain known X-ray visible materials such as tungsten, platinum-iridium, platinum, barium sulfide or iodine and MR visible materials such as gadolinium or vitamin E. The patch is visible within the image of the anatomy and therefore the patient reference sensor 3 can be registered to the three-dimensional anatomical image.

Position data from the sensor in the medical device 1, the patient reference sensor 3, and the display support arm 4 are sent to the system processor 6. The local coordinate systems of the medical device sensor 1 and display 7 may undergo a coordinate system transformation in the system processor so that the positions of the device sensor, patient sensor, and display may be evaluated in a single world coordinate system. Display 7 has a user input button 8. FIG. 2 shows the flow of sensor position data from the sensor buffer 9 to the system processor 10, where the position sensor data is used by the processor to place an icon of the medical device into the three-dimensional patient anatomy image for display through the system display 11. The system processor is a standard computing system like those supplied by Dell or Hewlett Packard running an operating system like Windows or Linux. Position data from the system display and support arm is likewise used by the system processor to orient the image on the screen so that the image, based on display position data from the display 7 and support arm 4 and patient position data from the patient reference sensor 3, is substantially aligned with the patient anatomy.

Display position data may also be used to modify the image in the display, for example zooming or clipping the image as the display moves closer to the patient. Other image modifications may include changing transparency, removing layers, removing anatomical structures, or changing colors. Additionally, scaling of the image in discrete steps or other image modifications may be done via a touch-sensitive surface on the display.
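A minimal sketch of such display-driven image modification follows; the thresholds and scaling law are hypothetical values chosen only to illustrate zooming and clipping as the display approaches the patient:

```python
def zoom_and_clip(display_to_patient_mm):
    """Derive zoom and clipping depth from display-to-patient distance.

    Moving the display closer increases magnification and shallows the
    clipping depth so nearer structures dominate the rendered image.
    """
    zoom = max(1.0, 600.0 / max(display_to_patient_mm, 1.0))  # unity beyond 600 mm
    clip_depth_mm = min(display_to_patient_mm, 200.0)         # never deeper than 200 mm
    return zoom, clip_depth_mm

# Example: at 300 mm the image is rendered at 2x with a 200 mm clip depth.
print(zoom_and_clip(300.0))
```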

FIG. 3 shows a movable display 12 positioned over a surgical table showing an image of the patient anatomy. A target 13 may be chosen on the image of the anatomy. A remote electromagnetic transmitter, such as those commercially available from Northern Digital Incorporated (NDI) and Ascension Technology Corporation, is positioned near or under the table to localize sensors 15 on at least one medical device 16. As the display is moved, the image of the anatomy, the medical devices, the projected path 14 of the medical devices, and the collision boundaries of the medical devices are repositioned to provide the optimum view for navigation of the medical device within the anatomical image. The access points may be chosen to optimize the ability of the medical devices to reach the anatomical target without creating collisions between the medical devices internal and external to the patient, and to optimize the ability of the medical devices to reach the target anatomy without intersecting other anatomical structures. Software may be employed to present the collision-free projected path to the anatomical target in an intuitively obvious manner by, for example, showing a free path as a green line and a path with collisions as a red line.
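A minimal sketch of such a projected-path check follows, with the straight access-to-target segment tested against obstacle spheres standing in for other devices and protected anatomy. The geometry, names, and green/red mapping are illustrative assumptions:

```python
import numpy as np

def path_is_free(access, target, obstacles):
    """Return True if the segment access->target clears every obstacle.

    obstacles: iterable of (center, radius) spheres approximating other
    devices or anatomical structures to avoid.
    """
    a = np.asarray(access, float)
    b = np.asarray(target, float)
    ab = b - a
    for center, radius in obstacles:
        c = np.asarray(center, float)
        # Closest point on the segment to the sphere center.
        t = np.clip(np.dot(c - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        if np.linalg.norm(c - (a + t * ab)) < radius:
            return False
    return True

# Color the projected path for display: green when free, red on collision.
obstacles = [((10.0, 10.0, 50.0), 5.0)]
color = "green" if path_is_free((0, 0, 0), (0, 0, 100), obstacles) else "red"
```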

FIG. 4 presents an embodiment of the display and support arm with counterbalanced joints at the support arm elbow 18 and shoulder 19. An additional rotational or linear joint is provided at the base of the shoulder 20 to allow the display to move along the inferior-to-superior axis of the patient. All support arm joints may be encoded to provide data describing the position of the display. The display support is shown in an embodiment where the arm is mounted to a portable cart that is positioned next to the patient table. Axis 17 allows the display to rotate. An alternate embodiment may attach to the table or imaging system.

FIG. 5 provides an overview of the procedure flow for a minimally invasive procedure using a stored image for navigation. The patient anatomy is scanned 21 with a non-invasive imaging modality like CT, MR, or rotational angiography. The imaged anatomy is stored and segmented into a three-dimensional image, and borders and centerlines of anatomical structures are calculated using commercially available software from vendors like Philips, Siemens, GE, Toshiba, TerraRecon, Calgary Scientific, Materialise, or Osirix. The image is transferred to the memory of the system processor and the image is registered 22 to the system coordinate system along with the patient and the medical device sensors. Registration of the image may be done by imaging the patient with an image-visible skin patch, by touching a sensored probe to prominent bony anatomical points, or with an external anatomical marker placed on the patient. At least three separate points of the patch are visible in the image and then a position sensor is placed into the patch. The visible points on the patch or bones may be selected on the displayed image and then the known distance from the marker is used to register the image to the patient position sensor. The patient position sensor and medical device position sensor are inherently registered because their positions are determined by the same sensing system. Next, the registered image is shown 23 above the patient in a manner substantially aligned to the patient anatomy. The image position may be biased slightly toward the user to provide improved ergonomics. For example, if the user is on the right side of the patient, the user may press a button on the display touch screen to inform the system of the user's operating position. The system processor will then bias the image rotationally by a small amount, usually 15-30 degrees, toward the user. The system may also bias rotational scaling in the user's direction, creating a rotation scale factor that increases slightly as the display is moved rotationally away from the user. In this way, the image is biased toward ergonomically comfortable viewing positions for the user without losing the substantial alignment of the image to the patient that provides for improved perception and usability. The medical device may be navigated 24 within or near the patient as the position sensor in the medical device is tracked and presented as an image icon within the image of the patient anatomy. The image of the anatomy and the image of the medical device may be shown with varying degrees of transparency to maximize the visibility of the device and anatomical images. The display, showing the image of the medical device within the image of the anatomy, may be repositioned 25 to enhance the viewing angle of the anatomy. As the display is moved, the image on the screen is updated to maintain substantial alignment between the displayed anatomical image and the patient anatomy.
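The point-based registration step above, in which at least three patch or bony landmarks are identified both in the image and by the position sensor, can be illustrated with a standard least-squares rigid fit. This sketch uses the well-known Kabsch/SVD method as a stand-in and is not asserted to be the disclosed system's exact algorithm:

```python
import numpy as np

def rigid_register(image_pts, sensor_pts):
    """Fit rotation R and translation t mapping image points to sensor space.

    image_pts, sensor_pts: corresponding Nx3 landmark arrays, N >= 3.
    """
    P = np.asarray(image_pts, float)
    Q = np.asarray(sensor_pts, float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)        # SVD of the covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Example with three synthetic landmarks (a rotated, translated copy):
img = [[0, 0, 0], [100, 0, 0], [0, 100, 0]]
sns = [[10, 20, 30], [10, 20, 130], [10, 120, 30]]
R, t = rigid_register(img, sns)
# Any image-space point p then maps to sensor space as R @ p + t, tying
# the pre-operative image to the live patient position sensor.
```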

Claims

1. (canceled)

2. (canceled)

3. (canceled)

4. (canceled)

5. (canceled)

6. (canceled)

7. (canceled)

8. (canceled)

9. (canceled)

10. (canceled)

11. (canceled)

12. (canceled)

13. (canceled)

14. (canceled)

15. (canceled)

16. (canceled)

17. (canceled)

18. (canceled)

19. (canceled)

20. (canceled)

21. (canceled)

22. (canceled)

23. (canceled)

24. (canceled)

25. (canceled)

26. (canceled)

27. (canceled)

28. (canceled)

29. A medical system for displaying a tool in patient anatomy, comprising:

a repositionable display coupled to a support arm including at least one rotational joint, wherein the repositionable display is mounted on a portable cart and configured to show a first image and a second image of a patient simultaneously, the first image comprising a live endoscopic image of the patient and the second image comprising an image derived from a pre-operative scan of the patient;
a tool insertable into the patient, the tool comprising a sensor for providing position sensor data of the tool to a data processor, wherein the position sensor data is used by the data processor to place an icon of the tool into the image derived from the pre-operative scan; and
a robotic device configured to control movement of the tool to an anatomical target within the patient,
wherein the data processor is configured to mark the anatomical target on the image derived from the pre-operative scan and present a collision-free projected path to the anatomical target that is overlaid on the image derived from the pre-operative scan on the repositionable display.

30. The medical system of claim 29, wherein the support arm comprises counterbalanced joints at a support arm elbow.

31. The medical system of claim 29, wherein the second image of the patient is derived from a computed tomography scan.

32. The medical system of claim 29, wherein the second image of the patient includes a live fluoroscopic image of the patient.

33. The medical system of claim 29, wherein the second image of the patient is a three-dimensional image segmented from a computed tomography scan.

34. The medical system of claim 29, wherein the tool comprises a catheter.

35. The medical system of claim 34, wherein the catheter comprises a urethral catheter.

36. The medical system of claim 29, wherein the tool comprises an endoscope.

37. The medical system of claim 29, wherein the tool comprises a biopsy needle.

38. The medical system of claim 29, wherein the sensor of the tool comprises an electromagnetic sensor.

39. The medical system of claim 29, wherein the sensor comprises a fiber optic sensor.

40. The medical system of claim 29, wherein the data processor provides a positive indication when the collision-free projected path is uninterrupted as the tool moves through patient anatomy.

41. The medical system of claim 29, wherein the data processor further presents collision boundaries of the tool within the patient anatomy.

42. A medical system for displaying a tool in patient anatomy comprising:

a repositionable display configured to show a first image and a second image of a patient simultaneously, the first image comprising an intraoperative image of the patient and the second image comprising an image derived from a pre-operative scan of the patient;
a tool insertable into the patient, the tool comprising a sensor for providing position sensor data of the tool to a data processor, wherein the position sensor data is used by the data processor to place an icon of the tool into the image derived from the pre-operative scan; and
a robotic device configured to control movement of the tool to an anatomical target within the patient,
wherein the data processor is configured to mark the anatomical target on the image derived from the pre-operative scan and present a collision-free projected path to the anatomical target that is overlaid on the image derived from the pre-operative scan on the repositionable display.

43. The medical system of claim 42, wherein the repositionable display is coupled to a support arm.

44. The medical system of claim 43, wherein the support arm comprises an elbow with a rotational joint.

45. The medical system of claim 43, wherein the support arm is mounted on a portable cart.

46. The medical system of claim 42, wherein the second image of the patient is derived from a computed tomography scan.

47. The medical system of claim 42, wherein the sensor of the tool comprises an electromagnetic sensor.

48. The medical system of claim 42, wherein the sensor comprises a fiber optic sensor.

49. The medical system of claim 42, wherein the tool comprises a biopsy needle.

50. The medical system of claim 42, wherein the data processor provides a positive indication when the collision-free projected path is uninterrupted as the tool moves through patient anatomy.

51. The medical system of claim 42, wherein the data processor further presents collision boundaries of the tool within the patient anatomy.

Patent History
Publication number: 20210393160
Type: Application
Filed: May 28, 2021
Publication Date: Dec 23, 2021
Inventors: Daniel WALLACE (Santa Cruz, CA), Gregory STAHLER (San Jose, CA), Aaron GROGAN (Scotts Valley, CA)
Application Number: 17/334,628
Classifications
International Classification: A61B 5/06 (20060101); A61B 5/00 (20060101); A61B 90/00 (20060101);