Endoscope control system
An endoscope system comprises an endoscope and a display to display image content captured by the endoscope. The endoscope system also comprises one or more sensors located in a headrest and configured to detect an input at the headrest. The endoscope system also comprises a control module configured to receive one or more sensor signals from the one or more sensors. The one or more sensor signals indicate movement of the headrest with respect to a support on which the headrest is mounted or pressure applied to the headrest. The control module is also configured to adjust the image content displayed by the display in response to the one or more sensor signals.
The present application is a continuation of U.S. application Ser. No. 16/292,104, filed Mar. 4, 2019, which is a divisional of U.S. patent application Ser. No. 14/909,976, filed Feb. 3, 2016, which is a U.S. National Stage patent application of International Application No. PCT/US2014/050217, filed on Aug. 7, 2014, which claims the benefit of U.S. Provisional Patent Application No. 61/865,996, filed on Aug. 14, 2013, the disclosures of each of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD

Embodiments of the present invention are related to instrument control, and in particular to control of instruments used in minimally invasive robotic surgery.
DISCUSSION OF RELATED ART

Surgical procedures can be performed through a surgical robot in a minimally invasive manner. The benefits of a minimally invasive surgery are well known and include less patient trauma, less blood loss, and faster recovery times when compared to traditional, open incision surgery. In addition, the use of robotic surgical systems (e.g., teleoperated robotic systems that provide telepresence), such as the da Vinci™ Surgical System manufactured by Intuitive Surgical, Inc. of Sunnyvale, Calif., is known. Such robotic surgical systems may allow a surgeon to operate with intuitive control and increased precision when compared to manual minimally invasive surgeries.
In a minimally invasive surgical system, surgery is performed by a surgeon controlling the robot. The robot includes one or more instruments that are coupled to robot arms. The instruments access the surgical area through small incisions in the skin of the patient. A cannula is inserted into the incision and a shaft of the instrument can be inserted through the cannula to access the surgical area. An endoscope can be used to view the surgical area. In many cases, the surgeon can control one instrument at a time. If the surgeon wants to change the view of the endoscope, control is shifted from the current surgical instrument to the endoscope, the surgeon manipulates the endoscope, and control is shifted back to the surgical instrument.
Therefore, there is a need to develop better surgical systems for robotic minimally invasive surgeries.
SUMMARY

In accordance with aspects of the present invention, movement of an image of the surgery can be controlled by motion of the surgeon's head or face at the surgeon's console. In some embodiments, for example, a surgeon's console includes an image display system that displays an image of a surgical area; and at least one sensor mounted in the surgeon's console to provide a signal related to a movement of the surgeon's face, the image being moved according to the signal.
In some embodiments, a headrest for a surgical console includes a forehead rest surface; a headrest mount that can attach to the surgical console; and one or more sensors in the headrest that detect inputs from a surgeon's head and provide signals to an endoscope control.
In some embodiments, an endoscope control system includes endoscope controls that receive signals that indicate movement of a surgeon's head and provide an indication of movement of an image received by an endoscope; an endoscope manipulation calculation module configured to receive the indication of movement of an image and generate signals to effect movement of the endoscope to control the movement of the image; and actuators that can be coupled to the endoscope, the actuators receiving those signals and moving the endoscope to provide the movement.
These and other embodiments are further discussed below.
In the following description, specific details are set forth describing some embodiments of the present invention. It will be apparent, however, to one skilled in the art that some embodiments may be practiced without some or all of these specific details. The specific embodiments disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure.
This description and the accompanying drawings that illustrate inventive aspects and embodiments should not be taken as limiting—the claims define the protected invention. Various mechanical, compositional, structural, and operational changes may be made without departing from the spirit and scope of this description and the claims. In some instances, well-known structures and techniques have not been shown or described in detail in order not to obscure the invention.
Additionally, the drawings are not to scale. Relative sizes of components are for illustrative purposes only and do not reflect the actual sizes that may occur in any actual embodiment of the invention. Like numbers in two or more figures represent the same or similar elements.
Further, this description's terminology is not intended to limit the invention. For example, spatially relative terms—such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like—may be used to describe one element's or feature's relationship to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of a device in use or operation in addition to the position and orientation shown in the figures. For example, if a device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features. Thus, the exemplary term “below” can encompass both positions and orientations of above and below. A device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Likewise, descriptions of movement along and around various axes include various spatial device positions and orientations. In addition, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise. And, the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups. Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
Elements and their associated aspects that are described in detail with reference to one embodiment may, whenever practical, be included in other embodiments in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment.
Aspects of embodiments of the invention are described within the context of a particular implementation of a robotic surgical system. Knowledgeable persons will understand, however, that inventive aspects disclosed herein may be embodied and implemented in various ways, including robotic and non-robotic embodiments and implementations. The implementations disclosed here are merely exemplary and are not to be considered as limiting the scope of the inventive aspects disclosed herein.
Further, portions of each of the instrument arms 106a, 106b, 106c, and 108 are adjustable by personnel in the operating room in order to position instruments 110a, 110b, 110c, and endoscope 112 with respect to a patient. Other portions of arms 106a, 106b, 106c, and 108 are actuated and controlled by the surgeon at a surgeon's console 120. Surgical instruments 110a, 110b, 110c, and endoscope 112, can also be controlled by the surgeon at surgeon's console 120.
Additional controls are provided with foot pedals 128. Each of foot pedals 128 can activate certain functionality on the selected one of instruments 110. For example, foot pedals 128 can activate a drill or a cautery tool or may operate irrigation, suction, or other functions. Multiple instruments can be activated by depressing multiple ones of pedals 128. Certain functionality of instruments 110 may be activated by other controls.
Surgeon's console 120 also includes a stereoscopic image display 126. Left side and right side images captured by the stereoscopic endoscope 112 are output on corresponding left and right displays, which the surgeon perceives as a three-dimensional image on display system 126. In an advantageous configuration, the MTMs 122 are positioned below display system 126 so that the images of the surgical tools shown in the display appear to be co-located with the surgeon's hands below the display. This feature allows the surgeon to intuitively control the various surgical tools in the three-dimensional display as if watching the hands directly. Accordingly, the MTM servo control of the associated instrument arm and instrument is based on the endoscopic image reference frame.
The endoscopic image reference frame is also used if the MTMs 122 are switched to a camera control mode. In some cases, if the camera control mode is selected, the surgeon may move the distal end of the endoscope 112 by moving one or both of the MTMs 122 together (portions of the two MTMs 122 may be servomechanically coupled so that the two MTM portions appear to move together as a unit). The surgeon may then intuitively move (e.g., pan, tilt, zoom) the displayed stereoscopic image by moving the MTMs 122 as if holding the image in the hands.
The surgeon's console 120 is typically located in the same operating room as the patient side cart 100, although it is positioned so that the surgeon operating the console is outside the sterile field. One or more assistants typically assist the surgeon by working within the sterile surgical field (e.g., to change tools on patient side cart 100, to perform manual retraction, etc.). Accordingly, the surgeon operates remotely from the sterile field, and so the console may be located in a separate room or building from the operating room. In some implementations, two consoles 120 (either co-located or remote from one another) may be networked together so that two surgeons can simultaneously view and control tools at the surgical site.
During surgery, particularly if the surgery is abdominal surgery, pressurized CO2 can be utilized to expand the abdomen, allowing for better access to surgical area 210. Cannula seals attached to cannula seal mounts 212a, 212b, and 212d prevent leakage of fluids or other materials from the patient.
During the operation, the surgeon sitting at surgeon's console 120 can manipulate end effectors 206a, 206b, and 206d as well as move shafts 152a, 152b, and 152d along their lengths.
According to some embodiments of the invention, a sensing method allows for the surgeon to manipulate the headrest in order to control, for example, the endoscopic camera while separately using MTMs 122 to control the surgical instruments. Some embodiments of the present invention can eliminate the need to switch modes from instrument control to camera control, and then back again, when it is necessary to reposition the camera. In some embodiments, positioning the camera or control of the camera zoom level can be accomplished while the surgical instruments are actively being controlled by the surgeon.
Shaft 152d is connected to instrument interface 150d.
In practice, the optics in end effector 206d can include the ability to zoom the image into or out of surgical area 210. Further, instrument interface 150d or arm 108 has the ability to move endoscope 112 axially along the axis of shaft 152d, thereby providing a zoom function. Whether a zoom feature in end effector 206d or movement of shaft 152d is used to zoom on an image can be controlled by software operating in the surgical system. End effector 206d can also be moved within a spherical surface by manipulating wrist 312. Movement of end effector 206d with wrist 312 can be used to provide different images of surgical area 210.
Endoscope controls 402 may include processing capability to receive signals from one or more sensors and determine from those signals what the surgeon intends for the change in the image. For example, endoscope controls 402 can determine whether the surgeon requests a zoom function or whether the surgeon requests that the image be panned and in which direction the image should be panned. As such, endoscope controls 402 may include one or more processors coupled with memory (volatile, nonvolatile, or a combination) to hold data and programming instructions. The programming instructions may include instructions to translate signals received from the one or more sensors into signals that represent the requested action of the image produced by endoscope 112.
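For illustration only, the following Python sketch shows one way such a translation step might be structured. It is not the patented implementation: the action types, the normalized head-motion inputs, and the deadband threshold are all assumptions made for this example.

```python
from dataclasses import dataclass
from enum import Enum


class ImageAction(Enum):
    """Image adjustments the surgeon can request (hypothetical set)."""
    PAN = "pan"
    ZOOM = "zoom"
    NONE = "none"


@dataclass
class ActionRequest:
    action: ImageAction
    dx: float = 0.0  # signed pan component, image x
    dy: float = 0.0  # signed pan component, image y
    dz: float = 0.0  # signed zoom component, positive = zoom in


DEADBAND = 0.1  # hypothetical; a real system would calibrate per surgeon


def interpret_head_input(dx: float, dy: float, dz: float) -> ActionRequest:
    """Translate normalized head-motion components into a requested action.

    dx, dy: lateral head motion against the headrest.
    dz: motion perpendicular to the headrest (toward/away from the console).
    """
    if abs(dz) > max(abs(dx), abs(dy), DEADBAND):
        return ActionRequest(ImageAction.ZOOM, dz=dz)
    if max(abs(dx), abs(dy)) > DEADBAND:
        return ActionRequest(ImageAction.PAN, dx=dx, dy=dy)
    return ActionRequest(ImageAction.NONE)
```

Under these assumptions, a call such as interpret_head_input(0.0, 0.0, 0.4) would be read as a zoom request, while interpret_head_input(0.3, 0.0, 0.0) would be read as a pan to the right.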
Endoscope manipulation calculation 404 provides signals to actuators 406. Actuators 406 are mechanically coupled to instrument interface 150d on endoscope 112. Therefore, endoscope manipulation calculation 404 translates the signals received from endoscope controls 402 into actions performed by actuators 406 that result in the corresponding motion of end effector 206d of endoscope 112. As discussed above, the motion of end effector 206d can be axial in end effector 206d (zooming end effector 206d using internal optics or by movement of end effector 206d along its axis), can be lateral by movement of wrist 312 which results in movement of the tip of end effector 206d along a substantially spherical surface, or can result in axial motion of endoscope 112 along the axis of shaft 152d. Zoom and image adjustments can be performed by combinations of various motions that are communicated through instrument interface 150d.
Endoscope manipulation calculation 404 can include a processor executing instructions that calculate the motions that actuators 406 perform in order to result in the motion according to the surgeon input at endoscope controls 402. As discussed above with respect to endoscope controls 402, endoscope manipulation calculation 404 can include one or more processors coupled to memories (volatile, nonvolatile, or a combination) that hold data and programming. In some embodiments, endoscope controls 402 and endoscope manipulation calculation 404 can be performed by the same processors executing the appropriate program instructions.
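As a hedged sketch of what such a manipulation calculation might produce, the example below maps an interpreted pan or zoom request onto per-actuator increments. The actuator names, the gains, and the assumption that the two wrist axes align with the image axes are hypothetical; a real system would solve the endoscope kinematics rather than use fixed gains.

```python
from dataclasses import dataclass


@dataclass
class ActuatorCommand:
    """Incremental setpoint for one actuator (hypothetical interface)."""
    actuator_id: str  # e.g. "wrist_yaw"; names assumed for illustration
    delta: float      # commanded increment (rad for rotary, m for insertion)


PAN_GAIN = 0.05        # rad of wrist motion per unit of requested pan (assumed)
INSERTION_GAIN = 0.01  # m of travel along the shaft per unit of zoom (assumed)


def pan_commands(dx: float, dy: float) -> list[ActuatorCommand]:
    """Map a pan request onto the two wrist axes (sketch only)."""
    return [ActuatorCommand("wrist_yaw", PAN_GAIN * dx),
            ActuatorCommand("wrist_pitch", PAN_GAIN * dy)]


def zoom_commands(dz: float) -> list[ActuatorCommand]:
    """Map a zoom request to insertion-axis travel along the shaft."""
    return [ActuatorCommand("insertion", INSERTION_GAIN * dz)]
```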
In some cases, endoscope controls 402 can include MTMs 122. In accordance with some embodiments of the present invention, endoscope controls 402 can include sensors in headrest 130 and can be controlled by the motion of the surgeon's head on headrest 130. Endoscope controls 402 included in headrest 130 are discussed in further detail below. In some embodiments, endoscope controls 402 can include sensors positioned on surgeon's console 120 that track the motion of the surgeon's head.
Endoscope manipulation calculation 404 provides signals to operate actuators 406. Actuators 406 are generally rotary motors housed in arm 108 of patient side cart 100, to which endoscope 112 is attached, and drive instrument interface 150d and arm 108. As discussed above, instrument interface 150d translates the mechanical inputs of actuators 406 into movement of wrist 312 and end effector 206d.
Endoscope controls 402 can also control the light output of illumination 410. Illumination 410 provides light through optical fiber in endoscope 112 in order to illuminate surgical area 210 (
In step 454, the action requested by the surgeon is determined by endoscope controls 402 based on the signals from the one or more sensors. Such actions can include panning the image generated by endoscope 112 or zooming in or out of the image generated by endoscope 112. For example, a detected rotation of the surgeon's face to the right may be interpreted as a request to pan the image to the right while a movement of the surgeon's face into console 120 may be interpreted as a request to zoom into the image.
In step 456, the action requested by the surgeon, as determined in step 454, is translated into actuation signals for actuators 406 that drive endoscope 112 and robot arm 108 to perform the requested action. For example, a zoom request may result in signals that drive robot arm 108 or that zoom using the optics in end effector 206d. A pan request results in activation of wrist 312 in the appropriate direction through instrument interface 150d. In step 458, the actuation signals are applied to actuators 406 to perform the requested action.
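The overall flow of steps 454 through 458 can be pictured as a periodic sense-interpret-actuate loop, sketched below. The four callables are placeholders standing in for the system components described above, not an actual API of the surgical system.

```python
import time


def control_loop(read_sensors, interpret, translate, apply, period_s=0.02):
    """Skeleton of the sense-interpret-actuate cycle (steps 454-458).

    read_sensors: () -> raw sensor sample from the headrest
    interpret:    sample -> requested action          (step 454)
    translate:    action -> actuation signals         (step 456)
    apply:        signals -> None; drives actuators   (step 458)
    All four callables are assumed interfaces for illustration.
    """
    while True:
        sample = read_sensors()
        action = interpret(sample)    # step 454
        signals = translate(action)   # step 456
        apply(signals)                # step 458
        time.sleep(period_s)          # fixed-rate loop, rate assumed
```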
In some cases, headrest 130 can be molded out of foam and covered with, for example, a vinyl covering for both decoration and functionality.
The shape of mounting portion 508 is dependent on the mounting of headrest 130 onto surgeon's console 120. As such, the shape of mounting portion 508 can be as varied as the number of mounting configurations that can be used for attaching headrest 130 onto surgeon's console 120.
In accordance with some embodiments of the present invention, sensors are embedded within or on headrest 130 to allow the surgeon to provide input signals for endoscope controls 402 by motion of the surgeon's head. In some embodiments, for example, a pressure sensor array can be embedded in headrest 130. The pressure sensor array can sense pressure that the surgeon applies to areas of the front surface of forehead rest 502. The pressure data from the pressure sensor array can then be converted into endoscope control data. In some embodiments, a rocker plate can be inserted into headrest 130. The rocker plate can operate, for example, similarly to a joystick so that endoscope control data can be obtained from the motion of the surgeon's head against the front surface of forehead rest 502. In some embodiments, an optical arrangement can be provided to read the movement of a slip plate mounted on headrest 130. The motion of the slip plate is controlled by the surgeon's head motion and can be converted to control data.
In some further embodiments, a face tracker system can be mounted on headrest 130 or directly on surgeon's console 120. The face tracker can be used to track the motion of the surgeon's face and convert that motion to endoscope control data. In some embodiments, an iris tracker system can be included in display 126 that can be used to track the motion of the surgeon's eyes. Depending on the type of viewer in display 126, the iris tracker sensors can be included in the optics or, if the viewer is a video screen, can be mounted on headrest 130 or on surgeon's console 120 so as to track the motion of the surgeon's eyes and convert that motion to endoscope control data.
Some embodiments of the current invention include endoscope controls 402 attached to or within headrest 130. Endoscope controls 402 include sensing techniques that can control some or all of the position and zoom level (optical or digital) of an endoscope 112 in a surgical robotic system. In some embodiments, the sensing techniques can capture a sensor signature in two dimensions to determine the direction of camera movement, and in a third dimension to control the zoom (in/out motion) of the endoscope camera. As such, embodiments of the present invention provide an alternative mode in which the endoscope camera is actively controlled simultaneously with the surgical instruments. Many of these systems are further discussed below. In some embodiments, a sensor input device is mounted into or onto headrest 130 in order to track the surgeon's head motions. The head motion signals are then converted to endoscope control signals in endoscope controls 402.
Pressure sensing array 602 is integrated into headrest 130, which is mounted on surgeon's console 120, within the foam under forehead rest 502, where the surgeon rests his/her forehead. Surgeon's console 120 can then be electrically coupled to pressure sensing array 602 to record the pressure signature of the surgeon's forehead against forehead rest 502.
For example, to move end effector 206d of endoscope 112 such that the image viewed at display 126 is moved to the right, the surgeon can roll their head slightly to the left to create a pressure profile with larger magnitudes on the left-hand side of the array.
In some embodiments, the velocity of the image movement can be a constant, which may be set by a surgeon input elsewhere on surgeon's console 120. In some embodiments, the velocity of the image movement can vary based on the magnitude of the forces within the signature.
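One plausible way to derive a pan direction and a force-dependent velocity from such a pressure signature is a center-of-pressure calculation, sketched below. The deadband, the speed scaling, and the sign convention (the image moves opposite the pressure offset, matching the roll-left-to-pan-right example above) are assumptions for illustration, not the patented method.

```python
import numpy as np

MAX_SPEED = 1.0   # hypothetical full-scale pan speed
DEADBAND = 0.15   # centered pressure inside this radius produces no pan


def pan_from_pressure(pressure: np.ndarray):
    """Estimate pan direction and speed from a forehead pressure image.

    pressure: 2-D array (assumed at least 2x2) of per-cell forces.
    Returns (ux, uy, speed): a unit pan direction in image coordinates
    and a speed that grows as the center of pressure moves off-center.
    """
    total = pressure.sum()
    if total <= 0.0:
        return 0.0, 0.0, 0.0
    rows, cols = pressure.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    # Center of pressure, normalized to [-1, 1] about the array center.
    cx = 2.0 * (pressure * xs).sum() / total / (cols - 1) - 1.0
    cy = 2.0 * (pressure * ys).sum() / total / (rows - 1) - 1.0
    mag = (cx * cx + cy * cy) ** 0.5
    if mag < DEADBAND:
        return 0.0, 0.0, 0.0
    # Pressing toward the left pans the image to the right, so the image
    # moves opposite the center-of-pressure offset (per the example above).
    speed = MAX_SPEED * min(1.0, (mag - DEADBAND) / (1.0 - DEADBAND))
    return float(-cx / mag), float(-cy / mag), speed
```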
In addition to audible feedback, visual feedback, haptic feedback, or other feedback mechanisms can be used to communicate information to the surgeon. Visual feedback, for example, can be provided to the surgeon through display system 126 and may, for example, be a flashing light with a frequency indicating the speed of motion, or may be color coded so that different colors indicate different speeds. Additionally, haptic feedback may be included in headrest 130. For example, haptic feedback in headrest 130 can transmit to the surgeon a vibration whose frequency indicates the speed.
In some embodiments, a pressure profile indicating force perpendicular to the surgeon's forehead can indicate a request for in/out motion of the endoscope 112 (motion along the endoscope shaft 152d), or can control the level of zoom.
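A minimal sketch of such a zoom control follows, assuming a calibrated resting force and a simple proportional mapping in which pressing harder than rest zooms in and easing off zooms out; the calibration value, deadband, and gain are hypothetical.

```python
import numpy as np


def zoom_rate_from_pressure(pressure: np.ndarray, rest_force: float,
                            gain: float = 0.5) -> float:
    """Sketch: map total perpendicular force to a signed zoom rate.

    rest_force: total force measured while the surgeon's head simply
    rests on the headrest (an assumed calibration step; must be > 0).
    Returns a positive rate for zoom in, negative for zoom out.
    """
    excess = pressure.sum() - rest_force
    if abs(excess) < 0.1 * rest_force:  # deadband around the resting force
        return 0.0
    return gain * excess / rest_force
```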
In some embodiments, surgeon headrest 130 can include detectors, for example proximity detectors, that determine the location of the surgeon's head from a fixed point. The fixed point can, for example, represent the tip of the endoscope camera (i.e., the tip of end effector 206d). Movement of the surgeon's head can then control endoscope motion, including image location and zoom.
As discussed above, sensors 702 can be coupled to provide signals for analysis in endoscope controls 402. Endoscope controls 402 can then determine the location and/or the orientation of the surgeon's forehead. There may be any number of sensors 702. Sensors 702 can, for example, be proximity sensors that measure the distance to the surgeon's forehead. For example, a single centered proximity sensor can be used as a zoom control, moving the camera in and out as the surgeon's forehead moves closer to and farther from forehead rest 502. Other sensors can be used to determine side-to-side or up-and-down motions of the surgeon's forehead. Therefore, as the surgeon's head moves, the distance from the fixed point defined by the collection of sensors 702 is measured and used as an input to control the camera. The perpendicular distance from the fixed point can be used to create a relationship between the zoom level and the distance from the fixed point, actively controlling the zoom. For example, as the surgeon's head rolls to the left, sensors 702 on the left of forehead rest 502 may measure closer distances and sensors 702 on the right of forehead rest 502 may measure farther distances. This data can be used by endoscope controls 402 to determine that the surgeon has rolled his/her head to the left, and endoscope 112 can be controlled accordingly.
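A sketch of this proximity-based scheme with three hypothetical sensors (left, center, right) is shown below; the baseline calibration and the sign conventions are assumptions consistent with the description above.

```python
def head_pose_from_proximity(d_left: float, d_center: float,
                             d_right: float, baseline: float):
    """Sketch: derive zoom and roll inputs from three proximity sensors.

    d_left, d_center, d_right: measured distances (m) from sensors on
    forehead rest 502 to the surgeon's forehead.
    baseline: calibrated resting distance (an assumed calibration step).
    Returns (zoom_input, roll_input), both signed and in meters.
    """
    # Center sensor: closer than baseline reads as zoom in, farther as out.
    zoom_input = baseline - d_center
    # Left/right asymmetry: a head rolled left sits closer to the left
    # sensor, so roll_input > 0 is read as "rolled left".
    roll_input = d_right - d_left
    return zoom_input, roll_input
```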
In some embodiments, face tracking can be used to track the surgeon's facial orientation and determine when and how the surgeon's face moves.
In some embodiments, an iris tracking system can be utilized.
To address the safety concern of accidentally moving the camera such that the instruments are outside the field of view, the implementation could constrain the camera motion to a predefined region. The control strategy could also integrate tool tracking techniques to allow arbitrary camera motion as long as the instrument tips stay within the field of view. Tool tracking could also be used to ensure that the camera does not collide with the surgical instruments during motion.
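Both safeguards can be sketched as a filter applied to each proposed camera position, as below. The box-shaped region and the in_view predicate supplied by a tool-tracking module are assumptions for illustration.

```python
def safe_camera_target(current, proposed, region_min, region_max,
                       tool_tips, in_view):
    """Sketch of the two safeguards described above (all names assumed).

    1. Clamp the proposed camera position to a predefined box region.
    2. Veto the move entirely if any tracked tool tip would leave the
       field of view, falling back to the current position.
    current, proposed, region_min, region_max: (x, y, z) tuples.
    tool_tips: iterable of (x, y, z) tool tip positions from tracking.
    in_view: (camera_pos, tip_pos) -> bool, supplied by tool tracking.
    """
    clamped = tuple(min(max(p, lo), hi)
                    for p, lo, hi in zip(proposed, region_min, region_max))
    if all(in_view(clamped, tip) for tip in tool_tips):
        return clamped
    return current
```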
In some embodiments, a clutching mechanism may also be included. For example, embodiments of the present invention may be activated with a foot pedal or by a particular motion of the head. Further, to avoid unintended movement, in some embodiments only particularly large motions may result in active control of endoscope 112.
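A minimal sketch of such a clutch follows, assuming a pedal-style engage flag and a motion-magnitude threshold, both of which are hypothetical parameters.

```python
class HeadClutch:
    """Sketch: gate head-motion control behind an explicit engage action
    (e.g., a foot pedal) and suppress motions below a threshold."""

    def __init__(self, threshold: float = 0.2):
        self.engaged = False      # toggled by pedal or head gesture (assumed)
        self.threshold = threshold

    def passes(self, magnitude: float) -> bool:
        """True only when head input should actively drive the endoscope."""
        return self.engaged and magnitude >= self.threshold
```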
The above detailed description is provided to illustrate specific embodiments of the present invention and is not intended to be limiting. Numerous variations and modifications within the scope of the present invention are possible. The present invention is set forth in the following claims.
Claims
1. An endoscope system, comprising:
- an endoscope;
- a display to display image content captured by the endoscope;
- one or more sensors located in a headrest and configured to detect an input at the headrest, wherein the one or more sensors include a joystick controller mounted in the headrest, the joystick controller comprising a first plate attached to the headrest, a sensor plate, and a controller coupled to the first plate and configured to detect movement of the headrest; and
- a control system configured to: receive one or more sensor signals from the one or more sensors, the one or more sensor signals indicating the movement of the headrest with respect to a support on which the headrest is mounted; and adjust the image content displayed by the display in response to the one or more sensor signals.
2. The endoscope system of claim 1, further comprising an endoscope manipulation calculation module configured to actuate movement of the endoscope in response to the one or more sensor signals.
3. The endoscope system of claim 1, wherein the control system adjusting the image content comprises panning or zooming the image content in the display.
4. The endoscope system of claim 1, wherein a speed of the movement of the headrest is indicated by the one or more sensor signals.
5. The endoscope system of claim 4, wherein the speed is indicated to a user by a frequency of audible clicks.
6. The endoscope system of claim 4, wherein the speed is indicated to a user by a frequency of a flashing light.
7. The endoscope system of claim 4, wherein the speed is indicated to a user by a color coded indicator.
8. An endoscope system comprising:
- an endoscope;
- a display to display image content captured by the endoscope;
- one or more sensors located in a headrest and configured to detect an input at the headrest, wherein the one or more sensors include a slip plate mounted to the headrest, the slip plate communicating with an optical detector positioned to detect motion of the slip plate, wherein the slip plate is positioned to move with a head of a user; and
- a control system configured to: receive one or more sensor signals from the one or more sensors, the one or more sensor signals indicating movement of the headrest with respect to a support on which the headrest is mounted; and adjust the image content displayed by the display in response to the one or more sensor signals.
9. An endoscope system, comprising:
- an endoscope;
- one or more sensors located in a headrest and configured to detect an input at the headrest, wherein the one or more sensors include an array of pressure sensors positioned adjacent to a forehead rest surface of the headrest, each of the pressure sensors corresponding to an area of the forehead rest surface, pressure applied by a user to one or more areas of the forehead rest surface providing input to the array of pressure sensors to generate one or more sensor signals;
- one or more actuators; and
- one or more processors coupled to the one or more sensors and the one or more actuators;
- wherein the one or more processors are configured to: receive the one or more sensor signals from the one or more sensors, the one or more sensor signals indicating pressure applied to the headrest; generate one or more actuation signals based on the one or more sensor signals; and actuate the one or more actuators using the one or more actuation signals to move the endoscope.
10. The endoscope system of claim 9, wherein a speed of movement of the pressure applied by the user is indicated by the one or more sensor signals.
11. The endoscope system of claim 10, wherein the speed is indicated to the user by one or more of a frequency of audible clicks, a frequency of a flashing light, or a color coded indicator.
12. The endoscope system of claim 9, further comprising:
- a clutch mechanism configured to activate and deactivate the one or more actuators.
References Cited

U.S. Patent Documents:

5911036 | June 8, 1999 | Wright et al.
6120433 | September 19, 2000 | Mizuno
6714841 | March 30, 2004 | Wright et al.
10265057 | April 23, 2019 | Herzlinger et al.
20060100642 | May 11, 2006 | Yang
20060178559 | August 10, 2006 | Kumar et al.
20080294300 | November 27, 2008 | Ashmore
20090127899 | May 21, 2009 | Maguire, Jr.
20090248036 | October 1, 2009 | Hoffman
20110238079 | September 29, 2011 | Hannaford et al.
20110282140 | November 17, 2011 | Itkowitz et al.
20120069166 | March 22, 2012 | Kunz et al.
20130038707 | February 14, 2013 | Cunningham
20140024889 | January 23, 2014 | Xiaoli
20140031001 | January 30, 2014 | Jacobsen
20150342442 | December 3, 2015 | Tadano
20190209145 | July 11, 2019 | Herzlinger et al.
Foreign Patent Documents:

2000218575 | August 2000 | JP
20110049703 | May 2011 | KR
Other References:

- International Search Report and Written Opinion for Application No. PCT/US14/50217, dated Nov. 19, 2014, 12 pages (ISRG04790/PCT).
- Machine Translation for KR20110049703A.
- Vertut, Jean and Phillipe Coiffet, Robot Technology: Teleoperation and Robotics Evolution and Development, English translation, Prentice-Hall, Inc., Englewood Cliffs, NJ, USA, 1986, vol. 3A, 332 pages.
Patent History

Type: Grant
Filed: Jan 7, 2021
Date of Patent: Apr 9, 2024
Patent Publication Number: 20210186303
Assignee: INTUITIVE SURGICAL OPERATIONS, INC. (Sunnyvale, CA)
Inventors: Peter M. Herzlinger (Saratoga, CA), Govinda Payyavula (Sunnyvale, CA), Brian E. Miller (Los Gatos, CA)
Primary Examiner: Ryan N Henderson
Assistant Examiner: Pamela F Wu
Application Number: 17/143,672
International Classification: A61B 1/00 (20060101); A61B 17/00 (20060101); A61B 34/30 (20160101); A61B 34/37 (20160101); G06F 3/01 (20060101); G06F 3/0338 (20130101); G06F 3/041 (20060101); A61B 34/00 (20160101); A61B 90/50 (20160101); A61B 90/60 (20160101); G06F 3/0485 (20220101);