Physical medical element placement systems
An exemplary system is configured to obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient, and determine, based on the anatomical characteristic data, a placement guidance parameter set. The placement guidance parameter set may include one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
The present application is a U.S. National Stage Application under 35 U.S.C. § 371 of International Application No. PCT/US2020/054297, filed on Oct. 5, 2020, which claims priority to U.S. Provisional Patent Application No. 62/911,853, filed on Oct. 7, 2019, the contents of which are hereby incorporated by reference in their entirety.
BACKGROUND INFORMATION

A computer-assisted surgical system is often used to perform a hernia repair procedure within a patient. As part of the hernia repair procedure, a mesh patch may be placed over the hernia and attached (e.g., sutured) to tissue surrounding the hernia. The mesh patch may provide support for the damaged tissue as the tissue heals.
During a hernia repair procedure, a surgeon interacting with the computer-assisted surgical system must determine an appropriate size for the mesh patch. Once the mesh patch has been sized (e.g., by cutting the mesh patch out of a mesh material), the surgeon must place the mesh patch at an appropriate location within the patient. These operations may be time-intensive and tedious.
SUMMARY

The following description presents a simplified summary of one or more aspects of the systems and methods described herein. This summary is not an extensive overview of all contemplated aspects and is intended neither to identify key or critical elements of all aspects nor to delineate the scope of any or all aspects. Its sole purpose is to present one or more aspects of the systems and methods described herein as a prelude to the detailed description that is presented below.
An exemplary system includes a memory storing instructions; and a processor communicatively coupled to the memory and configured to execute the instructions to: obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and determine, based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
An exemplary system includes a memory storing instructions; and a processor communicatively coupled to the memory and configured to execute the instructions to: obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; determine, based on the anatomical characteristic data, a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface; and graphically indicate, within an image of the internal space, the suturing plan.
An exemplary method includes obtaining, by a medical element management system, anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and determining, by the medical element management system based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
An exemplary method includes obtaining, by a medical element management system, anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; determining, by the medical element management system based on the anatomical characteristic data, a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface; and graphically indicating, by the medical element management system within an image of the internal space, the suturing plan.
An exemplary non-transitory computer-readable medium stores instructions that, when executed, direct a processor of a computing device to: obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and determine, based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
An exemplary non-transitory computer-readable medium stores instructions that, when executed, direct a processor of a computing device to: obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; determine, based on the anatomical characteristic data, a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface; and graphically indicate, within an image of the internal space, the suturing plan.
The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.
Physical medical element positioning systems and methods are described herein. As described herein, an exemplary medical element management system may obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element (e.g., a mesh patch configured to cover a hernia) and determine, based on the anatomical characteristic data, a placement guidance parameter set. As described herein, the placement guidance parameter set may include one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
The systems and methods described herein advantageously provide guidance during a placement of a physical medical element over an anatomical surface within the body during a medical procedure. Such guidance may assist a user in positioning the physical medical element on the anatomical surface and/or in attaching (e.g., suturing) the physical medical element to the anatomical surface. In some alternative examples, as described herein, the guidance may facilitate automatic placement of the physical medical element on the anatomical surface. Moreover, the systems and methods described herein may minimize an amount of time required to place the physical medical element on the anatomical surface, which may be beneficial to the patient and to a surgical team involved in placing the physical medical element on the anatomical surface. Additionally or alternatively, the systems and methods described herein may minimize a variance of spacing between sutures used to place the physical medical element on the anatomical surface. Additionally or alternatively, the systems and methods described herein may account for three-dimensional contours of the anatomical surface (e.g., by using depth data) when providing the guidance during the placement of the physical medical element over the anatomical surface.
These and other advantages and benefits of the systems and methods described herein will be made apparent herein.
As used herein, a physical medical element refers to any element foreign to a patient's body that is configured to be placed on and cover an anatomical surface within the patient's body. For example, a physical medical element may be implemented by a patch (e.g., a mesh patch) configured to cover a tissue defect (e.g., a hernia, cut, or other type of lesion) within the patient. Other examples of physical medical elements that may be used in connection with the systems and methods described herein include, but are not limited to, gauze, bandages, plates, prostheses, etc. A physical medical element may be placed on an anatomical surface in any suitable manner. For example, the physical medical element may be sutured, anchored, or otherwise affixed to the anatomical surface.
Storage facility 102 may maintain (e.g., store) executable data used by processing facility 104 to perform one or more of the operations described herein. For example, storage facility 102 may store instructions 106 that may be executed by processing facility 104 to perform one or more of the operations described herein. Instructions 106 may be implemented by any suitable application, software, code, and/or other executable data instance. Storage facility 102 may also maintain any data received, generated, managed, used, and/or transmitted by processing facility 104.
Processing facility 104 may be configured to perform (e.g., execute instructions 106 stored in storage facility 102 to perform) various operations described herein.
For example, processing facility 104 may be configured to instruct a display device to render a virtual medical element representative of a physical medical element over a target region within an image of an internal space of a patient, the target region depicting an anatomical surface to be covered by the physical medical element; receive, while the virtual medical element is rendered over the target region, user input that sets at least one of a pose of the virtual medical element within the image and a size of the virtual medical element; and determine, based on the user input and on depth data representative of a depth map for the internal space, physical dimensions for the physical medical element. As described herein, the physical dimensions may define a surface area for the physical medical element.
As another example, processing facility 104 may be configured to access image data representative of an image acquired by an imaging device and depicting an internal space of a patient; obtain depth data representative of a depth map for the internal space depicted in the image acquired by the imaging device; identify, based on the image data and the depth data, a target region within the image that depicts an anatomical surface to be covered by a physical medical element; and instruct a display device to render a virtual medical element representative of the physical medical element over the identified target region within the image.
As another example, processing facility 104 may be configured to obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and determine, based on the anatomical characteristic data, a placement guidance parameter set. As described herein, the placement guidance parameter set may include one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system.
As another example, processing facility 104 may be configured to obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; determine, based on the anatomical characteristic data, a suturing plan (or any other type of affixation plan) for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface; and graphically indicate, within an image of the internal space, the suturing plan.
These and other operations that may be performed by system 100 (e.g., processing facility 104) are described herein.
Target region data 206 may be in any suitable format. For example, target region data 206 may include two or three-dimensional pixel coordinates representative of pixels that depict the anatomical surface that is to be covered by the physical medical element.
Exemplary manners in which image data 202 and depth data 204 may be generated will now be described.
Imaging device 302 may be implemented by an endoscope or other camera device configured to capture images of a scene. In some examples, imaging device 302 may be configured to be attached to and controlled by a computer-assisted surgical system. In alternative examples, imaging device 302 may be hand-held and operated manually by an operator (e.g., a surgeon).
In some examples, the scene captured by imaging device 302 may include a surgical area associated with a patient. The surgical area may, in certain examples, be entirely disposed within the patient and may include an area within the patient at or near where a surgical procedure is planned to be performed, is being performed, or has been performed. For example, for a minimally invasive surgical procedure being performed on tissue internal to a patient, the surgical area may include the tissue, anatomy underlying the tissue, as well as space around the tissue where, for example, surgical instruments used to perform the surgical procedure are located. In certain example implementations, the surgical area entirely disposed within the patient may be referred to as an “internal space.” As described herein, any internal anatomy of the patient (e.g., vessels, organs, and/or tissue) and/or surgical instruments located in the internal space may be referred to as objects and/or structures.
Visible light camera 304 (“camera 304”) is configured to generate image data 202 representative of a two-dimensional visible light image of a scene. Camera 304 may be implemented by any suitable image sensor, such as a charge coupled device (“CCD”) image sensor, a complementary metal-oxide semiconductor (“CMOS”) image sensor, a hyperspectral camera, a multispectral camera, or the like.
Depth sensor 306 may be implemented by any suitable sensor configured to generate depth data 204. For example, depth sensor 306 may be implemented by a time-of-flight sensor, a structured light sensor, an interferometer, a hyperspectral camera, a multispectral camera, and/or any other suitable sensor configured to acquire depth data as may serve a particular implementation. In cases where depth sensor 306 is implemented by a time-of-flight sensor, the time-of-flight sensor may be implemented by one or more photodetectors (e.g., one or more single photon avalanche diode (“SPAD”) detectors), CCD sensors, CMOS sensors, and/or any other suitable configuration.
In configuration 300, system 100 may obtain image data 202 by directing camera 304 to acquire image data 202 and receiving image data 202 from camera 304. Likewise, system 100 may obtain depth data 204 by directing depth sensor 306 to acquire depth data 204 and receiving depth data 204 from depth sensor 306.
To this end, in configuration 300, system 100 is communicatively coupled to imaging device 302 by way of a bidirectional communication link 308 and to an illumination system 310 by way of a communication link 312. Communication links 308 and 312 may each be implemented by any suitable wired and/or wireless communication medium as may serve a particular implementation. System 100 may use communication links 308 and 312 to direct camera 304 and depth sensor 306 to acquire image data 202 and depth data 204 and receive image data 202 and depth data 204, as described herein.
Illumination system 310 may be configured to emit light 314 (e.g., at the direction of system 100) used to illuminate a scene to be imaged by imaging device 302. The light 314 emitted by illumination system 310 may include visible light and/or non-visible light (e.g., infrared light). As shown, light 314 may travel to the scene through imaging device 302 (e.g., by way of an illumination channel within imaging device 302 that may be implemented by one or more optical fibers, light guides, lenses, etc.).
As shown, light 314 emitted by illumination system 310 may reflect off a surface 316 within a scene being imaged by imaging device 302. In cases where imaging device 302 is aimed at an internal space of the patient, surface 316 represents a surface (e.g., an anatomical surface) within the internal space.
Camera 304 and depth sensor 306 may each detect the reflected light 314. Camera 304 may be configured to generate, based on the detected light, image data 202 representative of a two-dimensional visible light image of the scene including surface 316. Depth sensor 306 may be configured to generate, based on the detected light, depth data 204. Image data 202 and depth data 204 may each have any suitable format.
To generate a stereoscopic image of a scene, system 100 may direct illumination system 310 to emit light 314. System 100 may also activate (e.g., turn on) visible light camera 304 and depth sensor 306. Light 314 travels to the scene and reflects off of surface 316 (and, in some examples, one or more other surfaces in the scene). Camera 304 and depth sensor 306 both detect the reflected light 314.
Camera 304 (and/or other circuitry included in imaging device 302) may generate, based on detected light 314, image data 202 representative of a two-dimensional visible light image of the scene. This may be performed in any suitable manner. Visible light camera 304 (and/or other circuitry included in imaging device 302) may transmit image data 202 to system 100. This may also be performed in any suitable manner.
Depth sensor 306 may generate, based on detected light 314, depth data 204 representative of a depth map of the scene (e.g., a depth map of surface 316). This may be performed in any suitable manner. For example, depth sensor 306 may measure an amount of time that it takes for a photon of light 314 to travel from illumination system 310 to depth sensor 306. Based on this amount of time, depth sensor 306 may determine a depth of surface 316 relative to a position of depth sensor 306. Data representative of this depth may be represented in depth data 204 in any suitable manner. For example, the depth map represented by depth data 204 may include an array of depth values (e.g., Z-buffer values) corresponding to each pixel in an image.
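The time-of-flight computation described above can be sketched as follows. This is an illustrative single-pulse model, not the patented implementation; the function names are assumptions:

```python
# Sketch: converting a time-of-flight measurement into a depth value.
# A ToF sensor measures the round-trip time of a photon, so the depth
# of the reflecting surface is half the round-trip distance.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return the depth in meters implied by a photon round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def depth_map_from_times(times):
    """Convert a 2-D array (list of rows) of per-pixel round-trip times
    into a depth map with the same layout, one depth value per pixel."""
    return [[depth_from_time_of_flight(t) for t in row] for row in times]
```

For instance, a round trip of 10 nanoseconds implies a surface roughly 1.5 meters from the sensor. A real sensor would typically measure phase shift of a modulated signal rather than a single photon's travel time, but the depth conversion is the same in principle.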
Depth sensor 306 (and/or other circuitry included in imaging device 302) may transmit depth data 204 to system 100. This may be performed in any suitable manner.
System 100 may receive image data 202 and depth data 204 and perform one or more processing operations on image data 202 and depth data 204. For example, as will be described in more detail below, system 100 may generate target region data 206 based on image data 202 and depth data 204.
As another example, system 100 may generate, based on image data 202 and depth data 204, a right-side perspective image of the scene and a left-side perspective image of the scene. This may be performed in any suitable manner. System 100 may then direct display devices to concurrently display the right and left-side perspective images in a manner that forms a stereoscopic image of the scene. In some examples, the display devices are included in and/or communicatively coupled to the computer-assisted surgical system.
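One hypothetical way to synthesize such a perspective pair from a single image plus a depth map is depth-image-based rendering under a pinhole-camera model, where per-pixel disparity follows d = f·b / Z. All names, parameters, and the simple forward-warping scheme below are illustrative assumptions, not details from the source:

```python
# Sketch of depth-image-based rendering for one image row: each pixel is
# shifted horizontally by half its disparity in opposite directions to
# form the left-eye and right-eye views. Occlusion handling and hole
# filling are omitted for brevity.

def disparity(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Pinhole-model disparity in pixels: d = f * b / Z."""
    return focal_px * baseline_m / depth_m

def stereo_pair_row(row, depths, focal_px, baseline_m, fill=0):
    """Warp one image row into (left, right) rows using per-pixel depth."""
    width = len(row)
    left = [fill] * width
    right = [fill] * width
    for x, (value, z) in enumerate(zip(row, depths)):
        shift = disparity(focal_px, baseline_m, z) / 2.0
        xl, xr = int(round(x + shift)), int(round(x - shift))
        if 0 <= xl < width:
            left[xl] = value
        if 0 <= xr < width:
            right[xr] = value
    return left, right
```

Pixels at large depth shift negligibly, so distant background appears identical in both views, while near pixels diverge and produce the stereoscopic effect.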
Other configurations of imaging device 302 are possible in accordance with the systems and methods described herein. For example, imaging device 302 may include multiple cameras 304 and/or multiple depth sensors 306. To illustrate, imaging device 302 may include two cameras 304 in combination with a separate depth sensor 306. In these embodiments, depth data may be generated based on the images acquired by both cameras 304. Depth data generated by depth sensor 306 may be used to fine-tune or otherwise enhance the depth data generated based on the images acquired by both cameras 304.
In some examples, system 100 may obtain depth data 204 by accessing a preoperative image registered to the image from a source other than imaging device 302.
System 100 may identify the target region based on image data 202 and depth data 204 in any suitable manner. For example, based on image data 202 and depth data 204, system 100 may identify a region within the image represented by image data 202 that depicts tissue in need of being covered by the physical medical element and then designate the identified region as the target region.
System 100 may identify a region of an image that depicts tissue in need of being covered by the physical medical element in any suitable manner. For example, based on image data 202 and depth data 204, system 100 may segment the image (e.g., by classifying different portions of the image as corresponding to different types of tissue) and identify the region based on the segmentation.
Additionally or alternatively, system 100 may identify a region that depicts tissue in need of being covered by the physical medical element by inputting image data 202 and depth data 204 into a machine learning model configured to identify a tissue anomaly. The machine learning model may be trained and/or used in any suitable manner.
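A minimal stand-in for the region-identification step might look like the following, assuming a segmentation or anomaly-detection model has already produced a per-pixel defect score; the threshold value and function names are illustrative, not from the source:

```python
# Sketch: turn per-pixel defect scores (e.g., classifier outputs in
# [0, 1]) into the set of pixel coordinates designated as the target
# region that depicts tissue in need of being covered.

def identify_target_region(scores, threshold=0.5):
    """Return (row, col) coordinates whose defect score exceeds threshold."""
    return [
        (r, c)
        for r, row in enumerate(scores)
        for c, s in enumerate(row)
        if s > threshold
    ]
```

The returned coordinate list corresponds to the pixel-coordinate format described for target region data above; a production system would likely also apply connected-component filtering so that isolated noisy pixels are not designated as defects.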
In some examples, system 100 may determine a stage within a surgical procedure being performed with respect to the patient and further base the identification of the region that depicts tissue in need of being covered by the physical medical element on the determined stage. For example, system 100 may receive input from a surgeon that the surgeon is attempting to locate a tissue defect within the patient by scanning an internal space of the patient with an imaging device. Based on this input, system 100 may initiate a region identification heuristic that utilizes image data 202 and depth data 204 to automatically identify a region within the image acquired by the imaging device that depicts tissue in need of being covered by the physical medical element.
Once the target region that depicts an anatomical surface to be covered by the physical medical element is identified, system 100 may instruct a display device to render a virtual medical element representative of the physical medical element over the identified target region within the image. This may be performed in any suitable manner.
System 100 may, based on image data 202 and/or depth data 204, identify a target region within image 602 that depicts an anatomical surface that is to be covered by a physical medical element. This may be performed in any of the ways described herein. In some examples, the anatomical surface to be covered by the physical medical element includes at least a portion of tissue defect 604. The anatomical surface to be covered by the physical medical element may, in some instances, also include at least a portion of non-defective tissue 606. For example, the anatomical surface to be covered by the physical medical element may include the entire tissue defect 604 and an overlap region made up of non-defective tissue 606 surrounding tissue defect 604. The overlap region may have any suitable width (e.g., between one and five centimeters), and may be used to attach the physical medical element to the anatomical surface, as described herein.
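The defect-plus-overlap region described above can be sketched as a pixel-set dilation. The margin in pixels would in practice be derived from the desired overlap width in centimeters and the camera calibration; both, along with the function names, are assumptions here:

```python
# Sketch: grow the defect region by a fixed pixel margin (Chebyshev
# distance) to approximate the overlap region of non-defective tissue
# surrounding the defect that the physical medical element must cover.

def with_overlap(defect_pixels, margin_px, height, width):
    """Return the defect pixels plus all in-bounds pixels within
    margin_px of any defect pixel."""
    defect = set(defect_pixels)
    covered = set()
    for (r, c) in defect:
        for dr in range(-margin_px, margin_px + 1):
            for dc in range(-margin_px, margin_px + 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < height and 0 <= cc < width:
                    covered.add((rr, cc))
    return covered
```

Square (Chebyshev) dilation is chosen only for brevity; a circular structuring element would give a more uniform overlap width around an irregular defect.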
Tissue defect 604 and the non-defective tissue 606 may have varying relative depths. For example, tissue defect 604 and the non-defective tissue surrounding tissue defect 604 may have various ridges, peaks, valleys, and/or otherwise uneven surfaces. However, such variations in depth may not be visually ascertainable within image 602, especially if image 602 is two-dimensional. Hence, as described herein, system 100 may take such depth variations into account when determining the physical dimensions of the physical medical element that will cover tissue defect 604 and the overlap region made up of non-defective tissue 606 surrounding tissue defect 604.
System 100 may instruct the display device to render a virtual medical element representative of the physical medical element over the identified target region within image 602.
While virtual medical element 702 is illustrated as being a dashed rectangle positioned over tissue defect 604 and a portion of non-defective tissue 606, it will be recognized that virtual medical element 702 may alternatively be rendered in any other suitable manner. For example, virtual medical element 702 may be at least partially transparent to allow visualization by the user of the target region and/or tissue defect 604 while virtual medical element 702 is rendered over the target region.
While virtual medical element 702 is rendered over the target region, a user may provide user input that sets a pose and/or size of virtual medical element 702. For example, if the user determines that the initially determined pose and size of virtual medical element 702 are adequate (e.g., if virtual medical element 702 sufficiently covers the depiction of tissue defect 604), the user may provide user input that confirms that the pose and size of virtual medical element 702 are correct. Such user input may be provided in any suitable manner. For example, such user input may be provided by the user selecting an option displayed in image 602, by the user selecting a user input button on a component of a computer-assisted surgical system, by the user providing a verbal command, and/or in any other manner.
In some cases, the user may determine that the initially determined pose and/or size of virtual medical element 702 needs further refinement. For example, the user may desire to enlarge or shrink virtual medical element 702, reposition one or more edges of virtual medical element 702, and/or otherwise adjust a pose and/or size of virtual medical element 702. In these cases, the user input that sets the pose and/or size of virtual medical element 702 may include one or more user input commands that adjust the pose and/or size of virtual medical element 702. Such user input may be provided in any suitable manner. For example, the user input may be provided by the user interacting with virtual handles displayed as part of virtual medical element 702, providing one or more input commands by way of a graphical user interface within which image 602 is included, providing one or more keyboard or other input device commands, and/or in any other suitable manner. System 100 may dynamically adjust, in response to receiving the user input that adjusts the pose and/or size of virtual medical element 702, the rendering of virtual medical element 702 in substantially real time to depict at least one of the pose and the size of the virtual medical element as adjusted by the user input.
In some examples, the initial shape of virtual medical element 702 is a standard shape used for a particular type of physical medical element. For example, the initial shape of virtual medical element 702 may be rectangular.
Alternatively, the initial shape of virtual medical element 702 may conform more closely with the actual shape of the anatomical surface to be covered by the physical medical element.
The initial shape of virtual medical element 702 may be determined automatically based on one or more attributes of the physical medical element that is to cover the anatomical surface. Additionally or alternatively, the initial shape of virtual medical element 702 may be specified by a user by providing one or more user input commands.
Based on the user input that sets the pose and/or size of virtual medical element 702 and on depth data 204, system 100 may determine physical dimensions for the physical medical element that is to cover the anatomical surface.
System 100 may generate physical dimensions data 1004 based on virtual data 1002 and depth data 204 in any suitable manner. For example, system 100 may identify, based on the pose and the size of the virtual medical element as represented by virtual data 1002, a plurality of pixels within image 602 that are covered by virtual medical element 702. System 100 may determine a two-dimensional pixel area of the plurality of pixels. System 100 may determine, based on depth data 204, depth values for each of the plurality of pixels that are covered by virtual medical element 702. Based on the two-dimensional pixel area and the depth values, system 100 may determine a surface area of the anatomical surface to be covered by the physical medical element. System 100 may base the physical dimensions of the physical medical element on the surface area of the anatomical surface. Each of these operations may be performed in any suitable manner.
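Under a pinhole-camera assumption (not stated in the source), the pixel-area-plus-depth computation above can be sketched as follows: a pixel at depth Z meters, viewed with a focal length of f pixels, covers roughly (Z / f)² square meters of a fronto-parallel surface, and summing that footprint over the covered pixels estimates the anatomical surface area. The names and the fronto-parallel simplification are illustrative assumptions:

```python
# Sketch: estimate the physical surface area covered by the virtual
# medical element by summing each covered pixel's physical footprint,
# which grows with the square of its depth.

def covered_surface_area(covered_pixels, depth_map, focal_px):
    """Estimate the physical area (m^2) of the covered anatomical surface.

    covered_pixels: iterable of (row, col) pixels under the virtual element
    depth_map: 2-D array of per-pixel depths in meters
    focal_px: camera focal length expressed in pixels
    """
    area = 0.0
    for (r, c) in covered_pixels:
        z = depth_map[r][c]
        area += (z / focal_px) ** 2  # per-pixel footprint at depth z
    return area
```

A contoured surface tilted away from the camera covers more physical area per pixel than this fronto-parallel estimate, so a fuller implementation would apply a slope correction derived from the local depth gradient; that step is omitted here.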
Because of the variations in depth that the anatomical surface may have, the surface area for the physical medical element as defined by the physical dimensions of virtual medical element 702 may, in some cases, be greater than a physical area represented by the two-dimensional pixel area. For example, the physical medical element may be made out of a material configured to “shrink wrap” or otherwise adhere to all of the surface area of the anatomical surface. In this example, if the anatomical surface has a protruding ridge that needs to be covered by the physical medical element, this variation in depth may result in the surface area defined by the physical dimensions of the physical medical element as determined by system 100 being larger than a physical area represented by the two-dimensional pixel area of virtual medical element 702.
System 100 may determine the physical dimensions of the physical medical element based on deformation data 1102 in any suitable manner. For example, if deformation data 1102 indicates that the physical medical element is relatively resistant to stretching, system 100 may specify that the physical dimensions are to be relatively smaller than a different type of physical medical element that exhibits a high degree of stretching.
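One hedged way to fold deformation data into the sizing step is a simple stretch factor, where 1.0 means the material does not stretch at all and larger values permit a proportionally smaller cut piece. The function, its parameters, and the rectangular-element assumption are all illustrative, not from the source:

```python
# Sketch: derive rectangular cut dimensions from the required surface
# area, an aspect ratio, and a material stretch factor. Linear
# dimensions scale with the square root of the area ratio.

import math

def physical_dimensions(required_area_m2, aspect_ratio, stretch_factor=1.0):
    """Return (width_m, height_m) of a rectangular element whose
    stretched area covers required_area_m2.

    stretch_factor 1.0 = stretch-resistant material (cut to full size);
    values > 1.0 = stretchier material (cut piece may be smaller).
    """
    cut_area = required_area_m2 / stretch_factor
    width = math.sqrt(cut_area * aspect_ratio)
    height = cut_area / width
    return width, height
```

For a stretch-resistant mesh (factor 1.0) requiring 0.02 m² at a 2:1 aspect ratio, this yields a 0.2 m × 0.1 m piece; a stretchier material with factor 2.0 would halve the cut area.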
System 100 may obtain deformation data 1102 in any suitable manner. For example, system 100 may maintain or access a database that includes deformation data 1102 for various types of physical medical elements. System 100 may ascertain which particular physical medical element is to be used to cover the anatomical surface by receiving user input indicating which physical medical element is to be used, automatically determining which physical medical element is to be used based on tracked tendencies of a particular user and/or a particular type of surgical procedure being performed, and/or in any other suitable manner.
System 100 may output data representative of the physical dimensions in any suitable manner. For example, system 100 may output data representative of the physical dimensions by displaying the physical dimensions within image 602 or within any other suitable user interface. Additionally or alternatively, system 100 may output data representative of the physical dimensions by outputting physical measurements (e.g., a width and length in centimeters) of the physical medical element. Additionally or alternatively, system 100 may output data representative of the physical dimensions by outputting data representative of a pattern to be used to cut the physical medical element out of a material.
In some examples, system 100 may project virtual medical element 702 onto a physical material from which the physical medical element is to be cut. In this manner, virtual medical element 702 may guide a user in cutting the physical medical element out of the physical material.
Once a physical medical element is ready to be introduced into the patient, system 100 may be configured to provide placement guidance configured to guide and/or assist a user in placing the physical medical element on the anatomical surface. This placement guidance may be provided in a number of different ways.
Anatomical characteristic data 1302 may be representative of one or more characteristics associated with the anatomical surface to be covered by the physical medical element. For example, anatomical characteristic data 1302 may be representative of a type of tissue that constitutes the anatomical surface, a size of the anatomical surface, and/or a location of the anatomical surface within the patient. These characteristics may affect the way in which the physical medical element is placed on the anatomical surface. For example, if the anatomical surface is located in a vicinity of an organ, system 100 may generate one or more parameters for inclusion in placement guidance parameter set 1304 that indicate that the anatomical surface is near the organ and that may be used to identify a suturing plan for attaching the physical medical element to the anatomical surface in a manner that does not damage or otherwise affect the organ. As another example, if the tissue that constitutes the anatomical surface is relatively weak, system 100 may generate one or more parameters for inclusion in placement guidance parameter set 1304 that may be used to increase the relative number of sutures used to attach the physical medical element to the anatomical surface.
System 100 may obtain anatomical characteristic data 1302 in any suitable manner. For example, system 100 may obtain anatomical characteristic data 1302 by accessing image data 202, obtaining depth data 204, and determining, based on image data 202 and depth data 204, anatomical characteristic data 1302. To illustrate, image data 202 and depth data 204 may be used to determine a location of the anatomical surface, a size of the anatomical surface, and/or any other characteristic of the anatomical surface.
In some examples, system 100 may use image data 202 and depth data 204 to segment the image represented by image data 202. The segmentation may include classifying different portions of the image as corresponding to different items (e.g., types of tissue). Based on the segmentation, system 100 may determine one or more characteristics of the anatomical surface, such as tissue type, etc.
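The segmentation-to-characteristics step described above can be sketched as a simple summary over a labeled region. This is an illustrative sketch, not the system's implementation: the integer label map stands in for the output of a semantic segmentation step, and the returned fields are hypothetical examples of the characteristics that could feed anatomical characteristic data 1302.

```python
import numpy as np

def surface_characteristics(label_map, depth_map, surface_label):
    """Summarize a segmented anatomical surface.

    label_map assigns a hypothetical integer class per pixel (e.g., the
    output of a segmentation step); depth_map gives per-pixel depth.
    Returns pixel count, image-space centroid, and mean depth of the
    region labeled surface_label.
    """
    mask = label_map == surface_label
    ys, xs = np.nonzero(mask)
    return {
        "pixel_count": int(mask.sum()),
        "centroid_xy": (float(xs.mean()), float(ys.mean())),
        "mean_depth": float(depth_map[mask].mean()),
    }

# Example: a 4 x 4 region labeled 2 inside an 8 x 8 image
labels = np.zeros((8, 8), dtype=int)
labels[2:6, 2:6] = 2
depth = np.full((8, 8), 75.0)
info = surface_characteristics(labels, depth, surface_label=2)
```

The pixel count and mean depth could then be combined (as in the area computation described earlier) to estimate the size of the anatomical surface, while the centroid approximates its location within the image.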
Element characteristic data 1402 is representative of one or more characteristics associated with the physical medical element that will cover the anatomical surface. For example, element characteristic data 1402 may be representative of physical dimensions of the physical medical element, a type of material used for the physical medical element, a tensile strength of the physical medical element, and/or a deformation characteristic of the physical medical element.
System 100 may access element characteristic data 1402 in any suitable manner. For example, system 100 may maintain or access a database that includes element characteristic data 1402. As another example, system 100 may access element characteristic data 1402 by receiving user input representative of the element characteristic data 1402.
In some examples, placement guidance parameter set 1304 includes one or more parameters configured to specify an optimal orientation for the physical medical element as the physical medical element is being placed on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system. In these examples, system 100 may indicate the optimal orientation for the physical medical element to a user in any suitable manner.
For example, system 100 may be configured to indicate the optimal orientation for the physical medical element to a user by graphically indicating, within an image (e.g., image 602) of the internal space, the optimal orientation.
As shown, physical medical element 1502 is being placed on the anatomical surface by surgical instruments 1504-1 and 1504-2 (“surgical instruments 1504”). Surgical instruments 1504 may include any suitable grasping tool configured to hold and guide physical medical element 1502 into place. In some examples, surgical instruments 1504 are controlled by a computer-assisted surgical system (e.g., in response to user input commands provided by a surgeon or other user).
As shown, system 100 may render virtual medical element 702 within image 602 while the user is using surgical instruments 1504 to place physical medical element 1502 on the anatomical surface. In this configuration, virtual medical element 702 graphically indicates the optimal orientation of physical medical element 1502. In addition, graphical arrows 1506-1 and 1506-2 may be rendered by system 100 within image 602 to indicate a direction in which the user should rotate physical medical element 1502 to arrive at the optimal orientation. System 100 may additionally or alternatively present any other suitable virtual, audible, and/or graphical assistance indicative of the optimal orientation of physical medical element 1502 as may serve a particular implementation.
In some examples, placement guidance parameter set 1304 includes one or more parameters configured to specify an optimal insertion path within the internal space that surgical instruments 1504 are to follow while bringing physical medical element 1502 from outside the patient to being in contact with the anatomical surface. In these examples, system 100 may indicate the optimal insertion path for physical medical element 1502 to a user in any suitable manner. For example, system 100 may be configured to indicate the optimal insertion path for the physical medical element by graphically indicating, within an image (e.g., image 602) of the internal space, the optimal insertion path. System 100 may additionally or alternatively present any other suitable virtual, audible, and/or graphical assistance indicative of the optimal insertion path as may serve a particular implementation.
The optimal insertion path specified by the one or more parameters in placement guidance parameter set 1304 may be defined such that surgical instruments 1504 and/or physical medical element 1502 avoid collision with tissue and/or other objects (e.g., other surgical instruments) in the internal space while physical medical element 1502 is brought from outside the patient to being in contact with the anatomical surface.
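A very reduced form of the collision-avoidance check behind such a path can be sketched as follows. This is an illustrative sketch only: the spherical obstacles, straight-line path, and discrete sampling are simplifying assumptions, whereas a real planner would account for swept instrument geometry and deformable tissue.

```python
import numpy as np

def path_is_clear(start, end, obstacles, samples=100):
    """Check a candidate straight-line insertion path against obstacles.

    obstacles is a list of hypothetical (center, radius) spheres
    standing in for tissue or other instruments. The path is sampled
    at discrete points, so a sufficiently fine `samples` value is
    assumed.
    """
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    for t in np.linspace(0.0, 1.0, samples):
        point = start + t * (end - start)
        for center, radius in obstacles:
            if np.linalg.norm(point - np.asarray(center, dtype=float)) < radius:
                return False  # path passes through an obstacle
    return True

# A path straight through a sphere at the origin is rejected;
# a path offset well to the side is accepted.
blocked = path_is_clear((-10, 0, 0), (10, 0, 0), [((0, 0, 0), 2.0)])
clear = path_is_clear((-10, 8, 0), (10, 8, 0), [((0, 0, 0), 2.0)])
```

A planner built on such a check might generate several candidate paths and select the shortest one for which the check passes, which is one plausible way to arrive at the insertion path parameters described above.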
In some examples, placement guidance parameter set 1304 includes one or more parameters configured to specify a suturing plan for suturing physical medical element 1502 to the anatomical surface while the surgical instruments 1504 hold physical medical element 1502 in place on the anatomical surface. The suturing plan may include information specifying a suture pattern to be used to suture physical medical element 1502 to the anatomical surface, a spacing to be used between sutures used to suture physical medical element 1502 to the anatomical surface, a type of thread to be used to suture physical medical element 1502 to the anatomical surface, a length of thread needed to suture physical medical element 1502 to the anatomical surface, and/or any other aspect of a suturing plan as may serve a particular implementation. While exemplary suturing plans are described herein, it will be recognized that placement guidance parameter set 1304 may alternatively include one or more parameters configured to specify any other type of affixation plan that may be used to affix physical medical element 1502 to the anatomical surface.
By way of example, an exemplary suturing plan that may be specified by one or more parameters in placement guidance parameter set 1304 may indicate a recommended suture pattern, suture spacing, thread type, and/or thread length for a particular type of tissue as indicated in anatomical characteristic data 1302, a particular type and/or size of tissue defect 604 as indicated in anatomical characteristic data 1302, a proximity of other objects (e.g., organs) to tissue defect 604 as indicated in anatomical characteristic data 1302, and/or any other characteristic of the anatomical surface as indicated in anatomical characteristic data 1302. Additionally or alternatively, an exemplary suturing plan that may be specified by one or more parameters in placement guidance parameter set 1304 may indicate a recommended suture pattern, suture spacing, thread type, and/or thread length for one or more of the characteristics of physical medical element 1502 as indicated in element characteristic data 1402.
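The quantitative side of such a suturing plan can be sketched as a small derivation from patch geometry. This is a hypothetical illustration: the spacing and thread-per-suture parameters stand in for values that, per the text, would be recommended from tissue type (e.g., weaker tissue yielding tighter spacing) and element characteristics.

```python
import math

def suturing_plan(perimeter_mm, suture_spacing_mm, thread_per_suture_mm):
    """Derive suture count and total thread length for a patch perimeter.

    All parameters are hypothetical: spacing might follow from tissue
    strength and thread-per-suture from the chosen thread type and
    tissue thickness.
    """
    suture_count = math.ceil(perimeter_mm / suture_spacing_mm)
    thread_length_mm = suture_count * thread_per_suture_mm
    return suture_count, thread_length_mm

# A 100 x 80 mm patch has a 360 mm perimeter; sutures every 15 mm
# with 40 mm of thread each give 24 sutures and 960 mm of thread.
count, thread = suturing_plan(360.0, 15.0, 40.0)
```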
In some examples, system 100 may be configured to graphically indicate the suturing plan within image 602.
In some examples, a user may provide user input configured to modify the suturing plan graphically indicated within image 602. For example, a user may provide input that adjusts a position of one or more of virtual markers 1602, removes a particular virtual marker 1602, adds a new virtual marker, and/or otherwise modifies the suturing plan. In response to this user input, system 100 may dynamically adjust the suturing plan in accordance with the user input. For example, based on a repositioning by a user of a particular virtual marker 1602, system 100 may update an amount of thread needed to perform the suturing, adjust a positioning of a suturing device and/or one or more other surgical instruments, and/or otherwise adjust the suturing plan.
In some examples, suturing device 1604 is controlled by a computer-assisted surgical system (e.g., by being connected to a manipulator arm of the computer-assisted surgical system). In these examples, suturing device 1604 may be referred to as a certain type of surgical instrument coupled to and controlled by the computer-assisted surgical system. In alternative examples, suturing device 1604 is not controlled by a computer-assisted surgical system. In these alternative examples, suturing device 1604 may be manually held and/or otherwise controlled by a user.
In cases where suturing device 1604 is controlled by a computer-assisted surgical system, a positioning and/or operation of suturing device 1604 may be set in response to user input (e.g., a user may provide input commands that move and/or operate suturing device 1604). For example, the user may provide input commands that direct the computer-assisted surgical system to move suturing device 1604 from suturing location to suturing location as guided by the suturing plan represented by virtual markers 1602. For example, after suturing device 1604 is used to suture physical medical element 1502 to the anatomical surface at a suture location corresponding to virtual marker 1602-4, the user may provide input commands (e.g., by manipulating master controls that are a part of the computer-assisted surgical system) that cause suturing device 1604 to move to a suture location that corresponds to virtual marker 1602-5. Once at this location, suturing device 1604 may be used to suture physical medical element 1502 to the anatomical surface at the suture location corresponding to virtual marker 1602-5. Such repositioning of suturing device 1604 may alternatively be performed automatically by the computer-assisted surgical system without specific user input that controls a positioning and/or operation of suturing device 1604.
In some examples, system 100 may direct, in accordance with placement guidance parameter set 1304, a computer-assisted surgical system to use one or more surgical instruments (e.g., surgical instruments 1504 and/or suturing device 1604) to automatically place a physical medical element (e.g., physical medical element 1502) on the anatomical surface without a user providing user input that controls a movement of the one or more surgical instruments while the physical medical element is being placed on the anatomical surface by the one or more surgical instruments. For example, system 100 may direct a computer-assisted surgical system to use surgical instruments 1504 to guide physical medical element 1502 to a proper orientation and positioning over the anatomical surface. System 100 may then direct the computer-assisted surgical system to use suturing device 1604 to automatically suture physical medical element 1502 to the anatomical surface.
In some examples, system 100 may track a relative pose of suturing device 1604 and/or surgical instruments 1504 with respect to physical medical element 1502. System 100 may use the tracked pose to direct suturing device 1604 to properly perform the suturing of physical medical element 1502 to the anatomical surface and/or surgical instruments 1504 to properly grasp and hold physical medical element 1502.
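Tracking a relative pose of this kind reduces to composing homogeneous transforms. The sketch below assumes the common convention of 4 x 4 transforms expressed in a shared (e.g., camera) frame; the function names and the example values are illustrative, not taken from the source.

```python
import numpy as np

def relative_pose(T_cam_element, T_cam_instrument):
    """Express the instrument pose in the medical element's frame.

    Both inputs are 4 x 4 homogeneous transforms in a shared camera
    frame (an assumed convention):
    T_elem_instrument = inv(T_cam_element) @ T_cam_instrument.
    """
    return np.linalg.inv(T_cam_element) @ T_cam_instrument

def translation(x, y, z):
    """Build a translation-only homogeneous transform (helper)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# If the element sits at (10, 0, 0) and the instrument at (13, 0, 0)
# in the camera frame with no rotation, the instrument is 3 units
# along the element's x-axis.
T_rel = relative_pose(translation(10, 0, 0), translation(13, 0, 0))
```

Updating this relative pose each frame as new instrument kinematics and element detections arrive is one plausible way for a system to direct a suturing device to the correct location on the element.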
In operation 1702, a medical element management system instructs a display device to render a virtual medical element representative of a physical medical element over a target region within an image of an internal space of a patient, the target region depicting an anatomical surface to be covered by the physical medical element. Operation 1702 may be performed in any of the ways described herein.
In operation 1704, the medical element management system receives, while the virtual medical element is rendered over the target region, user input that sets at least one of a pose of the virtual medical element within the image and a size of the virtual medical element. Operation 1704 may be performed in any of the ways described herein.
In operation 1706, the medical element management system determines, based on the user input and on depth data representative of a depth map for the internal space, physical dimensions for the physical medical element, the physical dimensions defining a surface area for the physical medical element. Operation 1706 may be performed in any of the ways described herein.
In operation 1802, a medical element management system accesses image data representative of an image acquired by an imaging device and depicting an internal space of a patient. Operation 1802 may be performed in any of the ways described herein.
In operation 1804, the medical element management system obtains depth data representative of a depth map for the internal space depicted in the image acquired by the imaging device. Operation 1804 may be performed in any of the ways described herein.
In operation 1806, the medical element management system identifies, based on the image data and the depth data, a target region within the image that depicts an anatomical surface to be covered by a physical medical element. Operation 1806 may be performed in any of the ways described herein.
In operation 1808, the medical element management system instructs a display device to render a virtual medical element representative of the physical medical element over the identified target region within the image. Operation 1808 may be performed in any of the ways described herein.
In operation 1902, a medical element management system obtains anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient. Operation 1902 may be performed in any of the ways described herein.
In operation 1904, the medical element management system determines, based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to guide a placement of the physical medical element on the anatomical surface with one or more surgical instruments controlled by a computer-assisted surgical system. Operation 1904 may be performed in any of the ways described herein.
In operation 2002, a medical element management system obtains anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient. Operation 2002 may be performed in any of the ways described herein.
In operation 2004, the medical element management system determines, based on the anatomical characteristic data, a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface. Operation 2004 may be performed in any of the ways described herein.
In operation 2006, the medical element management system graphically indicates, within an image of the internal space, the suturing plan. Operation 2006 may be performed in any of the ways described herein.
The systems and methods described herein may be used in connection with and/or implemented by a computer-assisted surgical system used to perform a surgical procedure with respect to a patient.
Manipulator arms 2112 and/or surgical instruments attached to manipulator arms 2112 may include one or more displacement transducers, orientational sensors, and/or positional sensors used to generate raw (i.e., uncorrected) kinematics information. One or more components of surgical system 2100 may be configured to use the kinematics information to track (e.g., determine positions of) and/or control the surgical instruments.
User control system 2104 may be configured to facilitate control by surgeon 2110-1 of manipulator arms 2112 and surgical instruments attached to manipulator arms 2112. For example, surgeon 2110-1 may interact with user control system 2104 to remotely move or manipulate manipulator arms 2112 and the surgical instruments. To this end, user control system 2104 may provide surgeon 2110-1 with imagery (e.g., high-definition 3D imagery) of a surgical area associated with patient 2108 as captured by an imaging system (e.g., any of the medical imaging systems described herein). In certain examples, user control system 2104 may include a stereo viewer having two displays where stereoscopic images of a surgical area associated with patient 2108 and generated by a stereoscopic imaging system may be viewed by surgeon 2110-1. Surgeon 2110-1 may utilize the imagery to perform one or more procedures with one or more surgical instruments attached to manipulator arms 2112.
To facilitate control of surgical instruments, user control system 2104 may include a set of master controls. These master controls may be manipulated by surgeon 2110-1 to control movement of surgical instruments (e.g., by utilizing robotic and/or teleoperation technology). The master controls may be configured to detect a wide variety of hand, wrist, and finger movements by surgeon 2110-1. In this manner, surgeon 2110-1 may intuitively perform a procedure using one or more surgical instruments. In some examples, user control system 2104 implements user control system 806.
Auxiliary system 2106 may include one or more computing devices configured to perform primary processing operations of surgical system 2100. In such configurations, the one or more computing devices included in auxiliary system 2106 may control and/or coordinate operations performed by various other components (e.g., manipulating system 2102 and user control system 2104) of surgical system 2100. For example, a computing device included in user control system 2104 may transmit instructions to manipulating system 2102 by way of the one or more computing devices included in auxiliary system 2106. As another example, auxiliary system 2106 may receive, from manipulating system 2102, and process image data representative of imagery captured by an imaging device attached to one of manipulator arms 2112.
In some examples, auxiliary system 2106 may be configured to present visual content to surgical team members 2110 who may not have access to the images provided to surgeon 2110-1 at user control system 2104. To this end, auxiliary system 2106 may include a display monitor 2114 configured to display one or more user interfaces, such as images (e.g., 2D images, 3D images) of the surgical area, information associated with patient 2108 and/or the surgical procedure, and/or any other visual content as may serve a particular implementation. For example, display monitor 2114 may display images of the surgical area together with additional content (e.g., graphical content, contextual information, etc.) concurrently displayed with the images. In some embodiments, display monitor 2114 is implemented by a touchscreen display with which surgical team members 2110 may interact (e.g., by way of touch gestures) to provide user input to surgical system 2100.
Manipulating system 2102, user control system 2104, and auxiliary system 2106 may be communicatively coupled one to another in any suitable manner.
In some examples, a non-transitory computer-readable medium storing computer-readable instructions may be provided in accordance with the principles described herein. The instructions, when executed by a processor of a computing device, may direct the processor and/or computing device to perform one or more operations, including one or more of the operations described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
A non-transitory computer-readable medium as referred to herein may include any non-transitory storage medium that participates in providing data (e.g., instructions) that may be read and/or executed by a computing device (e.g., by a processor of a computing device). For example, a non-transitory computer-readable medium may include, but is not limited to, any combination of non-volatile storage media and/or volatile storage media. Exemplary non-volatile storage media include, but are not limited to, read-only memory, flash memory, a solid-state drive, a magnetic storage device (e.g., a hard disk, a floppy disk, magnetic tape, etc.), ferroelectric random-access memory (“RAM”), and an optical disc (e.g., a compact disc, a digital video disc, a Blu-ray disc, etc.). Exemplary volatile storage media include, but are not limited to, RAM (e.g., dynamic RAM).
Communication interface 2202 may be configured to communicate with one or more computing devices. Examples of communication interface 2202 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, an audio/video connection, and any other suitable interface.
Processor 2204 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 2204 may perform operations by executing computer-executable instructions 2212 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 2206.
Storage device 2206 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device. For example, storage device 2206 may include, but is not limited to, any combination of the non-volatile media and/or volatile media described herein. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 2206. For example, data representative of computer-executable instructions 2212 configured to direct processor 2204 to perform any of the operations described herein may be stored within storage device 2206. In some examples, data may be arranged in one or more databases residing within storage device 2206.
I/O module 2208 may include one or more I/O modules configured to receive user input and provide user output. I/O module 2208 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 2208 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.
I/O module 2208 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 2208 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
In the preceding description, various exemplary embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.
Claims
1. A system comprising:
- a memory storing instructions; and
- a processor communicatively coupled to the memory and configured to execute the instructions to: obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and determine, based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to specify a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface.
2. The system of claim 1, wherein:
- the processor is further configured to execute the instructions to obtain element characteristic data representative of a characteristic associated with the physical medical element; and
- the determining of the placement guidance parameter set is further based on the element characteristic data.
3. The system of claim 2, wherein the characteristic associated with the physical medical element as represented by the element characteristic data comprises at least one of physical dimensions of the physical medical element, a type of material used for the physical medical element, a tensile strength of the physical medical element, or a deformation characteristic of the physical medical element.
4. The system of claim 1, wherein the characteristic associated with the anatomical surface to be covered by the physical medical element as represented by the anatomical characteristic data comprises at least one of a type of tissue that constitutes the anatomical surface, a size of the anatomical surface, or a location of the anatomical surface within the patient.
5. The system of claim 1, wherein the placement guidance parameter set further comprises one or more parameters configured to specify an optimal orientation for the physical medical element as the physical medical element is being placed on the anatomical surface with the one or more surgical instruments controlled by the computer-assisted surgical system.
6. The system of claim 5, wherein the processor is further configured to execute the instructions to graphically indicate, within an image of the internal space, the optimal orientation for the physical medical element as the physical medical element is being placed on the anatomical surface with the one or more surgical instruments controlled by the computer-assisted surgical system.
7. The system of claim 1, wherein the placement guidance parameter set further comprises one or more parameters configured to specify an optimal insertion path within the internal space that the one or more surgical instruments are to follow while bringing the physical medical element from outside the patient to being in contact with the anatomical surface.
8. The system of claim 7, wherein the processor is further configured to execute the instructions to graphically indicate, within an image of the internal space, the optimal insertion path within the internal space that the one or more surgical instruments are to follow while bringing the physical medical element from outside the patient to being in contact with the anatomical surface.
9. The system of claim 1, wherein the processor is further configured to execute the instructions to graphically indicate, within an image of the internal space, the suturing plan.
10. The system of claim 1, wherein the suturing plan comprises information specifying at least one of a suture pattern to be used to suture the physical medical element to the anatomical surface, a spacing to be used between sutures used to suture the physical medical element to the anatomical surface, a type of thread to be used to suture the physical medical element to the anatomical surface, and a length of thread needed to suture the physical medical element to the anatomical surface.
11. The system of claim 1, wherein the processor is further configured to execute the instructions to direct, in accordance with the placement guidance parameter set, the computer-assisted surgical system to use the one or more surgical instruments to automatically place the physical medical element on the anatomical surface without a user providing user input that controls a movement of the one or more surgical instruments while the physical medical element is being placed on the anatomical surface by the one or more surgical instruments.
12. The system of claim 1, wherein the obtaining of the anatomical characteristic data comprises:
- accessing image data representative of an image acquired by an imaging device and depicting the internal space of the patient;
- obtaining depth data representative of a depth map for the internal space depicted in the image acquired by the imaging device; and
- determining, based on the image data and the depth data, the anatomical characteristic data.
13. The system of claim 12, wherein the determining of the anatomical characteristic data based on the image data and the depth data comprises:
- segmenting, based on the image data and the depth data, the image; and
- identifying the anatomical surface to be covered by the physical medical element based on the segmenting.
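Claims 12-13 describe combining image data with a depth map to segment the image and identify the target surface. A toy sketch of that idea, under stated assumptions: here a bright color channel plus a depth threshold stand in for the segmentation step, whereas a real system would likely use a calibrated stereo depth map and a learned segmentation model. Function and parameter names are hypothetical.

```python
import numpy as np

def segment_target_surface(image_red, depth_mm, red_thresh=0.6, depth_max_mm=80.0):
    """Combine an image cue and a depth cue into a binary mask marking
    the anatomical surface to be covered. A pixel counts as target if
    its (normalized) red intensity exceeds `red_thresh` AND it lies
    closer than `depth_max_mm` to the imaging device."""
    mask = (image_red > red_thresh) & (depth_mm < depth_max_mm)
    ys, xs = np.nonzero(mask)
    centroid = (float(ys.mean()), float(xs.mean())) if ys.size else None
    return mask, centroid

# Synthetic 6x6 frame: a red patch in the upper-left at shallow depth
image_red = np.zeros((6, 6)); image_red[1:3, 1:3] = 0.9
depth_mm = np.full((6, 6), 120.0); depth_mm[1:3, 1:3] = 50.0
mask, centroid = segment_target_surface(image_red, depth_mm)
```

The mask's extent and centroid are the kind of anatomical characteristic data from which a placement guidance parameter set could then be derived.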
14. The system of claim 1, wherein the processor is further configured to execute the instructions to:
- instruct a display device to render a virtual medical element representative of the physical medical element over a target region within an image of the internal space of the patient, the target region depicting the anatomical surface to be covered by the physical medical element;
- receive, while the virtual medical element is rendered over the target region, user input that sets at least one of a pose of the virtual medical element within the image and a size of the virtual medical element; and
- determine, based on the user input and on depth data representative of a depth map for the internal space, physical dimensions for the physical medical element, the physical dimensions defining a surface area for the physical medical element;
- wherein the determining of the placement guidance parameter set is further based on the physical dimensions defining the surface area for the physical medical element.
15. The system of claim 14, wherein:
- the processor is further configured to execute the instructions to access deformation data representative of a deformation model of the physical medical element; and
- the determining of the physical dimensions for the physical medical element is further based on the deformation data.
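Claims 14-15 determine physical dimensions for the medical element from an on-screen virtual element, a depth map, and a deformation model. A minimal sketch of one way this mapping could work, assuming a pinhole camera model: a span of `pixel_width` pixels viewed at distance `depth_mm` corresponds to a physical width of `pixel_width * depth_mm / focal_px`. The scalar `deform_scale` is a placeholder for the claimed deformation model (e.g. enlarging the flat-patch dimensions so the element still covers the surface after conforming to curvature); all names here are hypothetical.

```python
def physical_width_mm(pixel_width: float, depth_mm: float, focal_px: float,
                      deform_scale: float = 1.0) -> float:
    """Back-project an on-screen span to a physical width using the
    pinhole model, then apply a deformation-model scale factor."""
    return pixel_width * depth_mm / focal_px * deform_scale

# A 200 px wide virtual element at 60 mm depth, focal length 600 px:
flat_width = physical_width_mm(200, 60.0, 600.0)          # 20 mm, undeformed
sized_width = physical_width_mm(200, 60.0, 600.0, 1.1)    # with 10% margin
```

In practice the depth would vary per pixel, so the surface area would be integrated over the depth map rather than computed from a single span.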
16. A method comprising:
- obtaining, by a medical element management system, anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and
- determining, by the medical element management system based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to specify a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface.
17. The method of claim 16, further comprising:
- instructing, by the medical element management system, a display device to render a virtual medical element representative of the physical medical element over a target region within an image of the internal space of the patient, the target region depicting the anatomical surface to be covered by the physical medical element;
- receiving, by the medical element management system while the virtual medical element is rendered over the target region, user input that sets at least one of a pose of the virtual medical element within the image and a size of the virtual medical element; and
- determining, by the medical element management system based on the user input and on depth data representative of a depth map for the internal space, physical dimensions for the physical medical element, the physical dimensions defining a surface area for the physical medical element;
- wherein the determining of the placement guidance parameter set is further based on the physical dimensions defining the surface area for the physical medical element.
18. The method of claim 17, further comprising:
- accessing, by the medical element management system, deformation data representative of a deformation model of the physical medical element;
- wherein the determining of the physical dimensions for the physical medical element is further based on the deformation data.
19. A non-transitory computer-readable medium storing instructions that, when executed, direct a processor of a computing device to:
- obtain anatomical characteristic data representative of a characteristic associated with an anatomical surface to be covered by a physical medical element, the anatomical surface within an internal space of a patient; and
- determine, based on the anatomical characteristic data, a placement guidance parameter set, the placement guidance parameter set comprising one or more parameters configured to specify a suturing plan for suturing the physical medical element to the anatomical surface while one or more surgical instruments controlled by a computer-assisted surgical system hold the physical medical element in place on the anatomical surface.
20. The method of claim 16, further comprising:
- obtaining, by the medical element management system, element characteristic data representative of a characteristic associated with the physical medical element;
- wherein the determining of the placement guidance parameter set is further based on the element characteristic data.
| Patent / Publication No. | Date | First Named Inventor |
| --- | --- | --- |
| 2671444 | March 1954 | Pease, Jr. et al. |
| 20050251026 | November 10, 2005 | Stone |
| 20060220325 | October 5, 2006 | McFarlane |
| 20150150457 | June 4, 2015 | Wu et al. |
| 20150223765 | August 13, 2015 | Chopra |
| 20180325610 | November 15, 2018 | Cameron |
| 20190000561 | January 3, 2019 | Decker |
| 20190369717 | December 5, 2019 | Frielinghaus |
| 20200211692 | July 2, 2020 | Kalafut |
| 20200281667 | September 10, 2020 | Blondel |
| 20210223855 | July 22, 2021 | Gibby |
| 20220183759 | June 16, 2022 | Storch |
| 20230005596 | January 5, 2023 | Hamilton |
- International Preliminary Report on Patentability for Application No. PCT/US2020/054297, mailed Apr. 21, 2022, 13 pages.
- International Search Report and Written Opinion for Application No. PCT/US2020/054297, mailed Mar. 12, 2021, 17 pages.
- Invitation to Pay Additional Fees and Results of the Partial International Search for Application No. PCT/US2020/054297, mailed Jan. 18, 2021, 13 pages.
- Vertut, J., and Coiffet, P., "Robot Technology: Teleoperation and Robotics Evolution and Development," English translation, Prentice-Hall, Inc., Englewood Cliffs, NJ, USA, 1986, vol. 3A, 332 pages.
Type: Grant
Filed: Oct 5, 2020
Date of Patent: Jan 28, 2025
Patent Publication Number: 20220361952
Assignee: Intuitive Surgical Operations, Inc. (Sunnyvale, CA)
Inventors: Azad Shademan (Campbell, CA), Mahdi Azizian (San Jose, CA), Daniel Proksch (San Jose, CA), Pourya Shirazian (Menlo Park, CA)
Primary Examiner: Santiago Garcia
Application Number: 17/762,518
International Classification: A61B 34/10 (20160101); A61B 5/107 (20060101); A61B 34/00 (20160101); A61B 34/32 (20160101); G06T 7/11 (20170101); G06T 7/50 (20170101); G06T 19/00 (20110101); G06T 19/20 (20110101); G16H 20/40 (20180101);