SYSTEMS FOR FACILITATING GUIDED TELEOPERATION OF A NON-ROBOTIC DEVICE IN A SURGICAL SPACE

An exemplary system includes a memory storing instructions and a processor communicatively coupled to the memory. The processor may be configured to execute the instructions to obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device; and provide the guidance content to the computer-assisted surgical system.

Description
RELATED APPLICATIONS

The present application claims priority to U.S. Provisional Patent Application No. 62/986,373, filed on Mar. 6, 2020, and entitled “SYSTEMS AND METHODS FOR FACILITATING GUIDED TELEOPERATION OF A NON-ROBOTIC DEVICE IN A SURGICAL SPACE”, U.S. Provisional Patent Application No. 62/959,635, filed on Jan. 10, 2020, and entitled “SYSTEMS AND METHODS FOR FACILITATING A ROBOTIC INSTRUMENT INTERACTING WITH A TARGET OBJECT IN A SURGICAL SPACE”, and to U.S. Provisional Patent Application No. 62/948,542, filed on Dec. 16, 2019, and entitled “SYSTEMS AND METHODS FOR IDENTIFYING AND FACILITATING AN INTENDED INTERACTION WITH A TARGET OBJECT IN A SURGICAL SPACE”, the contents of which are hereby incorporated by reference in their entirety.

BACKGROUND INFORMATION

A computer-assisted surgical system that employs robotic and/or teleoperation technology typically includes a stereoscopic image viewer configured to provide, for display to a surgeon, imagery of a surgical space as captured by an imaging device such as an endoscope. While the surgeon's eyes are positioned in front of viewing lenses of the stereoscopic image viewer, the surgeon may view the imagery of the surgical space while remotely manipulating one or more surgical instruments located within the surgical space. The surgical instruments are attached to one or more manipulator arms of a surgical instrument manipulating system included as part of the computer-assisted surgical system.

In addition to the surgical instruments that are attached to the one or more manipulator arms, additional instruments may be inserted into the surgical space to facilitate the surgeon performing procedures within the surgical space. For example, sub-surface sensing devices (e.g., ultrasound devices) may be provided within the surgical space to improve the surgeon's perception of the surgical space and improve an outcome of a procedure. However, such additional instruments are not typically integrated into a module that attaches to a manipulator arm of a computer-assisted surgical system. In view of this, such additional instruments may only be available as drop-in instruments that rely on, for example, a grasper surgical instrument attached to a manipulator arm of a computer-assisted surgical system to grasp and move the drop-in instruments within the surgical space. Operation of a teleoperated grasper surgical instrument to interact with a drop-in instrument requires a surgeon to perform complex maneuvers to pick up and use the drop-in instrument within the surgical space.

SUMMARY

An exemplary system comprises a memory storing instructions; and a processor communicatively coupled to the memory and configured to execute the instructions to: obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device; and provide the guidance content to the computer-assisted surgical system.

An additional exemplary system comprises a memory storing instructions; and a processor communicatively coupled to the memory and configured to execute the instructions to: obtain one or more parameters of an object in a surgical space; obtain one or more parameters of a non-robotic imaging device in the surgical space, the non-robotic imaging device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the object and the one or more parameters of the non-robotic imaging device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic imaging device to capture imagery of the object in the surgical space; and provide the guidance content to the computer-assisted surgical system.

An exemplary method comprises obtaining, by a processor associated with a computer-assisted surgical system, one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by the computer-assisted surgical system; generating, by the processor and based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device; and providing, by the processor, the guidance content to the computer-assisted surgical system.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.

FIG. 1 illustrates an exemplary computer-assisted surgical system according to principles described herein.

FIG. 2 illustrates an exemplary view of a surgical space according to principles described herein.

FIG. 3 illustrates an exemplary system configured to facilitate guided teleoperation of a non-robotic device in a surgical space according to principles described herein.

FIG. 4 illustrates an exemplary image of a surgical space according to principles described herein.

FIG. 5 illustrates an exemplary flow chart depicting various operations that may be performed by the system illustrated in FIG. 3 according to principles described herein.

FIGS. 6A-9 illustrate additional exemplary images of a surgical space according to principles described herein.

FIG. 10 illustrates an additional exemplary flow chart depicting various operations that may be performed by the system illustrated in FIG. 3 according to principles described herein.

FIG. 11 illustrates an additional exemplary image of a surgical space according to principles described herein.

FIG. 12 illustrates an exemplary method for facilitating guided teleoperation of a non-robotic device in a surgical space according to principles described herein.

FIG. 13 illustrates an exemplary computing device according to principles described herein.

DETAILED DESCRIPTION

Systems and methods for facilitating guided teleoperation of a non-robotic device in a surgical space are described herein. As will be described in more detail below, an exemplary system includes a memory that stores instructions and a processor communicatively connected to the memory. The processor of the exemplary system is configured to execute the instructions to obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device; and provide the guidance content to the computer-assisted surgical system.

Various advantages and benefits are associated with systems and methods described herein. For example, systems and methods such as those described herein may reduce the mental and/or physical workload required for a user of a computer-assisted surgical system (e.g., a surgeon and/or another user associated with a computer-assisted surgical system) to use (e.g., teleoperate) a robotic instrument to interact with a non-robotic device (e.g., a non-robotic instrument) located in a surgical space, such as by the systems and methods facilitating using the robotic instrument to move the non-robotic device in the surgical space. In so doing, systems and methods such as those described herein may simplify procedures performed within the surgical space and/or improve usability of a computer-assisted surgical system. These and other benefits that may be realized by the systems and methods described herein will be evident from the disclosure that follows.

Exemplary systems described herein may be configured to operate as part of or in conjunction with a plurality of different types of computer-assisted surgical systems. The different types of computer-assisted surgical systems may include any type of computer-assisted surgical system as may serve a particular implementation. In certain examples, a type of computer-assisted surgical system may include a system in which one or more surgical devices (e.g., surgical instruments) are manually (e.g., laparoscopically) controlled by a user. In certain examples, a type of computer-assisted surgical system may include a robotic surgical system configured to facilitate operation of one or more smart instruments (e.g., smart sub-surface imaging devices) that may be manually and/or robotically controlled by a user. In certain implementations, the plurality of different types of computer-assisted surgical systems may be of different types at least because they include different types of surgical instrument manipulating systems. For example, a first computer-assisted surgical system may include a first type of surgical instrument manipulating system, a second computer-assisted surgical system may include a second type of surgical instrument manipulating system, and a third computer-assisted surgical system may include a third type of surgical instrument manipulating system.

Each type of surgical instrument manipulating system may have a different architecture (e.g., a manipulator arm architecture), have a different kinematic profile, and/or operate according to different configuration parameters. An exemplary computer-assisted surgical system with a first type of surgical instrument manipulating system will now be described with reference to FIG. 1. The described exemplary computer-assisted surgical system is illustrative and not limiting. Systems such as those described herein may operate as part of or in conjunction with the described computer-assisted surgical system and/or any other suitable computer-assisted surgical system.

FIG. 1 illustrates an exemplary computer-assisted surgical system 100 (“surgical system 100”). As shown, surgical system 100 may include a surgical instrument manipulating system 102 (“manipulating system 102”), a user control system 104, and an auxiliary system 106 communicatively coupled one to another.

Surgical system 100 may be utilized by a surgical team to perform a computer-assisted surgical procedure on a patient 108. As shown, the surgical team may include a surgeon 110-1, an assistant 110-2, a nurse 110-3, and an anesthesiologist 110-4, all of whom may be collectively referred to as “surgical team members 110.” Additional or alternative surgical team members may be present during a surgical session as may serve a particular implementation.

While FIG. 1 illustrates an ongoing minimally invasive surgical procedure, surgical system 100 may similarly be used to perform open surgical procedures or other types of surgical procedures that may similarly benefit from the accuracy and convenience of surgical system 100. Additionally, it will be understood that the surgical session throughout which surgical system 100 may be employed may not only include an operative phase of a surgical procedure, as is illustrated in FIG. 1, but may also include preoperative, postoperative, and/or other suitable phases of the surgical procedure. A surgical procedure may include any procedure in which manual and/or instrumental techniques (e.g., teleoperated instrumental techniques) are used on a patient to investigate, diagnose, or treat a physical condition of the patient. Additionally, a surgical procedure may include any procedure that is not performed on a live patient, such as a calibration procedure, a simulated training procedure, and an experimental or research procedure.

As shown in FIG. 1, surgical instrument manipulating system 102 may include a plurality of manipulator arms 112 (e.g., manipulator arms 112-1 through 112-4) to which a plurality of robotic surgical instruments (“robotic instruments”) (not shown) may be coupled. As used herein, a “robotic instrument” refers to any instrument that may be directly attached to (e.g., plugged into, fixedly coupled to, mated to, etc.) a manipulator arm (e.g., manipulator arm 112-1) such that movement of the manipulator arm directly causes movement of the instrument. Each robotic instrument may be implemented by any suitable therapeutic instrument (e.g., a tool having tissue-interaction functions), imaging device (e.g., an endoscope), diagnostic instrument, or the like that may be used for a computer-assisted surgical procedure (e.g., by being at least partially inserted into patient 108 and manipulated to perform a computer-assisted surgical procedure on patient 108). In some examples, one or more of the robotic instruments may include force-sensing and/or other sensing capabilities.

In the example shown in FIG. 1, manipulator arms 112 of manipulating system 102 are attached on a distal end of an overhead boom that extends horizontally. However, manipulator arms 112 may have other configurations in certain implementations. In addition, while manipulating system 102 is depicted and described herein as including four manipulator arms 112, it will be recognized that manipulating system 102 may include only a single manipulator arm 112 or any other number of manipulator arms as may serve a particular implementation.

Manipulator arms 112 and/or robotic instruments attached to manipulator arms 112 may include one or more displacement transducers, orientational sensors, and/or positional sensors (hereinafter “surgical system sensors”) used to generate raw (e.g., uncorrected) kinematics information. One or more components of surgical system 100 may be configured to use the kinematics information to track (e.g., determine positions of) and/or control the robotic instruments.

In addition, manipulator arms 112 may each include or otherwise be associated with a plurality of motors that control movement of manipulator arms 112 and/or the surgical instruments attached thereto. For example, manipulator arm 112-1 may include or otherwise be associated with a first internal motor (not explicitly shown) configured to yaw manipulator arm 112-1 about a yaw axis. In like manner, manipulator arm 112-1 may be associated with a second internal motor (not explicitly shown) configured to drive and pitch manipulator arm 112-1 about a pitch axis. Likewise, manipulator arm 112-1 may be associated with a third internal motor (not explicitly shown) configured to slide manipulator arm 112-1 along an insertion axis. Manipulator arms 112 may each include a drive train system driven by one or more of these motors in order to control the pivoting of manipulator arms 112 in any manner as may serve a particular implementation. As such, if a robotic instrument attached, for example, to manipulator arm 112-1 is to be mechanically moved, one or more of the motors coupled to the drive train may be energized to move manipulator arm 112-1.

Robotic instruments attached to manipulator arms 112 may each be positioned in a surgical space. A “surgical space” may, in certain examples, be entirely disposed within a patient and may include an area within the patient at or near where a surgical procedure is planned to be performed, is being performed, or has been performed. For example, for a minimally invasive surgical procedure being performed on tissue internal to a patient, the surgical space may include the tissue, anatomy underlying the tissue, as well as space around the tissue where, for example, robotic instruments and/or other instruments being used to perform the surgical procedure are located. In other examples, a surgical space may be at least partially disposed external to the patient at or near where a surgical procedure is planned to be performed, is being performed, or has been performed on the patient. For instance, surgical system 100 may be used to perform an open surgical procedure such that part of the surgical space (e.g., tissue being operated on) is internal to the patient while another part of the surgical space (e.g., a space around the tissue where one or more instruments may be disposed) is external to the patient. A robotic instrument may be referred to as being positioned or located at or within a surgical space when at least a portion of the robotic instrument (e.g., a distal portion of the robotic instrument) is located within the surgical space. Exemplary surgical spaces and/or images of surgical spaces will be described herein.

User control system 104 may be configured to facilitate control by surgeon 110-1 of manipulator arms 112 and robotic instruments attached to manipulator arms 112. For example, surgeon 110-1 may interact with user control system 104 to remotely move, manipulate, or otherwise teleoperate manipulator arms 112 and the robotic instruments. To this end, user control system 104 may provide surgeon 110-1 with imagery (e.g., high-definition three-dimensional (3D) imagery) of a surgical space associated with patient 108 as captured by an imaging device. In certain examples, user control system 104 may include a stereoscopic image viewer having two displays where stereoscopic images (e.g., 3D images) of a surgical space associated with patient 108 and generated by a stereoscopic imaging system may be viewed by surgeon 110-1. Surgeon 110-1 may utilize the imagery to perform one or more procedures with one or more robotic instruments attached to manipulator arms 112.

To facilitate control of robotic instruments, user control system 104 may include a set of master controls (not shown). These master controls may be manipulated by surgeon 110-1 to control movement of robotic instruments (e.g., by utilizing robotic and/or teleoperation technology). The master controls may be configured to detect a wide variety of hand, wrist, and finger movements by surgeon 110-1. In this manner, surgeon 110-1 may intuitively perform a surgical procedure using one or more robotic instruments.

User control system 104 may further be configured to facilitate control by surgeon 110-1 of other components of surgical system 100. For example, surgeon 110-1 may interact with user control system 104 to change a configuration or operating mode of surgical system 100, to change a display mode of surgical system 100, to generate additional control signals used to control surgical instruments attached to manipulator arms 112, to facilitate switching control from one robotic instrument to another, to facilitate interaction with other instruments and/or objects within the surgical space, or to perform any other suitable operation. To this end, user control system 104 may also include one or more input devices (e.g., foot pedals, buttons, switches, etc.) configured to receive input from surgeon 110-1.

Auxiliary system 106 may include one or more computing devices configured to perform primary processing operations of surgical system 100. The one or more computing devices included in auxiliary system 106 may control and/or coordinate operations performed by various other components (e.g., manipulating system 102 and/or user control system 104) of surgical system 100. For example, a computing device included in user control system 104 may transmit instructions to manipulating system 102 by way of the one or more computing devices included in auxiliary system 106. As another example, auxiliary system 106 may receive, from manipulating system 102, and process image data representative of imagery captured by an imaging device attached to one of manipulator arms 112.

In some examples, auxiliary system 106 may be configured to present visual content to surgical team members 110 who may not have access to the images provided to surgeon 110-1 at user control system 104. To this end, auxiliary system 106 may include a display monitor 114 configured to display one or more user interfaces, such as images (e.g., 2D images) of the surgical space, information associated with patient 108 and/or the surgical procedure, and/or any other visual content as may serve a particular implementation. For example, display monitor 114 may display images of the surgical space together with additional content (e.g., representations of target objects, graphical content, contextual information, etc.) concurrently displayed with the images. In some embodiments, display monitor 114 is implemented by a touchscreen display with which surgical team members 110 may interact (e.g., by way of touch gestures) to provide user input to surgical system 100.

Manipulating system 102, user control system 104, and auxiliary system 106 may be communicatively coupled one to another in any suitable manner. For example, as shown in FIG. 1, manipulating system 102, user control system 104, and auxiliary system 106 may be communicatively coupled by way of control lines 116, which may represent any wired or wireless communication link as may serve a particular implementation. To this end, manipulating system 102, user control system 104, and auxiliary system 106 may each include one or more wired or wireless communication interfaces, such as one or more local area network interfaces, Wi-Fi network interfaces, cellular interfaces, etc.

FIG. 2 illustrates a view 200 of a surgical space in which various robotic instruments are attached to manipulator arms 112 of surgical system 100. As shown, the robotic instruments may include an imaging device 202 and one or more other robotic instruments 204 (e.g., robotic instruments 204-1 through 204-3) in the form of one or more surgical tools. While FIG. 2 shows one imaging device 202 and three other robotic instruments 204 located at the surgical space, any number, type, and/or combination of robotic instruments may be at the surgical space during a surgical procedure. In the example shown in FIG. 2, robotic instruments 204-1 and 204-3 are shown as grasping-type robotic instruments whereas robotic instrument 204-2 is shown as a cutting-type robotic instrument. It is understood that other types of robotic instruments (e.g., diagnostic tools, therapeutic tools, etc.) different than those shown in FIG. 2 may additionally or alternatively be provided within the surgical space during the surgical procedure in certain implementations. Tissue 206 represents anatomical tissue at the surgical space.

Imaging device 202 may capture imagery at the surgical space. Any of robotic instruments 204 and/or tissue 206 that are within a field of view of imaging device 202 may be depicted in the imagery captured by imaging device 202.

Imaging device 202 may provide data representing visible light data of a surgical space. For example, imaging device 202 may capture visible light images of the surgical space that represent visible light sensed by imaging device 202. Visible light images may include images that use any suitable color and/or grayscale palette to represent a visible light-based view of the surgical space.

Imaging device 202 may also provide data representing depth data of a surgical space or data that may be processed to derive depth data of the surgical space. For example, imaging device 202 may capture images of the surgical space that represent depth sensed by imaging device 202. Alternatively, imaging device 202 may capture images of the surgical space that may be processed to derive depth data of the surgical space. The depth information may be represented as depth images (e.g., depth map images obtained using a Z-buffer that indicates distance from imaging device 202 to each pixel point on an image of a surgical space), which may be configured to visually indicate depths of objects in the surgical space in any suitable way, such as by using different greyscale values to represent different depth values. Images captured by an imaging device (e.g., by imaging device 202) and/or derived from images captured by the imaging device (e.g., visible light images and depth images) may be used to facilitate detecting a robotic instrument (e.g., robotic instruments 204-1 through 204-3) and/or one or more objects within a surgical space, such as described herein.
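
By way of illustration only, the following minimal Python sketch shows one way depth values might be mapped to greyscale values for a depth image such as those described above; the function name, the NumPy-based depth map representation, and the 8-bit output range are assumptions introduced for illustration and are not part of the disclosure.

```python
import numpy as np

def depth_to_grayscale(depth_map: np.ndarray) -> np.ndarray:
    """Normalize a depth map (distances from the imaging device) to 8-bit grayscale values."""
    d_min, d_max = float(depth_map.min()), float(depth_map.max())
    if d_max == d_min:
        return np.zeros_like(depth_map, dtype=np.uint8)
    normalized = (depth_map - d_min) / (d_max - d_min)  # nearer points darker, farther lighter
    return (normalized * 255).astype(np.uint8)

# Example: a synthetic 4x4 depth map ranging from 5 cm to 20 cm.
depth_map = np.linspace(0.05, 0.20, 16).reshape(4, 4)
print(depth_to_grayscale(depth_map))
```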

During a surgical procedure, it may be desirable to use a robotic instrument (e.g., robotic instrument 204-1 or 204-3) to teleoperate a non-robotic device located within a surgical space. As used herein, a “non-robotic device” refers to any suitable device or instrument that may be provided within a surgical space but that is not directly attached to one of manipulator arms 112. As such, a non-robotic device may only be movable within a surgical space by either being manually manipulated by a user (e.g., surgeon 110-1, assistant 110-2, etc.) or by being moved by a user through teleoperation of a robotic instrument directly attached to one of manipulator arms (e.g., by being grasped or otherwise engaged by robotic instrument 204-3). In this regard, “teleoperation of a non-robotic device” as described herein refers to the indirect teleoperation of a non-robotic device by way of a robotic instrument attached to a computer-assisted surgical system. As such, a non-robotic device may be referred to as a drop-in surgical instrument/device. Examples of non-robotic devices may include, but are not limited to, a non-robotic imaging device (e.g., a drop-in ultrasound probe, a drop-in optical coherence tomography (“OCT”) probe, a drop-in rapid evaporative ionization mass spectrometry (“REIMS”) device), a suture needle, and/or any other suitable instrument or device.

Non-robotic devices such as those described herein may be configured to be engaged by a computer-assisted surgical system in any suitable manner. In certain examples, a non-robotic device may be engaged by a computer-assisted surgical system by being communicatively coupled, in any suitable manner, to the computer-assisted surgical system. In certain examples, a non-robotic device may be engaged by a component of a computer-assisted surgical system and/or by a component attached to a computer-assisted surgical system, such as a robotic instrument (e.g., robotic instrument 204-1 or 204-3) attached to a computer-assisted surgical system. This may be accomplished in any suitable manner. For example, in certain implementations, a non-robotic device may be configured to be grasped by a grasper robotic instrument. To that end, in certain examples, a non-robotic device may include one or more graspable portions (e.g., protrusions, loops, etc.) that a robotic instrument may grasp to facilitate user teleoperation of the non-robotic device. In certain alternative implementations, a non-robotic device may be engaged by a robotic instrument without specifically being grasped by the robotic instrument. For example, in certain implementations, a non-robotic device may include one or more engagement portions that are specifically configured to engage with a corresponding engagement portion of a robotic instrument. To illustrate an example, a non-robotic device may include a recess that is keyed to receive a corresponding keyed protrusion provided on a specialized robotic instrument. In such an example, the robotic instrument may be locked into place with respect to the non-robotic device when the keyed protrusion provided on the robotic instrument is inserted within the keyed recess of the non-robotic device. In certain alternative implementations, a keyed recess may be provided on the robotic instrument and a corresponding keyed protrusion may be provided on the non-robotic device. Exemplary non-robotic devices will be described further herein.

FIG. 3 illustrates an exemplary system 300 that may be implemented according to principles described herein to facilitate guided teleoperation of a non-robotic device located in a surgical space. As shown, system 300 may include, without limitation, a processing facility 302 and a storage facility 304 selectively and communicatively coupled to one another. Facilities 302 and 304 may each include or be implemented by hardware and/or software components (e.g., processors, memories, communication interfaces, instructions stored in memory for execution by the processors, etc.). In some examples, facilities 302 and 304 may be implemented by a single device (e.g., a single computing device). In certain alternate examples, facilities 302 and 304 may be distributed between multiple devices and/or multiple locations as may serve a particular implementation.

Storage facility 304 may maintain (e.g., store) executable data used by processing facility 302 to perform any of the operations described herein. For example, storage facility 304 may store instructions 306 that may be executed by processing facility 302 to perform any of the operations described herein. Instructions 306 may be implemented by any suitable application, software, code, and/or other executable data instance.

Storage facility 304 may also maintain any data received, generated, managed, used, and/or transmitted by processing facility 302. For example, storage facility 304 may maintain any suitable data associated with facilitating teleoperation of a non-robotic device in a surgical space. Such data may include, but is not limited to, data associated with potential objects in a surgical space that a non-robotic device may interact with, three dimensional models of objects that may be located in a surgical space, depth map information associated with a surgical space, pose information associated with non-robotic devices, robotic instruments, and/or additional objects located in a surgical space, data indicating a procedural context of a surgical session, imagery of a surgical space, kinematics data for robotic instruments and/or manipulator arms, data defining guidance content associated with a non-robotic device, user interface content (e.g., graphical objects, notifications, etc.), and/or any other suitable data.

Processing facility 302 may be configured to perform (e.g., execute instructions 306 stored in storage facility 304) various processing operations associated with facilitating guided teleoperation of a non-robotic device in a surgical space. For example, processing facility 302 may obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device; and provide the guidance content to the computer-assisted surgical system. These and other operations that may be performed by processing facility 302 are described herein.

In certain examples, at any given time during a surgical procedure associated with a surgical space, it may be desirable for a user of a computer-assisted surgical system to use a robotic instrument to teleoperate a non-robotic device that is engaged with (e.g., grasped by) the robotic instrument. To facilitate use of such a robotic instrument to teleoperate a non-robotic device, system 300 (e.g., processing facility 302) may obtain one or more parameters associated with a surgical space. For example, system 300 may obtain one or more parameters of a non-robotic device to facilitate teleoperation of the non-robotic device. The one or more parameters of a non-robotic device may include any suitable information associated with the non-robotic device that may be used to facilitate guided teleoperation of the non-robotic device. For example, parameters of a non-robotic device may include, but are not limited to, a type of the non-robotic device, a pose of the non-robotic device, a position of the non-robotic device, an orientation of the non-robotic device, one or more operating parameters of the non-robotic device (e.g., operational settings of the non-robotic device), a procedural context associated with the non-robotic device, a 3D model of the non-robotic device, a contact state of the non-robotic device with respect to an object (e.g., tissue) in a surgical space, and/or any other suitable information.
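
For illustration only, the hypothetical container below gathers several of the non-robotic device parameters enumerated above; the field names, types, and the Python representation are assumptions rather than a required data structure.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional, Tuple

class ContactState(Enum):
    NO_CONTACT = "no contact"
    PARTIAL_CONTACT = "partial contact"
    FULL_CONTACT = "full contact"

@dataclass
class NonRoboticDeviceParameters:
    device_type: str                                           # e.g., "drop-in ultrasound probe"
    position: Optional[Tuple[float, float, float]] = None      # position in the surgical space
    orientation: Optional[Tuple[float, float, float]] = None   # orientation (e.g., Euler angles)
    operating_parameters: dict = field(default_factory=dict)   # operational settings of the device
    procedural_context: Optional[str] = None
    contact_state: ContactState = ContactState.NO_CONTACT

probe = NonRoboticDeviceParameters(device_type="drop-in ultrasound probe")
print(probe.contact_state)
```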

System 300 may obtain the one or more parameters of a non-robotic device in any suitable manner. For example, in certain implementations, at least some parameters may be accessed from a storage device (e.g., storage facility 304) associated with a computer-assisted surgical system (e.g., surgical system 100). For example, system 300 may access a 3D model of a non-robotic device, information identifying the type of non-robotic device, information identifying operating parameters of the non-robotic device, information identifying engagement portions on a non-robotic device, and/or any other suitable information from the storage device.

To illustrate an example, FIG. 4 shows an image 400 of an exemplary non-robotic device that may be teleoperated during a surgical procedure in a surgical space according to principles described herein. As shown in FIG. 4, image 400 illustrates a surgical space in which a non-robotic device 402 and robotic instruments 204-1 through 204-3 are provided in relation to a kidney 404 of a patient (e.g., patient 108). As shown in FIG. 4, non-robotic device 402 includes a protrusion 406 that is grasped by robotic instrument 204-1. As such, teleoperation of robotic instrument 204-1 by a user (e.g., surgeon 110-1 by manipulating master controls of user control system 104) results in teleoperation of non-robotic device 402.

In the example shown in FIG. 4, non-robotic device 402 may be configured to perform or facilitate performance of a surgical procedure with respect to kidney 404. Accordingly, the one or more parameters accessed by system 300 associated with non-robotic device 402 may include a 3D model of non-robotic device 402, information specifying the configuration of protrusion 406, operational parameters of non-robotic device 402, optimal pose information of non-robotic device 402 with respect to kidney 404, etc.

Additionally or alternatively, system 300 may obtain the one or more parameters of a non-robotic device by determining (e.g., detecting, deriving, etc.) the one or more parameters. For example, system 300 may determine, in any suitable manner such as described herein, a pose of the non-robotic device in the surgical space. As used herein, a "pose" of an object such as a non-robotic device refers to the combination of the position of the object and the orientation of the object in a space such as a surgical space. The pose may be referred to as a six-dimensional (6D) pose because there are three degrees of freedom associated with the position of an object and three degrees of freedom associated with the orientation of the object.
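
An illustrative sketch of a 6D pose representation is shown below, assuming the position is stored as a three-element vector and the orientation as a 3x3 rotation matrix; these representational choices are assumptions for illustration only.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class Pose:
    """6D pose: three positional degrees of freedom plus three rotational degrees of freedom."""
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))  # x, y, z
    rotation: np.ndarray = field(default_factory=lambda: np.eye(3))    # 3x3 rotation matrix

    def as_matrix(self) -> np.ndarray:
        """Return the equivalent 4x4 homogeneous transform."""
        T = np.eye(4)
        T[:3, :3] = self.rotation
        T[:3, 3] = self.position
        return T

print(Pose(position=np.array([0.0, 0.05, 0.10])).as_matrix())
```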

System 300 may determine the pose of a non-robotic device in any suitable manner. For example, as will be described herein, a non-robotic device may be engaged with (e.g., grasped by) a robotic instrument. Accordingly, the pose of a non-robotic device may be determined based on kinematics information associated with the robotic instrument that is engaged with the non-robotic device. Additionally or alternatively, the pose of a non-robotic device may be determined based on depth data, image data, a determined orientation of the robotic instrument, and/or some combination thereof.
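
For illustration, the sketch below shows one way the pose of an engaged non-robotic device might be derived from kinematics information of the robotic instrument, assuming that the pose of the instrument tip and a fixed instrument-to-device offset are available as homogeneous transforms; the transform names and the fixed-offset assumption are illustrative only.

```python
import numpy as np

def estimate_device_pose(T_base_to_instrument: np.ndarray,
                         T_instrument_to_device: np.ndarray) -> np.ndarray:
    """Compose 4x4 homogeneous transforms to express the device pose in the base frame."""
    return T_base_to_instrument @ T_instrument_to_device

# Example: instrument tip 10 cm along z from the base frame; device grasped 2 cm
# along the instrument's x axis (both transforms are illustrative values).
T_instrument = np.eye(4); T_instrument[2, 3] = 0.10
T_grasp = np.eye(4); T_grasp[0, 3] = 0.02
print(estimate_device_pose(T_instrument, T_grasp))
```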

Additionally or alternatively to obtaining parameters of a non-robotic device, system 300 may obtain one or more parameters associated with an object in a surgical space to facilitate use of a robotic instrument to teleoperate a non-robotic device in the surgical space. As used herein, an “object” located in a surgical space may include any anatomy (e.g., tissue, bone, etc.) or other object that may be in a surgical space. In certain examples, an object may correspond to any object in a surgical space that may be the subject of a surgical procedure to be performed in a surgical space. For example, an object may be the subject of a measurement procedure, a diagnostic procedure, an imaging procedure (e.g., a subsurface imaging procedure), a suturing procedure, a tissue tensioning procedure, a cutting procedure, a suction procedure, an irrigation procedure, a therapeutic procedure, and/or any other suitable procedure that may be performed in a surgical space.

The parameters associated with an object in the surgical space may include any suitable information associated with the object that may facilitate a non-robotic device operating (e.g., performing a surgical procedure) with respect to the object. For example, the parameters associated with an object in the surgical space may include a position, an orientation, or a pose of the object in the surgical space, a 3D model of the object, an intended interaction with the object during a surgical procedure, a determined shape/size of the object, and/or any other suitable information.

System 300 may be configured to obtain one or more parameters associated with an object in the surgical space in any suitable manner. For example, in certain implementations, system 300 may access the one or more parameters associated with the object from a storage device (e.g., storage facility 304) associated with a computer-assisted surgical system (e.g., surgical system 100). Alternatively, system 300 may be configured to determine one or more parameters associated with the object in the surgical space. For example, in certain implementations, system 300 may leverage a combination of depth data (e.g., provided in a depth map of a surgical space) and a determined orientation of the object within the surgical space to determine the pose of the object in the surgical space. Exemplary ways that system 300 may determine a pose of an object are described herein.

In certain examples, parameters associated with a surgical space may include a procedural context associated with the surgical space. System 300 may determine the procedural context associated with a surgical space based on any suitable information or combination of information associated with the surgical space. For example, system 300 may detect the procedural context based on one or more images of the surgical space, user input indicating procedural context, a configuration of one or more robotic instruments (e.g., robotic instruments 204) located within the surgical space, a type of non-robotic device in the surgical space, kinematics of the one or more robotic instruments, and/or any other suitable information. To illustrate an example, based on the presence of non-robotic device 402 in relation to kidney 404 in FIG. 4, system 300 may determine that the procedural context associated with the surgical space in image 400 is associated with an imaging procedure to be performed with respect to kidney 404 using non-robotic device 402.
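
As a purely illustrative sketch, procedural context might be inferred from a small number of the cues listed above, such as the type of non-robotic device and the anatomy it is positioned near; the rule table and context labels below are assumptions for illustration only.

```python
def infer_procedural_context(device_type: str, nearby_anatomy: str) -> str:
    """Map a pair of contextual cues to a procedural context label (illustrative rules)."""
    rules = {
        ("drop-in ultrasound probe", "kidney"): "subsurface imaging of the kidney",
        ("suture needle", "tissue"): "suturing procedure",
    }
    return rules.get((device_type, nearby_anatomy), "unknown procedural context")

print(infer_procedural_context("drop-in ultrasound probe", "kidney"))
```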

In certain examples, a non-robotic device may correspond to a non-robotic imaging device that is configured to capture imagery of an object in a surgical space. For example, non-robotic device 402 shown in FIG. 4 may correspond to a drop-in ultrasound probe that is configured to contact a surface of kidney 404 to capture ultrasound imagery of kidney 404. With ultrasound imaging, the quality of a captured ultrasound image depends on the amount of pressure with which a drop-in ultrasound probe is pushed into tissue such as kidney 404. Too much pressure may negatively affect the quality of a captured ultrasound image. Similarly, not enough pressure may also negatively affect the quality of an ultrasound image. Accordingly, in such examples, the one or more parameters of the non-robotic imaging device may include operating parameters of the non-robotic imaging device that define image capture requirements (e.g., contact pressure requirements, contact angle requirements, etc.) of the non-robotic imaging device. In addition, in such examples, the one or more parameters of the non-robotic imaging device may include information indicating a current state of contact of a non-robotic imaging device with respect to the surface of tissue such as kidney 404 and/or a current contact angle of the non-robotic imaging device with respect to the tissue to facilitate the non-robotic imaging device capturing ultrasound imagery of the tissue.

The state of contact between a non-robotic imaging device and an object in a surgical space may include one of a full contact state, a partial contact state, or a no contact state. System 300 may be configured to determine a state of contact between a non-robotic imaging device and an object in a surgical space in any suitable manner. For example, in certain implementations, system 300 may monitor signal strength and/or other attributes of an image captured by a non-robotic imaging device to determine a state of contact. In certain examples, the signal strength of a captured image may be represented by image content in the captured image. Accordingly, in certain examples, system 300 may monitor the image content in imagery captured by a non-robotic imaging device to determine a state of contact of a non-robotic imaging device with respect to a surface of an object.

System 300 may monitor the image content in imagery captured by a non-robotic imaging device in any suitable manner. For example, system 300 may determine an amount of image content in a given image to determine a contact state of a non-robotic imaging device. If the amount of image content is above some predefined threshold, system 300 may determine that the contact state between the non-robotic imaging device and the tissue is acceptable. To illustrate, an image captured by the non-robotic imaging device may include a first region that includes image content (e.g., an image of a surface of a kidney) and a second region that does not include image content (e.g., a black region that represents air space adjacent to the kidney). System 300 may be configured to process such an image in any suitable manner and determine that the non-robotic imaging device is in a substantially full contact state if an area associated with the first region is above some predefined threshold. Alternatively, system 300 may determine that the non-robotic imaging device is in a partial contact state if the area associated with the first region is below some predefined threshold.

In certain alternative examples, system 300 may compare an amount of image content in a previous image to an amount of image content in a current image captured by a non-robotic imaging device to determine a change in the contact state of a non-robotic imaging device with respect to an object in a surgical space.
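
The image-content checks described above could, for illustration, be approximated as in the Python sketch below, where near-black pixels are treated as carrying no content, the content fraction is compared against thresholds to classify the contact state, and consecutive frames are compared to detect a change; the intensity cutoff and threshold values are assumptions introduced for illustration.

```python
import numpy as np

INTENSITY_CUTOFF = 10        # pixel values at or below this count as empty (assumed)
FULL_CONTACT_FRACTION = 0.8  # assumed threshold for a substantially full contact state
CHANGE_FRACTION = 0.1        # assumed threshold for a meaningful frame-to-frame change

def content_fraction(frame: np.ndarray) -> float:
    """Fraction of pixels in the frame that carry image content."""
    return float(np.mean(frame > INTENSITY_CUTOFF))

def classify_contact(frame: np.ndarray) -> str:
    """Classify the contact state from the amount of image content in a frame."""
    f = content_fraction(frame)
    if f >= FULL_CONTACT_FRACTION:
        return "full contact"
    return "partial contact" if f > 0.0 else "no contact"

def content_changed(previous: np.ndarray, current: np.ndarray) -> bool:
    """Detect a change in image content between two consecutive frames."""
    return abs(content_fraction(current) - content_fraction(previous)) > CHANGE_FRACTION

# Example with synthetic frames: an empty frame, then one that is roughly 60% content.
blank = np.zeros((64, 64), dtype=np.uint8)
partial = blank.copy(); partial[:38, :] = 120
print(classify_contact(partial), content_changed(blank, partial))
```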

To illustrate an example, FIG. 5 shows a flow chart 500 that depicts exemplary operations that may be performed by system 300 when the image content of a captured image is used to determine a contact state of a non-robotic imaging device. In operation 502, system 300 may capture an image of a surgical space using a non-robotic imaging device. System 300 may capture such an image at any suitable time. For example, in certain implementations, system 300 may use a non-robotic imaging device to continually capture imagery of the surgical space. Alternatively, system 300 may use a non-robotic imaging device to capture imagery when the non-robotic imaging device is within some predefined threshold distance of an object in the surgical space. Alternatively, system 300 may begin using a non-robotic imaging device to capture imagery based on a specific instruction provided by a user. For example, surgeon 110-1 may move non-robotic device 402 from the position indicated in FIG. 4 to a position at or near a surface of kidney 404. Surgeon 110-1 may then turn on or otherwise activate non-robotic device 402 to begin capturing imagery of kidney 404.

In operation 504, system 300 may detect a change in image content in an image captured by a non-robotic imaging device. This may be accomplished in any suitable manner. For example, system 300 may use any suitable image processing technique to determine an amount of image content in a captured image. If the amount of image content is more than some predefined threshold amount, system 300 may determine that there has been a change in the image content. In certain examples, system 300 may compare two or more images captured by a non-robotic imaging device to determine whether the relative amount of image content in the images has increased or decreased. To illustrate, FIGS. 6A and 6B show images 600 (e.g., images 600-1 and 600-2) of a surgical space in which non-robotic device 402 is being used to capture ultrasound images 602 (e.g., ultrasound images 602-1 and 602-2) at different positions within the surgical space. As shown in FIG. 6A, non-robotic device 402 is positioned away from (i.e., not in contact with) kidney 404. As such, ultrasound image 602-1 is blank and does not include any subsurface image of kidney 404. On the other hand, in FIG. 6B, non-robotic device 402 is positioned near kidney 404 such that ultrasound image 602-2 includes a subsurface image of kidney 404. The change in image content between ultrasound images 602-1 and 602-2 may indicate that non-robotic device 402 is in contact with kidney 404 when ultrasound image 602-2 is captured. This is one example of how image content may be determined and used. Other suitable ways of determining and using image content may be used in other examples.

Ultrasound images 602 are shown to the side of images 600 in FIGS. 6A and 6B for illustrative purposes. It is understood that ultrasound images 602 may be provided for display in any suitable manner as may serve a particular implementation. In certain examples, ultrasound images may be provided as an augmentation to an image of a surgical space (e.g., as an overlay over an endoscopic image of a surgical space). For example, ultrasound image 602-2 may be overlaid over a portion of image 600-2 in certain implementations so that a user (e.g., surgeon 110-1) may view the captured ultrasound imagery concurrently and in place while teleoperating non-robotic device 402. Additionally or alternatively, ultrasound images 602 may be provided for display at any other location relative to an image of a surgical space and/or by way of any other suitable display device (e.g., display monitor 114) associated with a computer-assisted surgical system.

Returning to FIG. 5, in operation 506 system 300 may determine, based on the change in image content, a contact state of the non-robotic imaging device with respect to an object, such as whether the non-robotic imaging device is in contact with the object in the surgical space. This may be accomplished in any suitable manner. Continuing with the example described above, system 300 may determine, based on the amount of image content in ultrasound image 602-2, whether non-robotic device 402, for example, is not in contact with kidney 404, is partially in contact with kidney 404, is fully in contact with kidney 404, is in contact with kidney 404 within some predefined pressure range that results in capturing acceptable ultrasound images, is in contact with kidney 404 but at a contact pressure that is not sufficient to capture acceptable ultrasound imagery, or is in contact with kidney 404 with too much contact pressure to sufficiently capture ultrasound imagery. System 300 may use such information as parameters to be used to facilitate guided teleoperation of non-robotic device 402.

In certain examples, system 300 may additionally or alternatively use entropy in an image captured by a non-robotic imaging device to determine a contact state of the non-robotic imaging device with respect to an object. In such examples, the entropy may represent a statistical measure that is calculated from image pixel intensities in the captured image. System 300 may process the captured image in any suitable manner to determine the entropy in the captured image. System 300 may then use the entropy in the captured image in any suitable manner to infer the contact state of a non-robotic imaging device with respect to an object.
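
For illustration, the entropy measure mentioned above might be computed as the Shannon entropy of the pixel-intensity histogram of a captured frame, as sketched below; any threshold used to infer a contact state from the resulting entropy value would be an additional assumption.

```python
import numpy as np

def image_entropy(frame: np.ndarray) -> float:
    """Shannon entropy (in bits) of the 8-bit pixel-intensity distribution of a frame."""
    histogram, _ = np.histogram(frame, bins=256, range=(0, 256))
    probabilities = histogram / histogram.sum()
    probabilities = probabilities[probabilities > 0]
    return float(-np.sum(probabilities * np.log2(probabilities)))

# Example: a blank (low-entropy) frame versus a frame with varied intensities.
blank = np.zeros((64, 64), dtype=np.uint8)
structured = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(blank), image_entropy(structured))
```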

Based on one or more parameters of a surgical space, which may include one or more parameters of a non-robotic device and/or one or more parameters associated with an object in the surgical space, system 300 may generate guidance content associated with the non-robotic device. As used herein, “guidance content” may include any content that may be used by a computer-assisted surgical system to facilitate guided teleoperation of a non-robotic device in a surgical space. The generating of such guidance content by system 300 may include generating instructions and/or other guidance content for use by a computer-assisted surgical system, such as by generating computer-readable instructions for processing by the computer-assisted surgical system, and/or may include generating and/or accessing any suitable content to be presented by the computer-assisted surgical system (e.g., via a user interface associated with the computer-assisted surgical system). Examples of guidance content may include, but are not limited to, virtual representations of robotic instruments, virtual representations of non-robotic devices, notifications, virtual pointers, animations, instructions, audible guidance, visual guidance, haptic feedback guidance, graphical depictions of suggested paths for a non-robotic device to follow, content configured to indicate a contact state of a non-robotic device with respect to an object in the surgical space, instructions usable by the computer-assisted surgical system to provide guidance content, and/or any combination thereof. Examples of guidance content that may be generated by system 300 to be presented by a computer-assisted surgical system may include, but are not limited to, suggested paths for a robotic instrument to follow within a surgical space, content configured to indicate a contact state of a non-robotic device with respect to an object in the surgical space, and/or any other generated content that may facilitate guided teleoperation of a non-robotic device. Specific examples of guidance content are described herein.

System 300 may generate guidance content at any suitable time. For example, system 300 may generate guidance content prior to a surgical procedure, during a surgical procedure, and/or at any other suitable time.

In certain examples, system 300 may generate at least some guidance content by accessing the guidance content from a storage device (e.g., storage facility 304) associated with a computer-assisted surgical system (e.g., surgical system 100). Examples of guidance content that may be accessed from a storage device may include, but are not limited to, graphical depictions of robotic instruments, non-robotic devices, and/or non-robotic devices that are engaged by (e.g., grasped by) robotic instruments, audible notifications, visual notifications, etc.

Guidance content may be generated based on any suitable parameters associated with a surgical space, such as described herein. For example, guidance content may be generated based on one or more of a procedural context associated with the surgical space, parameters of a non-robotic device (e.g., an identified type of non-robotic device, a pose of the non-robotic device, etc.), parameters of a robotic instrument (e.g., an identified type of robotic instrument, a pose of the robotic instrument, etc.), an indicated or a predicted use or operation of the non-robotic device, and/or any other suitable parameter or combination of parameters.

To illustrate an example, system 300 may access any suitable information associated with a surgical space to obtain a first parameter, a second parameter, and a third parameter associated with a non-robotic device located in the surgical space. Based on the first, second, and third parameters, system 300 may generate guidance content to facilitate teleoperation of the non-robotic device in the surgical space. For example, system 300 may determine from the first, second, and third parameters that a visual notification in the form of a graphical overlay would be useful in facilitating teleoperation of the non-robotic device in the surgical space. Accordingly, system 300 may access the graphical overlay in any suitable manner for presentation by way of a computer-assisted surgical system. Based on the first, second, and third parameters, system 300 may also generate computer-executable instructions that specify when the graphical overlay is to be provided for display, where the graphical overlay is to be provided for display, how long the graphical overlay is to be provided for display, etc. to facilitate teleoperation of the non-robotic device. Specific examples of how one or more parameters may be used to generate guidance content are described herein.
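
A hypothetical sketch of mapping obtained parameters to a display instruction for a graphical overlay, in the spirit of the example above, is shown below; the parameter names, overlay identifier, and timing fields are assumptions and do not reflect a required implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class OverlayInstruction:
    overlay_id: str              # which stored graphical overlay to present
    anchor: Tuple[float, float]  # normalized image coordinates at which to display it
    duration_seconds: float      # how long the overlay remains displayed

def generate_overlay_instruction(device_pose_known: bool,
                                 contact_state: str,
                                 procedural_context: str) -> Optional[OverlayInstruction]:
    """Return a display instruction only when the obtained parameters warrant guidance."""
    if not device_pose_known:
        return None
    if procedural_context == "subsurface imaging" and contact_state != "full contact":
        return OverlayInstruction("move-toward-target-arrow", (0.5, 0.5), 3.0)
    return None

print(generate_overlay_instruction(True, "no contact", "subsurface imaging"))
```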

Guidance content generated by system 300 for presentation by a computer-assisted surgical system may be configured to be presented in any suitable manner. For example, in certain implementations, the guidance content may be configured to be presented by way of a user interface associated with a computer-assisted surgical system. To illustrate, system 300 may provide the guidance content for presentation by way of user control system 104 of surgical system 100 to facilitate a user, such as surgeon 110-1, teleoperating a non-robotic device. Additionally or alternatively, the guidance content may be provided for presentation by way of any other suitable user interface that may be associated with a computer-assisted surgical system. For example, guidance content may be provided to a user by way of a user interface associated with display monitor 114 of auxiliary system 106 in certain implementations.

In certain examples, system 300 may provide guidance content as visual guidance to facilitate a user (e.g., surgeon 110-1) of a computer-assisted surgical system teleoperating a non-robotic device in a surgical space. Such visual guidance may be provided in any suitable manner. For example, system 300 may instruct a computer-assisted surgical system to provide a blinking light and/or any suitable graphical object or augmented overlay for display to a user (e.g., to surgeon 110-1 by way of user control system 104) that guides the user in teleoperation of the non-robotic device in the surgical space.

Additionally or alternatively, system 300 may provide guidance content as audible guidance to facilitate a user of a computer-assisted surgical system teleoperating a non-robotic device in a surgical space. Such audible guidance may be provided in any suitable manner. For example, an audible notification may include a "beep," playback of an audio clip with spoken language, and/or any other suitable audible guidance.

Additionally or alternatively, system 300 may provide guidance content as haptic feedback guidance to facilitate a user of a computer-assisted surgical system teleoperating a non-robotic device. Such haptic feedback guidance may be provided in any suitable manner. For example, system 300 may instruct a computer-assisted surgical system to cause one of the master controls of user control system 104 to vibrate to inform the user regarding where or how to move a non-robotic device in a surgical space. Various examples of guidance content or combinations of guidance content that may be provided by system 300 are described herein.

In certain implementations, guidance content generated by system 300 may facilitate a non-robotic device making contact with an object, maintaining a predefined amount of contact with the object, and/or maintaining a predefined contact angle with respect to the surface of the object. Accordingly, in such examples, the guidance content may indicate at least one of a contact pressure or a contact angle of a non-robotic device with respect to a surface of an object and/or may indicate one or more operations to be performed to obtain and/or maintain a certain contact angle and/or contact pressure (e.g., within certain ranges of contact angles and/or contact pressures) between the non-robotic device and the object.

To illustrate an example, in certain implementations, a non-robotic device may correspond to a suture needle that is grasped by a robotic instrument in a surgical space. In such examples, system 300 may obtain and analyze any suitable parameter(s) associated with the suture needle and/or the surgical space (e.g., depth data, visible light imagery, force feedback data, etc.) to determine the contact state of the suture needle with respect to tissue (e.g., whether the suture needle is in contact with tissue to be sutured, the contact pressure of the suture needle, and/or the contact angle of the suture needle with respect to the tissue). System 300 may then generate guidance content in any suitable manner to facilitate a user performing a suturing procedure with the suture needle. For example, system 300 may generate visual guidance in the form of a graphical overlay to be provided for display by way of the stereoscopic image viewer of user control system 104. Such visual guidance may instruct the user to move a robotic instrument in a specific manner to adjust the contact pressure of the suture needle with respect to the tissue to be sutured, move the robotic instrument to adjust the contact angle of the suture needle with respect to the tissue to be sutured, change a suturing position of the suture needle, and/or perform any other suitable action.

In certain examples, guidance content may facilitate teleoperation of a non-robotic imaging device to capture imagery of a surgical space. For example, based on the determination in operation 506 shown in FIG. 5, system 300 may generate any suitable guidance content to be provided by way of a computer-assisted surgical system to facilitate teleoperation of non-robotic device 402 with respect to kidney 404 to obtain ultrasound imagery. For example, system 300 may obtain a first parameter that indicates a relative position between non-robotic device 402 and kidney 404, a second parameter that indicates an attribute (e.g., a signal strength) of imagery captured by non-robotic device 402, a third parameter that indicates contact pressure between non-robotic device 402 and kidney 404, and/or any other suitable parameter to determine whether non-robotic device 402 is in contact with kidney 404. If, based on such parameters, system 300 determines that non-robotic device 402 is not in contact with kidney 404, system 300 may generate any suitable guidance content that instructs the user to move non-robotic device 402 towards kidney 404 to capture ultrasound imagery. For example, system 300 may generate a text notification to be provided for display in image 600-1 that instructs a user to move non-robotic device 402 toward kidney 404. If system 300 determines that non-robotic device 402 is in contact with kidney 404 but with insufficient contact pressure, system 300 may generate a visual notification in the form of, for example, a downwardly oriented arrow icon overlaid over image 600-2 indicating that the user needs to increase the contact pressure of non-robotic device 402 with respect to a surface of kidney 404. If system 300 determines that non-robotic device 402 is in contact with kidney 404 but with too much contact pressure, system 300 may generate a visual notification in the form of, for example, an upwardly oriented arrow icon overlaid over image 600-2 indicating that the user needs to decrease the contact pressure of non-robotic device 402 with respect to a surface of kidney 404. System 300 may generate any other suitable guidance content in other implementations.
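
By way of illustration only, the following sketch (in Python) shows one hypothetical way such contact-based guidance could be selected from parameters such as those described above. The ContactState fields, threshold values, and notification strings are assumptions introduced for this example and are not limiting.

```python
from dataclasses import dataclass


@dataclass
class ContactState:
    in_contact: bool          # e.g., derived from depth data and/or imagery signal strength
    pressure_newtons: float   # e.g., derived from force feedback data


def select_guidance(state: ContactState,
                    min_pressure: float = 0.5,
                    max_pressure: float = 2.0) -> str:
    """Return an illustrative guidance notification for the probe's current contact state."""
    if not state.in_contact:
        return "TEXT: Move the probe toward the organ surface to begin imaging."
    if state.pressure_newtons < min_pressure:
        return "ICON: downward arrow (increase contact pressure)."
    if state.pressure_newtons > max_pressure:
        return "ICON: upward arrow (decrease contact pressure)."
    return "No notification: contact pressure is within the acceptable range."


if __name__ == "__main__":
    print(select_guidance(ContactState(in_contact=True, pressure_newtons=2.7)))
```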

In certain examples, guidance content generated by system 300 may include a suggested path (which may also be referred to herein as a motion path) for a non-robotic device to follow in a surgical space while the non-robotic device is engaged by a robotic instrument in the surgical space. In certain examples, system 300 may be configured to generate a plurality of suggested paths for a non-robotic device to follow in a surgical space. For example, a first suggested path may start at a current position of a non-robotic device in a surgical space and may end at a first position on a surface of an object in the surgical space. A second suggested path may start at the first position on the surface of the object and extend to a second position on the surface of the object.

System 300 may generate guidance content in the form of a suggested path in any suitable manner. For example, in certain implementations, system 300 may automatically generate a suggested path based on one or more parameters associated with a surgical space, such as those described herein. As used herein, the expression "automatically" means that an operation (e.g., generating a suggested path) or series of operations is performed without requiring further input from a user. For example, system 300 may analyze image 400 shown in FIG. 4 and determine, based on image 400 (e.g., the presence of kidney 404 and non-robotic device 402) and/or any other suitable parameters associated with the surgical space, that a procedural context is associated with a non-robotic imaging device capturing imagery of kidney 404. Based on such a procedural context, system 300 may automatically generate guidance content in the form of a suggested path for non-robotic device 402 to follow, without requiring that the user provide further input.

In certain alternative examples, system 300 may generate a suggested path for a non-robotic device to follow based on input provided by a user. To that end, system 300 may be configured to facilitate a user defining at least some portions of a suggested path prior to system 300 generating the suggested path. System 300 may facilitate a user providing input to define at least a part of a suggested path in any suitable manner. For example, system 300 may facilitate a user defining a first virtual pointer indicative of a start position of a suggested path and a second virtual pointer indicative of a stop point of the suggested path. System 300 may facilitate a user selecting a position of virtual pointers in a surgical space in any suitable manner. For example, a user (e.g., surgeon 110-1) may be able to move a cursor by manipulating master controls of user control system 104 to position the virtual pointers with respect to an object in a surgical space. Alternatively, a user (e.g., assistant 110-2) may define virtual pointers through any suitable input (e.g., mouse cursor input, touch input, etc.) entered by way of any suitable display (e.g., display monitor 114) associated with a computer-assisted surgical system.

To illustrate an example, FIGS. 7A and 7B show images 700 (e.g., images 700-1 and 700-2) that may be provided for display when virtual pointers are used as guidance content to facilitate defining a suggested path. As shown in FIG. 7A, image 700-1 includes a first virtual pointer 702-1 that is indicative of a start point of a suggested path and a second virtual pointer 702-2 that is indicative of an end point of a suggested path. A user may designate the position of virtual pointers 702 in any suitable manner. For example, a user may provide any suitable user input by way of master controls of user control system 104 to select the position of first virtual pointer 702-1. System 300 may use depth data associated with the surgical space to project first virtual pointer 702-1 so as to virtually appear at a depth position within the surgical space that corresponds to a surface of kidney 404. Similarly, the user may provide any suitable user input by way of master controls of user control system 104 to select the position of second virtual pointer 702-2. System 300 may use depth data associated with the surgical space to project second virtual pointer 702-2 so as to virtually appear at a depth position within the surgical space that corresponds to a surface of kidney 404.

Virtual pointers 702 may be provided for display to a user by way of a user interface in any suitable manner. For example, virtual pointers may be provided for display as overlays over a stereoscopic image displayed by user control system 104.

In the example shown in FIGS. 7A and 7B, system 300 may use virtual pointers 702, depth data associated with the surgical space, and/or any other suitable information as parameters to facilitate generating a suggested path 704 shown in FIG. 7B for non-robotic device 402 to follow. This may be accomplished in any suitable manner. For example, system 300 may use depth data to generate a smooth curve between first virtual pointer 702-1 and second virtual pointer 702-2 that follows the surface of kidney 404. System 300 may then project the generated smooth curve onto a depth map of the surgical space to generate suggested path 704. In such examples, system 300 may be configured to discard depth outliers due to, for example, specular reflection or other visual effects to maintain a smooth curve for suggested path 704. In so doing, suggested path 704 may facilitate non-robotic device 402 being moved, via teleoperation, optimally with respect to kidney 404 to, for example, perform subsurface imaging of kidney 404.
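
By way of illustration only, the following sketch outlines one hypothetical way a suggested path could be generated from two virtual pointers and a depth map, including discarding depth outliers and smoothing the resulting curve. The sampling count, outlier test, and smoothing kernel are illustrative assumptions rather than required features.

```python
import numpy as np


def suggested_path(depth_map: np.ndarray,
                   start_uv: tuple,
                   end_uv: tuple,
                   samples: int = 50,
                   outlier_z: float = 2.5) -> np.ndarray:
    """Return an (N, 3) array of (u, v, depth) points between two virtual pointers."""
    u = np.linspace(start_uv[0], end_uv[0], samples)
    v = np.linspace(start_uv[1], end_uv[1], samples)
    d = depth_map[v.astype(int), u.astype(int)].astype(float)

    # Discard depth outliers (e.g., caused by specular reflection) and re-interpolate.
    score = np.abs(d - np.median(d)) / (np.std(d) + 1e-6)
    good = score < outlier_z
    if good.sum() >= 2:
        d = np.interp(np.arange(samples), np.arange(samples)[good], d[good])

    # Smooth the depth profile so the path follows the organ surface without jitter.
    kernel = np.ones(5) / 5.0
    d = np.convolve(np.pad(d, 2, mode="edge"), kernel, mode="valid")
    return np.stack([u, v, d], axis=1)


if __name__ == "__main__":
    fake_depth = np.full((480, 640), 100.0)            # placeholder depth map
    path = suggested_path(fake_depth, (100, 200), (400, 250))
    print(path.shape)                                  # (50, 3)
```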

In the example shown in FIGS. 7A and 7B, virtual pointers 702 are shown as “X”s. However, it is understood that virtual pointers may be represented by other shapes, icons, graphical objects, etc. in other implementations. In addition, suggested path 704 is shown in FIG. 7B as a dotted line for illustrative purposes. It is understood that in certain implementations, guidance content provided to facilitate guided teleoperation of a non-robotic device may not include specifically displaying a suggested path to a user.

Additionally or alternatively, guidance content may include a graphical depiction of a suggested path provided for display by way of a display device associated with a computer-assisted surgical system. To illustrate an example, FIG. 8 shows an image 800 of a surgical space in which a graphical depiction 802 is provided for display by way of a user interface to indicate a suggested path (e.g., suggested path 704) generated by system 300. Graphical depiction 802 may be provided for display by a computer-assisted surgical system in any suitable manner. For example, graphical depiction 802 may be provided for display as an overlay over a stereoscopic image provided for display by user control system 104 of surgical system 100.

In addition to graphical depiction 802 being provided as part of guidance content, system 300 may provide additional guidance content associated with a suggested path, such as by concurrently providing additional guidance content to facilitate a non-robotic device moving along a suggested path. For example, in certain implementations such additional guidance content may include providing a notification to the user of a computer-assisted surgical system that requests user confirmation that the suggested path indicated by, for example, graphical depiction 802 is acceptable. Such a notification may be provided to a user in any suitable manner. For example, system 300 may access an audible notification from a storage device associated with a computer-assisted surgical system. System 300 may instruct the computer-assisted surgical system to display graphical depiction 802 of the suggested path and play back an audio clip with the expression "please confirm that the suggested path is acceptable." The user may then visually examine the suggested path represented by graphical depiction 802 to determine whether the suggested path is free of obstructions and/or is otherwise acceptable. If the user determines that the suggested path is acceptable, the user may provide any suitable response to the audio clip. For example, the user may say "yes" out loud to indicate that the suggested path represented by graphical depiction 802 is acceptable. In such an example, system 300 may use any suitable speech recognition algorithm to detect the response of the user. Additionally or alternatively, system 300 may access any suitable text notification that a computer-assisted surgical system may provide for display to a user to request user confirmation that a suggested path is acceptable.

Additionally or alternatively, guidance content provided by system 300 may include content that facilitates a user moving a non-robotic device along a suggested path. For example, in certain implementations, system 300 may be configured to provide virtual guidance to facilitate a user moving a non-robotic instrument along a suggested path. In certain examples, such virtual guidance may include system 300 providing haptic feedback guidance. Such haptic feedback guidance may be provided in any suitable manner. For example, such haptic feedback guidance may correspond to a virtual fixture such as a haptic feedback tunnel in the surgical space that is configured to guide control of the non-robotic instrument and/or the robotic instrument engaging the non-robotic instrument along a suggested path in the surgical space. With such a haptic feedback tunnel, as the user moves a non-robotic instrument along a suggested path, system 300 may provide haptic feedback in the form of vibration of the master controls of user control system 104 whenever the non-robotic instrument and/or the robotic instrument engaging the non-robotic instrument deviates from the suggested path by more than some predefined threshold amount.
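
By way of illustration only, the following sketch shows one hypothetical form such a haptic feedback tunnel could take, in which master-control vibration is triggered when the instrument deviates from the suggested path by more than a threshold. The vibrate() callback, the threshold value, and the intensity scaling are placeholders introduced for this example.

```python
import numpy as np


def deviation_from_path(tip_xyz: np.ndarray, path_xyz: np.ndarray) -> float:
    """Distance from the instrument tip to the nearest point on the suggested path."""
    return float(np.min(np.linalg.norm(path_xyz - tip_xyz, axis=1)))


def haptic_tunnel_step(tip_xyz, path_xyz, vibrate, threshold_mm: float = 3.0) -> None:
    """Call once per control cycle; vibration scales with how far the tip is outside the tunnel."""
    d = deviation_from_path(np.asarray(tip_xyz, dtype=float), np.asarray(path_xyz, dtype=float))
    if d > threshold_mm:
        vibrate(intensity=min(1.0, (d - threshold_mm) / threshold_mm))


if __name__ == "__main__":
    path = np.stack([np.linspace(0, 10, 50), np.zeros(50), np.zeros(50)], axis=1)
    haptic_tunnel_step([5.0, 4.0, 0.0], path, lambda intensity: print("vibrate", intensity))
```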

In certain examples, it may be helpful for a user to visualize a non-robotic device moving along a suggested path prior to the non-robotic device moving along the suggested path. Accordingly, in certain examples, guidance content generated by system 300 in relation to a suggested path may additionally or alternatively include a simulation of the non-robotic device moving along the suggested path. In certain examples, such a simulation may include a virtual representation of a non-robotic device. As used herein, a “virtual representation of a non-robotic device” may correspond to any suitable indicator that may be used to represent a non-robotic device and/or inform a user of a position, orientation, or pose that a non-robotic device is intended to assume with respect to an object at any point along the suggested path. In certain examples, a virtual representation of a non-robotic device may also include a virtual representation of a robotic device that is engaged with (e.g., that is grasping) the non-robotic device. A virtual representation of a non-robotic device may have any suitable shape, size, and/or visual appearance as may serve a particular implementation. For example, a virtual representation of a non-robotic device may be transparent, translucent, opaque, colored, and/or patterned. In certain examples, a virtual representation of a non-robotic device may have a 3D appearance when displayed by a display device associated with a computer-assisted surgical system. Such a virtual representation of a non-robotic device may be provided for display in any suitable manner. For example, a computer-assisted surgical system may provide the virtual representation as a graphical overlay over an endoscopic view of the surgical space displayed to surgeon 110-1 by way of user control system 104.

To illustrate an example, FIG. 9 shows an image 900 of a surgical space in which robotic instruments 204-1 through 204-3 are provided in relation to kidney 404. As shown in FIG. 9, image 900 also includes a virtual representation 902 of non-robotic device 402 grasped by robotic instrument 204-1. Virtual representation 902 may be provided for display by a computer-assisted surgical system in any suitable manner. In certain examples, virtual representation 902 may be provided for display as an animation that moves along the suggested path represented by graphical depiction 802. Additionally or alternatively, virtual representation 902 may depict an optimal starting pose for robotic instrument 204-1 and non-robotic device 402 to assume with respect to kidney 404 prior to moving along the suggested path. In addition, as virtual representation 902 moves along the suggested path represented by graphical depiction 802, the orientation of virtual representation 902 may change so as to indicate optimal changes in orientation for robotic instrument 204-1 and non-robotic device 402 to assume with respect to kidney 404 along the suggested path. As such, prior to a user performing a surgical procedure such as a subsurface imaging procedure, the user may be able to concurrently visualize a suggested path for a non-robotic device to follow, an optimal movement that the non-robotic device is suggested to take along the suggested path, and/or optimal orientations that the robotic instrument and corresponding engaged non-robotic device are suggested to assume along the suggested path, which may beneficially improve results of subsurface imaging.

In the example shown in FIG. 9, virtual representation 902 is depicted as being visually similar to the actual shapes of robotic instrument 204-1 and non-robotic device 402, except that virtual representation 902 is shown in dashed lines. However, it is understood that virtual representation 902 may have a different visual appearance than a robotic instrument and an engaged non-robotic device in certain implementations. For example, in certain implementations, a virtual representation of a non-robotic device may be more schematic in nature so as to have a different visual appearance than the non-robotic device. For example, a virtual representation of a non-robotic device may be represented by a dotted or dashed line provided in a surgical space in certain implementations. Alternatively, a more schematic version of a virtual representation may have a generic appearance (e.g., a rod-like appearance) in certain implementations.

In certain implementations, system 300 may facilitate user teleoperation of a non-robotic device within a surgical space with various levels of autonomy. For example, in certain implementations, while a user moves a robotic instrument within a surgical space to control, for example, a grasped non-robotic device, system 300 may provide guidance content to automatically adjust a pose of the non-robotic device to improve performance of a surgical procedure. For example, system 300 may perform an image-based visual servoing operation in certain examples to automatically make adjustments to the pose of a non-robotic imaging device to improve image quality. Such an image-based visual servoing operation may help ensure that a non-robotic imaging device such as a drop-in ultrasound probe maintains a desired position and/or orientation with respect to an object in the surgical space. In certain examples, the maintaining of a desired position and/or orientation may include maintaining an amount of pressure and/or a desired contact angle with respect to an object in a surgical space (e.g., to capture adequate imagery).

System 300 may perform an image-based visual servoing operation in any suitable manner. To illustrate, FIG. 10 shows exemplary operations that may be performed by system 300 when performing an image-based visual servoing operation in certain implementations. In operation 1002, system 300 may analyze an image captured by a non-robotic imaging device of an object in a surgical space. System 300 may analyze the captured image in any suitable manner. For example, system 300 may use any suitable image processing technique to analyze the captured image.

In operation 1004, system 300 may determine whether the captured image includes an image capture deficiency. An image capture deficiency may correspond to any suboptimal attribute of an image captured by a non-robotic imaging device. For example, in implementations where a non-robotic imaging device corresponds to a drop-in ultrasound probe, a particular contact state of the non-robotic imaging device with respect to an object (e.g., tissue such as a kidney) may result in an image capture deficiency. For example, a no contact state or a partial contact state of the non-robotic imaging device with respect to an object may cause an image capture deficiency. Additionally or alternatively, too much pressure of the drop-in ultrasound probe into an object may cause an image capture deficiency. Additionally or alternatively, not enough pressure of the drop-in ultrasound probe into the object may cause an image capture deficiency. In such examples, the image content of the captured image may indicate that there is not enough contact, too much surface contact pressure, not enough surface contact pressure, or a suitable amount of surface contact pressure of the drop-in ultrasound probe with respect to the object. Accordingly, in certain implementations, system 300 may determine that there is an image capture deficiency based on an amount of image content in a captured image.

System 300 may determine whether a captured image includes an image capture deficiency in any suitable manner. For example, in instances where the image capture deficiency is associated with a partial contact state, system 300 may perform any suitable image processing operation to detect image velocity vectors in the captured image. Such image velocity vectors may indicate a boundary between an object (e.g., a tissue wall) and air space adjacent to the object. The larger the image velocity vectors, the less the non-robotic imaging device may be in contact with the object. As such, image velocity vectors may be used to determine whether a captured image includes an image capture deficiency due to a partial contact state.
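
By way of illustration only, the following sketch, which assumes the availability of the OpenCV library, shows one hypothetical way image velocity vectors could be estimated between consecutive frames and used to flag a possible partial-contact deficiency. The optical-flow parameters and the flow threshold are illustrative assumptions.

```python
import cv2
import numpy as np


def partial_contact_suspected(prev_frame: np.ndarray,
                              curr_frame: np.ndarray,
                              flow_threshold: float = 2.0) -> bool:
    """Flag a possible partial-contact deficiency from dense optical flow between frames."""
    # Dense optical flow (pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags).
    flow = cv2.calcOpticalFlowFarneback(prev_frame, curr_frame, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    # Large velocity vectors near the tissue/air boundary suggest the probe is only
    # partially coupled to the tissue surface.
    return float(magnitude.mean()) > flow_threshold


if __name__ == "__main__":
    a = np.zeros((128, 128), dtype=np.uint8)
    b = np.roll(a, 3, axis=1)
    print(partial_contact_suspected(a, b))
```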

In certain implementations, a non-robotic device may require a specific angle of contact with an object in a surgical space to provide reliable measurements. For example, a non-robotic imaging device may require maintaining tissue contact along a surface normal (e.g., within a threshold tolerance of 90° with respect to the surface) with respect to a surface of an object to provide suitable subsurface imaging. In addition, maintaining tissue contact along a surface normal may provide information regarding an angular offset of a captured image with respect to subsurface anatomical structures. However, contact of a non-robotic imaging device with respect to a surface of an object at an angle other than a surface normal may result in such information not being available and, as such, may result in an image capture deficiency in certain implementations.

If the answer to operation 1004 is “NO”, system 300 may return to operation 1002 and analyze an additional image of an object captured by the non-robotic imaging device. However, if the answer in operation 1004 is “YES”, system 300 may perform an action to facilitate correcting the image capture deficiency in operation 1006.

In certain examples, the performing of the action in operation 1006 may include system 300 automatically adjusting a pose of a non-robotic device while a user teleoperates the non-robotic device. For example, in operation 1008, system 300 may automatically adjust at least one of a position or an orientation of a non-robotic device. To illustrate, when a non-robotic imaging device is in a partial contact state with respect to an object in a surgical space, system 300 may perform a closed-loop feedback operation in which system 300 uses detected image velocity vectors in a captured image to automatically adjust at least one of a position or an orientation of a non-robotic imaging device. For example, system 300 may perform any suitable image processing operation to detect image velocity vectors in a captured image. Based on the image velocity vectors, system 300 may automatically move the non-robotic imaging device in a direction with respect to an object that would result in the non-robotic imaging device more fully contacting the object. After automatically moving the non-robotic imaging device, system 300 may detect additional image velocity vectors in an additional captured image and automatically move the non-robotic imaging device again based on the additional image velocity vectors to further increase the amount of contact of the non-robotic imaging device with respect to the object. System 300 may automatically repeat such operations any suitable number of times so as to minimize the image velocity vectors and ensure that the non-robotic imaging device maintains an acceptable amount of contact with respect to the object.
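
By way of illustration only, the following sketch outlines the closed-loop feedback idea described above. The capture_frame(), estimate_velocity_vectors(), and move_probe() helpers are hypothetical placeholders for device-specific and system-specific operations, and the gain and tolerance values are illustrative only.

```python
import numpy as np


def servo_toward_full_contact(capture_frame, estimate_velocity_vectors, move_probe,
                              max_iters: int = 20,
                              tolerance: float = 0.1,
                              gain: float = 0.5) -> None:
    """Iteratively nudge the probe until residual image motion falls below a tolerance."""
    prev = capture_frame()
    for _ in range(max_iters):
        curr = capture_frame()
        flow = estimate_velocity_vectors(prev, curr)   # assumed (H, W, 2) flow field
        mean_flow = flow.reshape(-1, 2).mean(axis=0)   # dominant residual image motion
        if np.linalg.norm(mean_flow) < tolerance:
            break                                      # contact is considered acceptable
        # Nudge the probe opposite the residual image motion to increase contact.
        move_probe(delta_xy=-gain * mean_flow)
        prev = curr
```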

Additionally or alternatively, system 300 may automatically adjust at least one of an amount of contact pressure or a contact angle of a non-robotic imaging device with respect to an object (e.g., by automatically manipulating a robotic instrument engaged with the non-robotic instrument and/or by adjusting one or more operational settings of the non-robotic instrument). System 300 may be configured to continually perform such automatic adjustments while the user moves the non-robotic device within the surgical space. In certain examples, such automatic adjustments may be relatively small in nature. As such, the automatic adjustments of the pose of a non-robotic device in operation 1008 may be transparent to a user as the user teleoperates the non-robotic device in certain implementations.

Additionally or alternatively, the performing of the action in operation 1006 may include providing guidance content in the form of a notification to a user of a computer-assisted surgical system that instructs the user to take action to correct the image capture deficiency. For example, in operation 1010, system 300 may provide any suitable notification, such as those described herein, to a user that instructs the user to adjust at least one of a position or an orientation of an object in the surgical space. In certain examples, such a notification may further instruct the user to adjust at least one of an amount of contact pressure or a contact angle of a non-robotic imaging device with respect to an object in the surgical space.

In certain examples, system 300 may be configured to provide guidance content in the form of a notification to a user to facilitate the user maintaining a specific angle of contact of a non-robotic device with an object in a surgical space. Such guidance content may be provided for display to a user in any suitable manner. To illustrate an example, FIG. 11 shows an image 1100 of a surgical space in which robotic instruments 204-1 through 204-3 are provided in relation to kidney 404. As shown in FIG. 11, graphical indicators 1102 (e.g., graphical indicators 1102-1 through 1102-5) are provided for display in addition to the suggested path represented by graphical depiction 802. Graphical indicators 1102 are configured to depict a surface normal with respect to a particular portion of the surface of kidney 404 associated with each graphical indicator 1102.

System 300 may determine the surface normal of an object in a surgical space in any suitable manner. For example, system 300 may determine the surface normal based on depth data (e.g., a depth map) associated with a surgical space. Additionally or alternatively, system 300 may determine the surface normal associated with each graphical indicator (e.g., graphical indicators 1102) based on a 3D model of an object to be imaged. For example, in FIG. 11, a 3D model of kidney 404 may be used in certain implementations to determine the surface normal of any given portion of the surface of kidney 404.
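
By way of illustration only, the following sketch shows one hypothetical way a surface normal could be estimated at a pixel of a depth map using local depth gradients, which could then be used to orient graphical indicators such as graphical indicators 1102. The depth map and pixel coordinates are assumed inputs for this example.

```python
import numpy as np


def surface_normal(depth_map: np.ndarray, u: int, v: int) -> np.ndarray:
    """Unit surface normal at pixel (u, v) from central differences of a depth map."""
    dz_du = (depth_map[v, u + 1] - depth_map[v, u - 1]) / 2.0
    dz_dv = (depth_map[v + 1, u] - depth_map[v - 1, u]) / 2.0
    n = np.array([-dz_du, -dz_dv, 1.0])
    return n / np.linalg.norm(n)


if __name__ == "__main__":
    vv, uu = np.mgrid[0:100, 0:100]
    tilted_surface = 0.1 * uu                          # a plane tilted along the u axis
    print(surface_normal(tilted_surface.astype(float), 50, 50))
```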

In the example shown in FIG. 11, five graphical indicators 1102 are shown. However, it is understood that any suitable number of graphical indicators may be provided for display as may serve a particular implementation. In addition, graphical indicators 1102 are shown as having an arrow shape in FIG. 11. However, graphical indicators 1102 may have any suitable shape, size, and/or configuration as may serve a particular implementation. In certain examples, graphical indicators 1102 may have a three-dimensional appearance to facilitate accurately indicating which direction the suggested contact angle extends within the surgical space with respect to the surface of kidney 404.

After system 300 generates guidance content such as described herein, system 300 may provide the guidance content to a computer-assisted surgical system (e.g., surgical system 100). This may be accomplished in any suitable manner. For example, system 300 may transmit the guidance content in any suitable manner (e.g., by way of a wired and/or a wireless connection) by way of any suitable communication interface associated with a computer-assisted surgical system. The computer-assisted surgical system may use the guidance content in any suitable manner, such as described herein, to facilitate guided teleoperation of a non-robotic device (e.g., while the non-robotic device is grasped by a robotic instrument) in a surgical space.

FIG. 12 illustrates an exemplary method for facilitating guided teleoperation of a non-robotic device in a surgical space. While FIG. 12 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the operations shown in FIG. 12. One or more of the operations shown in FIG. 12 may be performed by a system such as system 300, any components included therein, and/or any implementation thereof.

In operation 1202, a processor (e.g., a processor implementing processing facility 302) associated with a computer-assisted surgical system (e.g., surgical system 100) may obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system. As described herein, in certain examples, the non-robotic device may be grasped by a robotic instrument that is attached to a computer-assisted surgical system. Operation 1202 may be performed in any of the ways described herein.

In operation 1204, the processor may generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by a computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device. As described herein, in certain examples, the guided teleoperation of the non-robotic device may occur while the non-robotic device is grasped by a robotic instrument. In certain alternative examples, the guided teleoperation may occur while the non-robotic device is otherwise engaged with (e.g., locked in position with respect to) the robotic instrument in any suitable manner such as described herein. Operation 1204 may be performed in any of the ways described herein.

In operation 1206, the processor may provide the guidance content to a computer-assisted surgical system. Operation 1206 may be performed in any of the ways described herein.

Although the preceding disclosure describes operations that facilitate guided teleoperation of a non-robotic device that is, for example, engaged by a robotic instrument, it is understood that system 300 may also perform various operations associated with identification of a target object in a surgical space, identification of a robotic instrument to be used to interact with a target object, and/or facilitating a robotic instrument interacting with (e.g., grasping or otherwise engaging) a target object. As used herein, a “target object” may refer to any object that may be located in a surgical space. For example, a target object may correspond to a non-robotic device located in the surgical space, a robotic instrument located within the surgical space, or any other object or instrument that may be located in a surgical space.

To that end, in certain examples, system 300 may be configured to determine whether a target object is located in a surgical space. System 300 may determine whether a target object is located in a surgical space in any suitable manner. For example, in certain implementations, system 300 may use vision-based image processing techniques (e.g., computer vision techniques) to determine whether a target object is located in a surgical space. In such examples, system 300 may be configured to use any suitable vision-based image processing technique to track and identify one or more objects and/or types of objects (e.g., robotic instruments, non-robotic devices, tissue, etc.) within a surgical space. Such vision-based image processing techniques may include system 300 using imaging device 202 to capture imagery (e.g., one or more images) of the surgical space. System 300 may use the captured imagery as input for the vision-based image processing techniques to determine information associated with the objects in the surgical space. For example, system 300 may use the captured imagery to determine, in any suitable manner, whether a target object is located in the surgical space. In addition, system 300 may use the captured imagery and any suitable vision-based image processing technique to determine the size, the shape, the pose, and/or the number of objects located in the surgical space. In certain examples, any object in the surgical space other than a robotic instrument may be considered as a candidate for being identified as a target object located in the surgical space.

Additionally or alternatively, system 300 may be configured to determine whether a target object is located in a surgical space based on a depth map of the surgical space. System 300 may be configured to use a depth map in any suitable manner. For example, system 300 may detect a difference between a current depth map of the surgical space and one or more previous depth maps of the surgical space. Based on the detected difference, system 300 may identify known object shapes, known patterns of object shapes (e.g., insertion patterns), and/or any other suitable information that may be indicative of a target object that is either located in a surgical space and/or that is in the process of being inserted in the surgical space. System 300 may use any suitable number of depth maps to determine whether a target object is located in a surgical space as may serve a particular implementation. In certain examples, system 300 may compare a sequence of previous-frame depth maps to a current-frame depth map to determine whether a target object is located in a surgical space. In certain examples, system 300 may be configured to continually monitor a depth map of a surgical space to determine in real time whether a target object is located in a surgical space.

Exemplary operations that may be performed by system 300 when using a depth map to determine whether a target object is located in a surgical space may include obtaining a depth map of a surgical space. System 300 may obtain the depth map of the surgical space in any suitable manner. For example, system 300 may use imaging device 202 to capture depth data, which system 300 may then use to generate a depth map of the surgical space in any suitable manner. Alternatively, system 300 may receive the depth map from any suitable source.

System 300 may extract a representation of an object from the depth map. A representation of an object may have any suitable format as may serve a particular implementation. For example, a representation of an object may correspond to a surface contour of an object, a volumetric reconstruction of an object (e.g., a point cloud of the object), an outer contour shape of an object, etc. System 300 may extract the representation of the object from the generated depth map in any suitable manner. For example, system 300 may subtract a previous depth map of the surgical space from the current depth map of the surgical space that includes the object. The depth map data that remains after such a subtraction may be representative of the object in the surgical space. As another example, system 300 may segment the depth map by classifying points in the depth map as being associated with particular objects or types of objects. Points that are labeled as corresponding to the object may be extracted as a representation of the object.
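
By way of illustration only, the following sketch shows one hypothetical way a representation of an object could be extracted by subtracting a previous depth map from a current depth map and keeping the changed region as a set of points. The change threshold is an illustrative assumption.

```python
import numpy as np


def extract_object_points(current_depth: np.ndarray,
                          previous_depth: np.ndarray,
                          change_threshold: float = 5.0) -> np.ndarray:
    """Return an (N, 3) array of (u, v, depth) points where the depth map changed."""
    changed = np.abs(current_depth - previous_depth) > change_threshold
    v, u = np.nonzero(changed)
    return np.stack([u, v, current_depth[v, u]], axis=1)


if __name__ == "__main__":
    previous = np.full((480, 640), 120.0)
    current = previous.copy()
    current[200:240, 300:360] = 90.0                   # a newly inserted object
    print(extract_object_points(current, previous).shape)
```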

System 300 may compare the extracted representation of the object to a plurality of representations of known target objects. This may be accomplished in any suitable manner. For example, system 300 may access data representative of the plurality of known target objects from storage facility 304. System 300 may then compare the extracted representation to at least some of the representations included in the plurality of representations of known target objects. Based on the comparison, system 300 may use any suitable image processing technique to determine a degree of similarity between the extracted representation and at least some of the representations included in the plurality of representations of known target objects. In certain examples, system 300 may compare the extracted representation to each of the representations included in the plurality of representations of known target objects.

System 300 may identify, from the plurality of representations of known target objects, a representation of a known target object that matches the extracted representation of the object. System 300 may determine whether there is a match between the extracted representation of the object and a representation of a known target object in any suitable manner. For example, system 300 may determine that there is a match when a degree of similarity between the extracted representation and a representation of a known target object is above a predefined threshold amount. To illustrate, system 300 may determine that there is a match if the degree of similarity between the extracted representation and the representation of the known target object is above 95%. Such a percentage degree of similarity may be determined in any suitable manner.

In certain alternative implementations, system 300 may use image subtraction to determine whether there is a match between the extracted representation and a representation of a known target object. In such examples, system 300 may obtain image data that corresponds to the depth positions in the extracted representation. System 300 may also obtain image data of the representation of the known target object. System 300 may then subtract pixel values of pixels in the image data of the extracted representation from pixel values of similarly positioned pixels in the image data of the representation of the known target object. When the result of such image subtraction is zero or almost zero, the extracted representation and the representation of a known target object may be considered as being a perfect match. However, system 300 may be configured to determine that there is a match between the extracted representation and the representation of a known target object as long as the subtracted result is within some predefined threshold from zero.
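
By way of illustration only, the following sketch shows one hypothetical way the comparison and matching described above could be performed if extracted representations and known target object representations are rasterized into same-sized binary masks. The overlap-based similarity measure and the 95% threshold are treated here as configurable assumptions.

```python
import numpy as np


def similarity(rep_a: np.ndarray, rep_b: np.ndarray) -> float:
    """Intersection-over-union of two binary masks as a stand-in similarity measure."""
    intersection = np.logical_and(rep_a, rep_b).sum()
    union = np.logical_or(rep_a, rep_b).sum()
    return float(intersection / union) if union else 0.0


def match_known_target(extracted: np.ndarray,
                       known_targets: dict,
                       threshold: float = 0.95):
    """Return the name of the best-matching known target object, or None if no match."""
    best_name, best_score = None, 0.0
    for name, mask in known_targets.items():
        score = similarity(extracted, mask)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```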

Based on the identified match, system 300 may identify the object as a target object located in the surgical space.

In certain examples, the determination that a target object is located in a surgical space may include system 300 determining what type of target object is located in the surgical space. System 300 may determine the type of the target object located in a surgical space in any suitable manner. For example, system 300 may access an image of a surgical space (e.g., an image captured by imaging device 202).

System 300 may extract an image of a non-robotic device from the captured image. System 300 may extract the image in any suitable manner using any suitable image processing technique. For example, system 300 may use computer vision techniques and image segmentation to locate boundaries (lines, curves, etc.) of the non-robotic device in the captured image to determine a representation of the non-robotic device. In certain examples, such a representation may correspond to an outer contour shape of the non-robotic device and/or any other suitable representation, such as those described herein. An outer contour shape of a target object such as a non-robotic device may define a profile of a perimeter of the target object when viewed from a particular viewpoint within the surgical space. Based on the representation of the non-robotic device, system 300 may extract the image of the non-robotic device from a remainder of the captured image.

System 300 may compare a representation of the non-robotic device to a plurality of representations (e.g., a plurality of outer contour shapes) of a plurality of known non-robotic devices. Each representation included in the plurality of representations of the plurality of known non-robotic devices may represent a different type of non-robotic device. For example, a first representation included in the plurality of representations of known non-robotic devices may be representative of a first type of non-robotic device, a second representation may be representative of a second type of non-robotic device, and a third representation may be representative of a third type of non-robotic device.

System 300 may compare the representation of the non-robotic device to the plurality of representations of a plurality of known non-robotic devices in any suitable manner. For example, system 300 may compare the representation to the first representation, the second representation, and the third representation that are each included in the plurality of representations of the plurality of known non-robotic devices. Based on the comparison, system 300 may determine, in any suitable manner, a degree of similarity between the representation of the non-robotic device and each of the first, second, and third representations.

From the plurality of representations of the plurality of known target objects, system 300 may select a representation that matches the representation of the non-robotic device. System 300 may determine that the selected representation matches the representation of the non-robotic device in any suitable manner. Continuing with the example described above, system 300 may determine that the degree of similarity between the representation of the non-robotic device and the third representation is relatively higher than the degree of similarity between the representation of the non-robotic device and the first and second representations. Accordingly, system 300 may select the third representation as matching the representation of the non-robotic device.

System 300 may determine the type of the non-robotic device based on the selected matching representation. This may be accomplished in any suitable manner. For example, continuing with the example described above, system 300 may determine that the type of the non-robotic device corresponds to the type of non-robotic device represented by the third representation included in the plurality of representations of known non-robotic devices.

In certain alternative examples, system 300 may be configured to determine that a target object is located in a surgical space based on information provided by a user of surgical system 100. For example, assistant 110-2, nurse 110-3, and/or any other individual associated with a surgical procedure may place a target object within a surgical space and then enter information, in any suitable manner, that indicates that the target object is in the surgical space. In certain examples, such information may also indicate the type of target object, an insertion location of the target object, and/or any other suitable information associated with the target object.

In certain examples, system 300 may be configured to provide a notification to a user (e.g., surgeon 110-1) indicating that a target object has been detected in a surgical space and/or providing any suitable information to the user to facilitate an intended interaction (e.g., a grasping interaction) with the target object. System 300 may provide such a notification in any suitable manner. For example, system 300 may provide a text-based notification in an interface displayed by way of the stereoscopic image viewer of user control system 104. Such a text-based notification may inform the user that a target object has been detected in the surgical space and may provide any other suitable information associated with the target object. Additionally or alternatively, system 300 may be configured to provide any suitable audible notification indicating that a target object has been detected in a surgical space. For example, an exemplary audible notification may include system 300 playing an audio clip with the expression "An ultrasound probe has been inserted."

In certain examples, system 300 may be configured to detect an intent of a user of a computer-assisted surgical system to use a robotic instrument attached to the computer-assisted surgical system to interact with the target object while the target object is located in the surgical space. System 300 may detect the intent of the user to interact with a target object in any suitable manner. In certain examples, system 300 may provide a notification to the user prompting the user to indicate whether the user intends to interact with a target object. For example, system 300 may provide a notification, in any suitable manner, to the user indicating that a target object has been detected in the surgical space. Such a notification may also inquire, in any suitable manner, whether the user intends to interact with the target object. System 300 may then detect any suitable user input that may be provided by the user to indicate the intent of the user to interact with a target object.

For example, system 300 may detect the intent of the user to interact with a target object by detecting a voice-based command provided by a user (e.g., surgeon 110-1, assistant 110-2, etc.) of surgical system 100. System 300 may detect a voice-based command in any suitable manner using any suitable speech recognition algorithm. In certain examples, system 300 may store (e.g., through storage facility 304) one or more predefined voice-based commands that are configured to cause system 300 to determine that the user intends to interact with a target object. For example, the expressions “I want to use ultrasound,” “pick up ultrasound probe,” etc. may correspond to exemplary pre-defined voice-based commands that system 300 may be configured to use to determine the intent of the user to interact with a target object that corresponds to a drop-in ultrasound probe.

Additionally or alternatively, system 300 may detect a gesture-based command provided by a user of the computer-assisted surgical system. Such a gesture-based command may include any suitable input that may be provided by way of any suitable user interface associated with surgical system 100. For example, system 300 may detect a gesture-based command provided by way of surgeon 110-1 manipulating master controls of user control system 104 (e.g., one or more commands that cause a robotic instrument to move toward and/or within a threshold distance of the target object). Additionally or alternatively, system 300 may detect a gesture-based command provided by way of an input (e.g., a touch input, a mouse cursor input, etc.) with respect to display monitor 114 or any other device that may be communicatively coupled to surgical system 100.

In certain examples, system 300 may detect the intent of a user to interact with a target object by detecting a gaze-based command provided by the user. Such a gaze-based command may be detected by system 300 in any suitable manner. For example, system 300 may be configured to access, in any suitable manner, images generated by an imaging device provided within the stereoscopic image viewer of user control system 104. Based on the generated images, system 300 may determine a gaze point of the user's eye by determining a positional relationship between the pupil of the user's eye and a corneal reflection caused by infrared light provided by an infrared light source within user control system 104. System 300 may then infer the gaze point of the user's eye in any suitable manner based on the determined positional relationship.

When the gaze point of the user's eye dwells on the target object for a predetermined amount of time, system 300 may determine that the user of the computer-assisted surgical system intends to interact with the target object. The predetermined amount of time may correspond to any suitable amount of time that may be used to determine the intent of the user. For example, the predetermined amount of time may correspond to three seconds in certain implementations. In such an example, whenever the user's gaze point dwells on the target object for three or more seconds, system 300 may determine that the user intends to interact with the target object.
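
By way of illustration only, the following sketch shows one hypothetical way a dwell-based gaze check could be implemented, in which intent is inferred once the estimated gaze point remains within the target object's screen region for the predetermined amount of time (three seconds in the example above). The bounding-box representation of the target object's screen region is an assumption of this example.

```python
import time


class GazeDwellDetector:
    def __init__(self, dwell_seconds: float = 3.0):
        self.dwell_seconds = dwell_seconds
        self._dwell_start = None

    def update(self, gaze_xy, target_bbox) -> bool:
        """Call once per gaze sample; returns True once the dwell time has been met."""
        x, y = gaze_xy
        x0, y0, x1, y1 = target_bbox                   # target object's screen region
        on_target = x0 <= x <= x1 and y0 <= y <= y1
        now = time.monotonic()
        if not on_target:
            self._dwell_start = None                   # reset when the gaze leaves the target
            return False
        if self._dwell_start is None:
            self._dwell_start = now
        return (now - self._dwell_start) >= self.dwell_seconds
```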

Additionally or alternatively, system 300 may be configured to detect the intent of the user based on a procedural context associated with a surgical space. To illustrate an example, a procedural context associated with a surgical space may be associated with use of a drop-in ultrasound probe within the surgical space. In such an example, system 300 may determine that a user intends to interact with the ultrasound probe based on a captured image of the surgical space that shows the ultrasound probe being present within the surgical space. In another example, a procedural context associated with a surgical space may be associated with a suturing operation to be performed in the surgical space. In such an example, system 300 may determine that a user intends to interact with a non-robotic device such as a suture needle based on the combination of a suture needle being detected in the surgical space and a needle driver robotic instrument being located in the surgical space and grasping the suture needle.

System 300 may detect the intent of the user to interact with a target object at any suitable time. For example, system 300 may detect the intent of the user after system 300 determines that the target object is located in the surgical space. Alternatively, system 300 may detect the intent of the user to interact with the target object before system 300 determines that the target object is located in the surgical space.

System 300 may further be configured to determine a pose of a target object within a surgical space. System 300 may determine the pose of a target object in any suitable manner. For example, the pose of a target object may be determined based on a combination of depth data (e.g., provided in a depth map of a surgical space) and a determined orientation of the target object within the surgical space. Exemplary ways that system 300 may determine an orientation of a target object will now be described.

In certain examples, system 300 may determine an orientation of a target object by using a 3D model of the target object. System 300 may use a 3D model of a target object in any suitable manner to facilitate determining an orientation of the target object. For example, system 300 may be configured to access an image of a target object in a surgical space (e.g., an image captured by imaging device 202). System 300 may be configured to determine a representation (e.g., an outer contour shape) of the target object from a viewpoint of the imaging device. System 300 may use any suitable image processing algorithm to determine the representation of the target object. System 300 may compare the representation of the target object to a 2D projection of a 3D model of the target object that is oriented in a known orientation. System 300 may be configured to determine a projection error between the representation of the target object and the 2D projection of the 3D model. The projection error may correspond to any quantifiable metric that is indicative of a difference between an orientation of a representation of a target object and an orientation of a 2D projection of a 3D model. The greater the projection error, the less likely that the target object is oriented in the known orientation. As such, system 300 may determine that the target object is not in the known orientation when the projection error is above a predefined threshold.

System 300 may be configured to determine whether the projection error is less than a predefined threshold. If system 300 determines that the projection error is less than the predefined threshold, system 300 may then identify the target object as being oriented in the known orientation. On the other hand, if system 300 determines that the projection error is not less than the predefined threshold, system 300 may change the orientation of the 3D model and generate an additional 2D projection of the 3D model of the target object that is oriented in an additional known orientation. System 300 may then determine an additional projection error between the representation of the target object and the additional 2D projection of the 3D model. System 300 may then repeat an operation to determine whether the additional projection error is less than the predefined threshold. System 300 may repeat such operations until the orientation of the target object is determined.
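
By way of illustration only, the following sketch outlines the projection-error search described above. The project_model() and contour_error() helpers, the candidate orientation set, and the error threshold are hypothetical placeholders for implementation-specific choices.

```python
def estimate_orientation(observed_contour, model, candidate_orientations,
                         project_model, contour_error, error_threshold: float = 0.1):
    """Return the first candidate orientation whose projection error is below the threshold."""
    for orientation in candidate_orientations:
        projection = project_model(model, orientation)       # 2D projection of the 3D model
        if contour_error(observed_contour, projection) < error_threshold:
            return orientation
    return None  # orientation not determined; the caller may refine the candidate set
```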

In addition to system 300 determining the orientation of the target object, system 300 may determine the position of the target object within the surgical space. This may be accomplished in any suitable manner. For example, system 300 may use depth data and/or any other suitable data to determine the position of the target object within the surgical space. System 300 may then determine the pose of the target object within the surgical space based on the combination of the determined orientation of the target object and the determined position of the target object within the surgical space.

In certain examples, system 300 may determine a pose of a target object based on the position of the target object within the surgical space and an orientation of one or more markers provided on an outer surface of a target object. In such examples, a particular orientation of one or more markers when viewed from a particular viewpoint may be indicative of a particular orientation of the target object within the surgical space. For example, two markers may be provided on an outer surface of a non-robotic device. A first orientation of the two markers may be indicative of a first orientation of the non-robotic device, a second orientation of the two markers may be indicative of a second orientation of the non-robotic device, and a third orientation of the two markers may be indicative of a third orientation of the non-robotic device. System 300 may detect whether the two markers are in the first orientation, the second orientation, or the third orientation in any suitable manner. For example, system 300 may analyze an image of the surgical space in any suitable manner to determine the orientation of the two markers from a particular viewpoint.

In certain examples, the one or more markers may also be used to identify the type of a non-robotic device located in a surgical space. For example, a particular type of a marker, position of a marker, combination of markers, and/or configuration of a marker may indicate the type of the non-robotic device. A marker provided on an outer surface of a target object may have any suitable configuration as may serve a particular implementation.

System 300 may detect the pose of a target object within a surgical space at any suitable time. In certain examples, system 300 may detect the pose of the target object within a surgical space after detecting the intent of a user to interact with the target object. Alternatively, system 300 may detect the pose after or concurrently with the detection of the target object being located in the surgical space.

In certain examples, the pose of a target object may change during a surgical procedure performed with respect to a surgical space. When the target object changes position and/or orientation within the surgical space, system 300 may determine an updated pose of the target object in the surgical space in any suitable manner, such as described herein. In certain examples, system 300 may be configured to continually monitor and update the pose of a target object during a surgical procedure. Alternatively, system 300 may periodically determine an updated pose of a target object.

In certain examples, system 300 may determine a pose that a robotic instrument is intended to assume to interact with a target object. System 300 may determine the pose that a robotic instrument is intended to assume in any suitable manner. For example, system 300 may access a database that includes a set of candidate orientations for the robotic instrument to assume to facilitate the robotic instrument interacting with the target object. Such a database may include any suitable number of candidate orientations as may serve a particular implementation. In certain examples, the database may include a plurality of candidate orientations for each possible orientation that a target object may have in a surgical space. For example, a first orientation of a target object in a surgical space may be associated with a first candidate orientation, a second candidate orientation, and a third candidate orientation of a robotic instrument. A second orientation of a target object in the surgical space may be associated with a fourth candidate orientation, a fifth candidate orientation, and a sixth candidate orientation of the robotic instrument. In such examples, system 300 may determine the orientation of the target object in any suitable manner. System 300 may then select the corresponding candidate orientations from the database that are associated with the determined orientation of the target object as being possible orientations for a robotic instrument to assume. Such a database may be maintained by storage facility 304 and/or may be maintained by any suitable storage device accessible by system 300.

System 300 may select an orientation from the set of candidate orientations included in the database. System 300 may select the orientation in any suitable manner. For example, system 300 may analyze a current pose (e.g., orientation and position) of a target object within the surgical space. Based on the current pose of the target object, system 300 may determine that the set of candidate orientations includes a first candidate orientation, a second candidate orientation, and a third candidate orientation that the robotic instrument may assume to facilitate interacting with the target object. System 300 may then select which of the first, second, or third candidate orientations of the robotic instrument included in the database is most conducive to the robotic instrument interacting with the target object. For example, system 300 may determine, in any suitable manner, that the first candidate orientation is easier for a user to achieve (e.g., based on the current orientation of the robotic instrument in the surgical space), results in a better interaction (e.g., a stronger grasp) with the target object, and/or results in better visibility in the surgical space than the second and third candidate orientations. Accordingly, system 300 may select the first candidate orientation as the orientation to be used for the robotic instrument to interact with the target object.
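
As a rough illustration of the database lookup and selection just described, the sketch below keys hypothetical candidate instrument orientations by the detected target-object orientation and picks the candidate with the best weighted score for ease of reach, grasp quality, and visibility. The candidate entries, score values, and weights are all placeholders, not values from the disclosure.

```python
# Minimal sketch, assuming a hypothetical in-memory database of candidate
# instrument orientations keyed by the detected target-object orientation.
CANDIDATE_DB = {
    "first_orientation": [
        {"name": "candidate_1", "ease": 0.9, "grasp": 0.7, "visibility": 0.8},
        {"name": "candidate_2", "ease": 0.6, "grasp": 0.9, "visibility": 0.5},
        {"name": "candidate_3", "ease": 0.4, "grasp": 0.6, "visibility": 0.9},
    ],
    "second_orientation": [
        {"name": "candidate_4", "ease": 0.8, "grasp": 0.8, "visibility": 0.6},
        {"name": "candidate_5", "ease": 0.5, "grasp": 0.7, "visibility": 0.9},
        {"name": "candidate_6", "ease": 0.7, "grasp": 0.5, "visibility": 0.7},
    ],
}

WEIGHTS = {"ease": 0.4, "grasp": 0.4, "visibility": 0.2}  # illustrative weights

def select_candidate_orientation(target_orientation):
    """Return the highest-scoring candidate orientation for the detected target orientation."""
    candidates = CANDIDATE_DB.get(target_orientation, [])
    if not candidates:
        return None
    return max(candidates, key=lambda c: sum(WEIGHTS[k] * c[k] for k in WEIGHTS))

print(select_candidate_orientation("first_orientation")["name"])
```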

System 300 may select a position for the robotic instrument to assume within the surgical space in relation to the target object. System 300 may select the position for the robotic instrument to assume in any suitable manner. For example, system 300 may analyze depth data associated with the surgical space to determine a relative pose within the surgical space of the target object and/or other objects (e.g., anatomy, other robotic instruments, etc.). Based on the depth data, the selected orientation that the robotic instrument is intended to assume, and/or any other suitable information, system 300 may select a position for the robotic instrument to assume while the robotic instrument is in the selected orientation.

System 300 may determine the pose that the robotic instrument is intended to assume based on the selected orientation and the selected position.

In certain examples, system 300 may determine a pose that a robotic instrument is intended to assume based on a type of the target object located in the surgical space. To illustrate an example, a non-robotic device may correspond to a type of target object that is best grasped from a direction that is perpendicular to a lengthwise extension of the non-robotic device. Accordingly, system 300 may select an orientation for a robotic instrument to assume that is perpendicular to the lengthwise extension of the non-robotic device. In addition, the intended use of a non-robotic device may require maintaining visualization of the surface of an object (e.g., kidney 404) in a surgical space. Accordingly, system 300 may select the orientation of the robotic instrument so as to maximize viewability of the surface of the object during a surgical procedure (e.g., while robotic instrument 204-1 grasps and moves non-robotic device 402 within the surgical space). For example, the selected orientation and selected position may place the robotic instrument off to a side of the object so as to maximize viewability of the surface of the object.

In certain examples, system 300 may be configured to generate a set of candidate orientations for a robotic instrument to assume to facilitate the robotic instrument interacting with a target object. System 300 may generate the set of candidate orientations in any suitable manner. For example, in certain implementations, system 300 may use machine learning to generate the set of candidate orientations and/or perform any other operation described herein.

In certain examples, system 300 may use a supervised machine learning algorithm to generate a database of candidate orientations for a robotic instrument. In such examples, the training inputs to the supervised machine learning algorithm may include a plurality of images of a surgical space that include labeled orientations of robotic instruments in relation to target objects. System 300 may use the supervised machine learning algorithm in any suitable manner during a training phase to analyze the plurality of images of one or more surgical spaces with the labeled orientations of robotic instruments. After the training phase, system 300 may obtain, as an operational input, an image of a surgical space in which a target object is located. System 300 may use the supervised machine learning algorithm in any suitable manner (e.g., by using a deep neural network) to analyze the image of the surgical space including the target object and generate a set of candidate orientations. System 300 may determine, from the set of candidate orientations, an optimal orientation (e.g., an optimal class of orientations) for a robotic instrument to assume to interact with a target object. In certain alternative implementations, system 300 may use an unsupervised machine learning algorithm to perform any of the operations described herein. System 300 may select an orientation from the generated candidate orientations, may select a position for the robotic instrument to assume within the surgical space in relation to the target object, and may determine the pose that the robotic instrument is intended to assume based on the selected orientation and the selected position in any suitable manner, such as described herein.
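
The following sketch stands in for the supervised machine-learning component described above, using scikit-learn as an assumed library (the disclosure does not name one). Each training sample is a feature vector presumed to have been extracted from a labeled image of a surgical space; at run time the classifier proposes an orientation class for features extracted from a new image.

```python
# Minimal sketch of a supervised classifier standing in for the machine-learning
# component; the feature vectors and labels below are hypothetical placeholders.
from sklearn.linear_model import LogisticRegression

# Hypothetical pre-extracted image features and labeled instrument orientation classes.
X_train = [
    [0.1, 0.9, 0.2],
    [0.2, 0.8, 0.1],
    [0.9, 0.1, 0.7],
    [0.8, 0.2, 0.9],
]
y_train = ["candidate_1", "candidate_1", "candidate_4", "candidate_4"]

classifier = LogisticRegression(max_iter=1000)
classifier.fit(X_train, y_train)

# Operational input: features extracted from a new image of the surgical space.
new_image_features = [[0.15, 0.85, 0.2]]
print(classifier.predict(new_image_features))  # proposed orientation class
```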

After system 300 determines a pose for a robotic instrument to assume, system 300 may facilitate the robotic instrument assuming the pose. System 300 may facilitate the robotic instrument assuming the pose in any suitable manner. For example, in certain implementations, system 300 may facilitate the robotic instrument assuming the pose by generating a motion path for a robotic instrument to follow to assume a determined pose. System 300 may generate a motion path in any suitable manner. For example, system 300 may determine, in any suitable manner, a current pose of a robotic instrument. System 300 may generate a motion path that starts at the current pose of the robotic instrument and that extends in any suitable path within the surgical space to the determined pose to be assumed by the robotic instrument. In certain examples, system 300 may generate a plurality of motion paths for a robotic instrument to follow to assume the determined pose. System 300 may then select an optimal motion path included in the plurality of motion paths for the robotic instrument to follow.
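
A minimal sketch of motion-path generation is shown below, assuming a straight-line path sampled between the instrument's current position and the position of the determined pose; a practical system would also interpolate orientation (for example, with quaternion slerp) and would typically produce several candidate paths for later scoring.

```python
# Minimal sketch (positions only): sample a straight-line candidate motion path
# between the instrument's current position and the determined target position.
def generate_motion_path(current_position, target_position, num_samples=10):
    """Return a list of (x, y, z) waypoints from the current to the target position."""
    path = []
    for i in range(num_samples + 1):
        t = i / num_samples
        point = tuple(c + t * (g - c) for c, g in zip(current_position, target_position))
        path.append(point)
    return path

path = generate_motion_path((0.0, 0.0, 0.0), (10.0, 5.0, 2.0))
print(path[0], path[-1])  # start and end of the candidate path
```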

System 300 may leverage any suitable information associated with a surgical space to facilitate selecting an optimal motion path for a robotic instrument to follow. For example, system 300 may take into consideration a configuration of a computer-assisted surgical system, kinematic constraints of one or more manipulator arms of the computer-assisted surgical system, environmental constraints of a surgical space, and/or any other suitable information.

In certain examples, system 300 may select an optimal motion path based on a collision factor associated with a surgical space. A collision factor may represent any aspect associated with a surgical space that may affect how feasible it is for a robotic instrument to travel unimpeded along a candidate motion path. For example, a collision factor may include information associated with a position of anatomy with respect to the motion path, information associated with a position of another robotic instrument and/or another object with respect to the motion path, etc. System 300 may determine that a particular motion path would result in a robotic instrument undesirably contacting anatomy and/or another object (e.g., another robotic instrument). Accordingly, system 300 may determine that such a motion path is undesirable based on such collision factors.

Additionally or alternatively, system 300 may select an optimal motion path based on an economy of motion factor of a robotic instrument. An economy of motion factor may represent any aspect that defines how much a robotic instrument is to move in the surgical space to assume a given pose. For example, an economy of motion factor may include a distance that a robotic instrument is to travel to interact with a target object and/or an amount an orientation of a robotic instrument is to change to assume a particular orientation. For example, a first candidate motion path may result in the robotic instrument traversing a first distance across the surgical space and a second candidate motion path may result in the robotic instrument traversing a second distance across the surgical space. The first distance may be greater than the second distance. As such, system 300 may determine that the second candidate motion path is preferable to the first candidate motion path.

Additionally or alternatively, system 300 may select an optimal motion path based on a field of view factor of the surgical space. A field of view factor may be indicative of how much of a given motion path is viewable within a field of view of the surgical space at a given time. In such examples, a first candidate motion path that is fully within a current field of view of the surgical space may be favored over a second candidate motion path that requires a change of the field of view to view all or part of the second candidate motion path.
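
The three factors above (collision, economy of motion, and field of view) can be combined into a single cost when ranking candidate motion paths. The sketch below is one hypothetical weighting; the factor values and weights are illustrative and would in practice be derived from depth data, kinematics, and the current endoscope view.

```python
# Minimal sketch of ranking candidate motion paths by a weighted cost over the
# collision, economy-of-motion, and field-of-view factors; values are placeholders.
CANDIDATE_PATHS = [
    {"name": "path_a", "collision_risk": 0.1, "travel_distance": 12.0, "fraction_in_view": 1.0},
    {"name": "path_b", "collision_risk": 0.0, "travel_distance": 20.0, "fraction_in_view": 0.6},
]

def path_cost(path, w_collision=10.0, w_distance=0.5, w_view=5.0):
    """Lower cost is better: penalize collision risk, travel distance, and out-of-view travel."""
    return (
        w_collision * path["collision_risk"]
        + w_distance * path["travel_distance"]
        + w_view * (1.0 - path["fraction_in_view"])
    )

best = min(CANDIDATE_PATHS, key=path_cost)
print(best["name"])  # the candidate path selected as optimal under these weights
```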

In certain examples, system 300 may determine that there is an obstruction in a motion path. Such an obstruction may correspond to any object that may block a robotic instrument from following a motion path. For example, an obstruction may include anatomy, another robotic instrument, and/or any other object in the surgical space. System 300 may determine that there is an obstruction in any suitable manner. For example, system 300 may determine that there is an obstruction by analyzing a depth map of the surgical space, kinematics associated with one or more robotic instruments in the surgical space, and/or any other suitable information.

If system 300 determines that there is an obstruction in a motion path, system 300 may perform an operation to facilitate removal of the obstruction from the motion path. For example, system 300 may instruct a user (e.g., surgeon 110-1), in any suitable manner, to move a robotic instrument to a different location within the surgical space that does not obstruct the motion path. Alternatively, system 300 may instruct an additional user (e.g., assistant 110-2) to remove a robotic instrument from the surgical space.

In certain examples, system 300 may automatically perform an operation to remove an obstruction from a motion path. For example, system 300 may automatically reposition a robotic instrument within the surgical space such that the robotic instrument no longer obstructs a motion path.

In certain examples, system 300 may establish a no-fly zone within a surgical space. Such a no-fly zone may correspond to an area of the surgical space where a robotic instrument is not allowed to travel. System 300 may establish such a no-fly zone due to visibility restrictions, obstructions due to other robotic instruments, obstructions due to anatomy, and/or for any other suitable reason. In such examples, system 300 may take into consideration the no-fly zone when determining an optimal motion path for a robotic instrument to follow to assume a pose.
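
A no-fly zone check might look like the following sketch, which models the zone as a simple sphere and rejects any candidate path containing a waypoint inside it; real zones could be arbitrary volumes derived from anatomy, other instruments, or visibility limits.

```python
# Minimal sketch: a spherical no-fly zone and a check that flags candidate paths
# whose waypoints enter it. The zone geometry and values are illustrative.
import math

def in_no_fly_zone(point, zone_center, zone_radius):
    """True if a 3-D point lies inside the spherical no-fly zone."""
    return math.dist(point, zone_center) < zone_radius

def path_violates_zone(path, zone_center, zone_radius):
    """True if any waypoint on the candidate path enters the no-fly zone."""
    return any(in_no_fly_zone(p, zone_center, zone_radius) for p in path)

candidate_path = [(0.0, 0.0, 0.0), (5.0, 2.0, 1.0), (10.0, 5.0, 2.0)]
print(path_violates_zone(candidate_path, zone_center=(5.0, 2.5, 1.0), zone_radius=1.5))
```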

In implementations where system 300 generates a motion path for the robotic instrument to follow, system 300 may facilitate a robotic instrument automatically following the generated motion path to assume a pose. In such examples, system 300 may direct a computer-assisted surgical system (e.g., system 100) to automatically move the robotic instrument along the motion path without requiring input from the user.

In certain implementations, system 300 may facilitate the robotic instrument automatically following the motion path with various levels of autonomy. For example, in certain implementations, system 300 may direct a computer-assisted surgical system to which the robotic instrument is attached to automatically move the robotic instrument along the motion path and assume the determined pose. Alternatively, system 300 may direct the computer-assisted surgical system to which the robotic instrument is attached to automatically move the robotic instrument along the motion path to a vicinity of the determined pose. Once the robotic instrument is in the vicinity of the determined pose, a user (e.g., surgeon 110-1) may then assume manual control and fine positioning of the robotic instrument (e.g., by using master controls of user control system 104) to adjust the position and orientation of the robotic instrument such that the robotic instrument assumes the determined pose.

In examples where system 300 automatically controls a robotic instrument, system 300 may automatically cause the robotic instrument to assume an orientation associated with a determined pose at any suitable time. For example, system 300 may cause the robotic instrument to first assume the orientation associated with the determined pose and then automatically follow the motion path to a position associated with the determined pose. Alternatively, system 300 may automatically cause the robotic instrument to follow the motion path and then assume the orientation associated with the determined pose upon the robotic instrument reaching the position associated with the determined pose. Alternatively, system 300 may cause the robotic instrument to assume the orientation associated with the determined pose while the robotic instrument is following the motion path.

In certain alternative implementations, system 300 may be configured to facilitate a user (e.g., surgeon 110-1) of a computer-assisted surgical system to which the robotic instrument is attached moving the robotic instrument along the motion path. System 300 may facilitate the user moving the robotic instrument along the motion path in any suitable manner. For example, in certain implementations, system 300 may be configured to provide virtual guidance to facilitate a user moving a robotic instrument along a motion path. In certain examples, such virtual guidance may include system 300 providing haptic feedback guidance in any suitable manner, such as described herein, to facilitate a user moving a robotic instrument along a motion path.

Additionally or alternatively, system 300 may be configured to provide audible guidance to facilitate a user moving a robotic instrument along a motion path. Such audible guidance may be provided in any suitable manner. For example, as the user moves a robotic instrument so as to follow a motion path, system 300 may provide audible guidance in the form of a “beep” noise or any other suitable noise whenever the user deviates from the motion path by more than some predefined threshold amount.
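
The deviation test behind such audible guidance can be sketched as follows, with the instrument tip compared against the nearest sampled point of the suggested path and a stand-in "beep" emitted when the deviation exceeds an illustrative threshold.

```python
# Minimal sketch of the deviation check behind the audible guidance; the
# threshold value and printed "beep" are stand-ins for illustration only.
import math

def deviation_from_path(tip_position, path_points):
    """Distance from the instrument tip to the nearest sampled path point."""
    return min(math.dist(tip_position, p) for p in path_points)

def maybe_beep(tip_position, path_points, threshold_mm=5.0):
    """Emit an audible cue when the tip deviates from the path by more than the threshold."""
    if deviation_from_path(tip_position, path_points) > threshold_mm:
        print("beep")  # stand-in for an audible cue from the surgical system

path_points = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
maybe_beep((5.0, 8.0, 0.0), path_points)  # deviates by 8 mm -> beep
```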

In certain examples, system 300 may generate one or more waypoints along a motion path to facilitate visualization of the motion path. System 300 may generate any suitable number of waypoints along a motion path as may serve a particular implementation. In certain examples, such waypoints may be provided for display to a user instead of or as part of a graphical depiction of a motion path. Such waypoints may have any suitable size and/or shape (e.g., circle, square, triangle, etc.) as may serve a particular implementation.

In certain examples, waypoints such as those described herein may be provided by system 300 as part of a supervised autonomous movement of the robotic instrument along the motion path. For example, system 300 may facilitate a user confirming that a motion path is acceptable at each waypoint provided along a motion path as a computer-assisted surgical system autonomously moves the robotic instrument along the motion path. In so doing, system 300 may receive real-time confirmation from the user that a motion path is acceptable as a robotic instrument moves past each waypoint along the motion path. If there is a change in the surgical space that would affect the motion path (e.g., an obstruction is introduced after system 300 generates the motion path) as the robotic instrument moves along the motion path, system 300 may perform any suitable operation with respect to the change in the surgical space. In certain examples, system 300 may provide an augmented preview of a representation of the robotic instrument moving along the motion path to facilitate a user confirming that the motion path is acceptable.
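
One way to picture this supervised autonomous movement is the loop below, in which motion advances waypoint by waypoint and pauses whenever a confirmation callback (standing in for the user's input at the surgeon console) declines to continue. The callback and waypoint values are assumptions for illustration only.

```python
# Minimal sketch of supervised autonomous movement: pause at each waypoint until
# a confirmation callback approves continuing. Waypoints and callback are hypothetical.
def supervised_move(waypoints, confirm):
    """Advance waypoint by waypoint, stopping if the user declines to continue."""
    for i, waypoint in enumerate(waypoints):
        if not confirm(i, waypoint):
            print(f"paused before waypoint {i} at {waypoint}")
            return False
        print(f"moved to waypoint {i} at {waypoint}")
    return True

waypoints = [(0.0, 0.0, 0.0), (3.0, 1.0, 0.5), (6.0, 2.0, 1.0)]
# Example confirmation policy: approve every waypoint except the last.
supervised_move(waypoints, confirm=lambda i, w: i < 2)
```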

In certain examples, system 300 may be configured to provide a notification to a user when a robotic instrument assumes a determined pose. Such a notification may be provided in any suitable manner. For example, system 300 may be configured to provide a visual notification, an audible notification, and/or a haptic feedback notification to a user when a robotic instrument assumes the determined pose. To illustrate an example, a user (e.g., surgeon 110-1) may control a robotic instrument so as to follow the motion path represented by a graphical depiction. At any suitable time during or after the movement of the robotic instrument to a position of the representation, the user may rotate the robotic instrument so that the robotic instrument assumes the orientation associated with the representation. When the position and orientation of the robotic instrument match or are within some predefined threshold of the position and orientation of the representation, system 300 may provide, for example, an audio tone, a change in the visual appearance (e.g., a change in color, pattern, etc.) of the representation, and/or haptic feedback in the form of vibration through the master controls of user control system 104 to inform the user that the robotic instrument has assumed the pose and is ready to interact with (e.g., grasp) a non-robotic device.
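
The pose-match test that triggers such a notification can be sketched as a simple threshold check on position and orientation error, as below; the tolerance values and the single-axis angle comparison are illustrative assumptions.

```python
# Minimal sketch of a pose-match check: notify when both position and orientation
# errors fall within illustrative tolerances (single yaw angle used for simplicity).
import math

def pose_matches(instr_pos, instr_yaw_deg, target_pos, target_yaw_deg,
                 pos_tol_mm=2.0, angle_tol_deg=5.0):
    """True if the instrument pose is within tolerance of the determined pose."""
    pos_err = math.dist(instr_pos, target_pos)
    angle_err = abs((instr_yaw_deg - target_yaw_deg + 180.0) % 360.0 - 180.0)
    return pos_err <= pos_tol_mm and angle_err <= angle_tol_deg

if pose_matches((10.1, 4.9, 2.0), 44.0, (10.0, 5.0, 2.0), 45.0):
    print("notify user: instrument has assumed the determined pose")
```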

In certain examples, a target object such as a non-robotic device may include a protrusion provided on an outer surface thereof to facilitate a robotic instrument grasping the target object. In examples where a protrusion is provided on an outer surface of a non-robotic device, system 300 may take into consideration a pose of the protrusion when determining the pose that a robotic instrument will assume to interact with the non-robotic device. Any suitable number of protrusions may be provided on an outer surface of a non-robotic device as may serve a particular implementation. For example, in certain implementations, two or more protrusions may be provided on an outer surface of a non-robotic device. In such examples, a robotic instrument may be configured to grasp any one of the multiple protrusions to facilitate moving the non-robotic device in the surgical space. In addition, a protrusion of a non-robotic device may have any suitable size and/or configuration to facilitate a robotic instrument attached to a computer-assisted surgical system grasping the non-robotic device.

In some examples, a non-transitory computer-readable medium storing computer-readable instructions may be provided in accordance with the principles described herein. The instructions, when executed by a processor of a computing device, may direct the processor and/or computing device to perform one or more operations, including one or more of the operations described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.

A non-transitory computer-readable medium as referred to herein may include any non-transitory storage medium that participates in providing data (e.g., instructions) that may be read and/or executed by a computing device (e.g., by a processor of a computing device). For example, a non-transitory computer-readable medium may include, but is not limited to, any combination of non-volatile storage media and/or volatile storage media. Exemplary non-volatile storage media include, but are not limited to, read-only memory, flash memory, a solid-state drive, a magnetic storage device (e.g., a hard disk, a floppy disk, magnetic tape, etc.), ferroelectric random-access memory (“RAM”), and an optical disc (e.g., a compact disc, a digital video disc, a Blu-ray disc, etc.). Exemplary volatile storage media include, but are not limited to, RAM (e.g., dynamic RAM).

FIG. 13 illustrates an exemplary computing device 1300 that may be specifically configured to perform one or more of the processes described herein. As shown in FIG. 13, computing device 1300 may include a communication interface 1302, a processor 1304, a storage device 1306, and an input/output (“I/O”) module 1308 communicatively connected one to another via a communication infrastructure 1310. While an exemplary computing device 1300 is shown in FIG. 13, the components illustrated in FIG. 13 are not intended to be limiting. Additional or alternative components may be used in other embodiments. Components of computing device 1300 shown in FIG. 13 will now be described in additional detail.

Communication interface 1302 may be configured to communicate with one or more computing devices. Examples of communication interface 1302 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, an audio/video connection, and any other suitable interface.

Processor 1304 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 1304 may perform operations by executing computer-executable instructions 1312 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 1306.

Storage device 1306 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device. For example, storage device 1306 may include, but is not limited to, any combination of the non-volatile media and/or volatile media described herein. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 1306. For example, data representative of computer-executable instructions 1312 configured to direct processor 1304 to perform any of the operations described herein may be stored within storage device 1306. In some examples, data may be arranged in one or more databases residing within storage device 1306.

I/O module 1308 may include one or more I/O modules configured to receive user input and provide user output. One or more I/O modules may be used to receive input for a single virtual experience. I/O module 1308 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 1308 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.

I/O module 1308 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 1308 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.

In some examples, any of the systems, computing devices, and/or other components described herein may be implemented by computing device 1300. For example, storage facility 304 may be implemented by storage device 1306, and processing facility 302 may be implemented by processor 1304.

In the preceding description, various exemplary embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.

Claims

1. A system comprising:

a memory storing instructions; and
a processor communicatively coupled to the memory and configured to execute the instructions to: obtain one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device, the guidance content including an instruction configured to be presented to a user of the computer-assisted surgical system to guide the user in moving the non-robotic device in the surgical space; and provide the guidance content to the computer-assisted surgical system.

2. The system of claim 1, wherein:

the non-robotic device is engaged by a robotic instrument attached to the computer-assisted surgical system; and
the generating of the guidance content is based on at least one of a procedural context associated with the surgical space, an identification of the non-robotic device, or an identification of the robotic instrument.

3. The system of claim 1, wherein:

the guidance content is configured to be provided by way of a user interface of the computer-assisted surgical system to facilitate the guided teleoperation of the non-robotic device; and
the guidance content further includes at least one of visual guidance, audible guidance, or haptic feedback guidance to facilitate a user of the computer-assisted surgical system teleoperating the non-robotic device in the surgical space.

4. The system of claim 1, wherein:

the non-robotic device is a non-robotic imaging device configured to capture imagery of an object in the surgical space; and
the guidance content is configured to facilitate teleoperation of the non-robotic imaging device with respect to the object in the surgical space.

5. The system of claim 4, wherein:

the non-robotic imaging device is configured to contact a surface of the object to capture the imagery of the object; and
the guidance content indicates a state of contact of the non-robotic imaging device with the surface of the object.

6. The system of claim 5, wherein the generating of the guidance content includes:

detecting a change in image content in an image captured by the non-robotic imaging device; and
determining that the non-robotic imaging device is in contact with the object based on the detected change in image content in the image.

7. The system of claim 1, wherein:

the non-robotic device is configured to contact a surface of an object in the surgical space; and
the guidance content indicates at least one of a contact pressure or a contact angle of the non-robotic device with respect to the surface of the object.

8. The system of claim 1, wherein:

the non-robotic device is engaged by a robotic instrument attached to the computer-assisted surgical system;
the generating of the guidance content includes generating a suggested path for the non-robotic device to follow in the surgical space while the non-robotic device is engaged by the robotic instrument; and
the guidance content indicates the suggested path.

9. The system of claim 8, wherein the generating of the suggested path includes:

facilitating a user of the computer-assisted surgical system defining a first virtual pointer indicative of a start position of the suggested path and a second virtual pointer indicative of a stop point of the suggested path; and
generating the suggested path for the non-robotic device to follow based on the first virtual pointer and the second virtual pointer.

10. The system of claim 9, wherein:

the guidance content includes the first virtual pointer and the second virtual pointer provided for display by way of a display device associated with the computer-assisted surgical system such that the first virtual pointer and the second virtual pointer are virtually positioned on a surface of an object in the surgical space; and
the suggested path is a curved path that follows the surface of the object.

11. The system of claim 8, wherein the generating of the suggested path includes automatically generating the suggested path based on a procedural context associated with the surgical space.

12. The system of claim 8, wherein the guidance content further includes a graphical depiction of the suggested path provided for display by way of a display device associated with the computer-assisted surgical system.

13. The system of claim 12, wherein the graphical depiction of the suggested path includes a simulation of the non-robotic device moving along the suggested path.

14. The system of claim 12, wherein the guidance content further includes a notification provided by the computer-assisted surgical system to a user of the computer-assisted surgical system, the notification requesting user confirmation that the suggested path indicated by the graphical depiction is acceptable.

15. A system comprising:

a memory storing instructions; and
a processor communicatively coupled to the memory and configured to execute the instructions to: obtain one or more parameters of an object in a surgical space; obtain one or more parameters of a non-robotic imaging device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system; generate, based on the one or more parameters of the object and the one or more parameters of the non-robotic imaging device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic imaging device to capture imagery of the object in the surgical space, the guidance content including an instruction configured to be presented to a user of the computer-assisted surgical system to guide the user in moving the non-robotic device in the surgical space; and provide the guidance content to the computer-assisted surgical system.

16. A method comprising:

obtaining, by a processor associated with a computer-assisted surgical system, one or more parameters of a non-robotic device in a surgical space, the non-robotic device engaged by a computer-assisted surgical system;
generating, by the processor and based on at least the one or more parameters of the non-robotic device, guidance content for use by the computer-assisted surgical system to facilitate guided teleoperation of the non-robotic device, the guidance content including an instruction configured to be presented to a user of the computer-assisted surgical system to guide the user in moving the non-robotic device in the surgical space; and
providing, by the processor, the guidance content to the computer-assisted surgical system.

17. The method of claim 16, wherein:

the non-robotic device is a non-robotic imaging device configured to capture imagery of an object in the surgical space; and
the guidance content is configured to facilitate a user of the computer-assisted surgical system maintaining at least one of a position or an orientation of the non-robotic imaging device with respect to the object as the non-robotic imaging device moves along a surface of the object.

18. The method of claim 17, wherein:

the generating of the guidance content includes: analyzing an image captured by the non-robotic imaging device while the non-robotic imaging device is used to capture the imagery of the object; and determining, based on the analyzing of the image captured by the non-robotic imaging device, that the image includes an image capture deficiency; and
the method further includes performing, by the processor in response to the determining that the image includes the image capture deficiency, an action to facilitate correcting the image capture deficiency.

19. The method of claim 18, wherein the performing of the action to facilitate correcting the image capture deficiency includes automatically adjusting at least one of an amount of contact pressure or a contact angle of the non-robotic imaging device with respect to the object while the user of the computer-assisted surgical system uses the non-robotic imaging device to capture the imagery of the object.

20. The method of claim 16, wherein:

the non-robotic device is engaged by a robotic instrument attached to the computer-assisted surgical system;
the generating of the guidance content includes generating a suggested path for the non-robotic device to follow in the surgical space while the non-robotic device is engaged by the robotic instrument; and
the guidance content indicates the suggested path.
Patent History
Publication number: 20230112592
Type: Application
Filed: Dec 14, 2020
Publication Date: Apr 13, 2023
Inventors: Azad Shademan (Campbell, CA), Mahdi Azizian (San Jose, CA), Wen Pei Liu (San Jose, CA), A. Jonathan McLeod (Sunnyvale, CA), Daniel Proksch (San Jose, CA), Pourya Shirazian (Menlo Park, CA)
Application Number: 17/785,265
Classifications
International Classification: A61B 34/35 (20060101); A61B 34/00 (20060101); A61B 90/00 (20060101);