Tool Having Multiple Feedback Devices
Disclosed herein are systems and methods for managing how signals, which are sensed on a tool, are presented to an operator of the tool. According to one example of a method for managing signals, the method comprises allowing an operator to manipulate a tool having a plurality of output mechanisms mounted thereon. The method also includes sensing a property of an object located near or adjacent to a distal portion of the tool and processing the sensed property to obtain one or more output signals. Furthermore, the method includes applying the one or more output signals to one or more of the output mechanisms.
The embodiments of the present disclosure generally relate to hand tools and more particularly relate to sensing a property of an object in the vicinity of a hand tool and displaying the property in one or more output modes.
BACKGROUND

As opposed to open surgery in which a surgeon cuts a relatively large incision in the skin of a patient for accessing internal organs, minimally invasive surgical procedures are performed by making relatively small incisions and then inserting tools through the incisions to access the organs. Minimally invasive surgery usually results in shorter hospitalization times, reduced therapy requirements, less pain, less scarring, and fewer complications.
During minimally invasive surgery, the surgeon can introduce a miniature camera through an incision. The camera transmits images to a visual display, allowing the surgeon to see the internal organs and tissues and to see the effect of other minimally invasive tools on the organs and tissues. In this way, the surgeon is able to perform laparoscopic surgery, dissection, cauterization, endoscopy, telesurgery, etc. Compared to open surgery, however, minimally invasive surgery can present limitations regarding the surgeon's ability to see and feel the patient's organs and tissues.
SUMMARY

The present disclosure describes a number of embodiments of systems and methods for managing how sensed signals are provided to an operator of a tool, which includes a sensor for sensing the signals. In one embodiment, for example, a method for managing signals includes allowing an operator to manipulate a tool having a plurality of output mechanisms mounted thereon. The method also includes sensing a property of an object located near or adjacent to a distal portion of the tool and processing the sensed property to obtain one or more output signals. The method further includes applying the one or more output signals to one or more of the output mechanisms.
The embodiments described in the present disclosure may include additional features and advantages, which may not necessarily be expressly disclosed herein but will be apparent to one of ordinary skill in the art upon examination of the following detailed description and accompanying drawings. It is intended that these additional features and advantages be included and encompassed within the present disclosure.
The components of the following figures are illustrated to emphasize the general principles of the present disclosure and are not necessarily drawn to scale. Reference characters designating corresponding components are repeated as necessary throughout the figures for the sake of consistency and clarity.
Although minimally invasive surgical procedures involving small incisions offer many advantages over open surgery, minimally invasive surgery can still present challenges to a surgeon. For example, the surgeon must typically rely on a camera to view the patient's internal organs and to see how the movement and operation of the tools affect the organs. To enhance the surgeon's experience, feedback can be provided to the surgeon to communicate information about how the body of the patient reacts to the tools. According to the teachings of the present disclosure, output can be provided to the surgeon in multiple ways, e.g., visually, audibly, tactilely, etc. Information regarding the status of the feedback devices can also be provided to the surgeon.
The present disclosure describes embodiments that encompass any type of tool that can be manipulated by an operator. More particularly, the tools described in the present disclosure include a handle portion that mechanically controls a distal portion of the tool. Mounted on the distal portion are one or more sensors configured to sense a parameter of an object that interacts with the tool. The sensed signals can be processed to obtain stimulation signals to be provided to one or more output mechanisms, such as, for example, haptic actuation devices, vibrotactile feedback devices, kinesthetic feedback devices, visual displays, speakers or other audio devices, etc.
Although many of the examples described in the embodiments of the present disclosure relate to surgical tools, such as minimally invasive surgical tools, it should be understood that the present disclosure also encompasses other types of tools. In addition, although many of the examples herein relate to surgical patients and how the organs and tissues of the patient interact with the surgical tools, it should also be understood that the present disclosure may also refer to other objects that are intended to interact with or react to the operations of the respective tools. Other features and advantages will be apparent to one of ordinary skill in the art upon reading and understanding the general principles of the present disclosure and are intended to be included herein.
Handle portion 12 also includes a processing device 24, a haptic actuator 26, and a speaker 28, which are shown through a cut-out in handle portion 12. Processing device 24, haptic actuator 26, and speaker 28 can be mounted on a printed circuit board (not shown), which includes circuitry for electrically coupling the components. The circuitry on the printed circuit board may include any suitable amplification and attenuation circuitry, power sources for providing and regulating power to each component, and other circuitry for proper operation of processing device 24 and the other components, as would be understood by one of ordinary skill in the art. In some embodiments, haptic actuator 26 can be implemented as an inertial actuator to provide vibrotactile feedback to the operator. Haptic actuator 26 may include an eccentric rotating mass ("ERM") actuator, a linear resonant actuator ("LRA"), a piezoelectric actuator, a shape memory alloy, an electro-active polymer, or other suitable type of actuating device.
The different output mechanisms may be used to cooperatively communicate a single property. In some embodiments, however, they can represent distinct components of the processed sensor signals or other parameters. In some cases, one or more of the output mechanisms may be configured to provide both sensor signal information and adjustment level information, depending on the particular application.
Furthermore, buttons 22 can be used to adjust the level, intensity, or amplitude at which the output mechanisms provide their respective outputs. For example, the operator may wish to disable haptic actuator 26 but enable speaker 28 with its volume turned down low.
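Purely for illustration, and not as part of the disclosed embodiments, the following sketch shows one way a sensed value might be routed to several output mechanisms subject to per-mechanism enable and level settings; the names, data structures, and simple linear scaling are hypothetical assumptions.

```python
# Hypothetical sketch (not from the disclosure): routing one sensed value to
# several output mechanisms, each with an operator-controlled enable flag and level.
from dataclasses import dataclass


@dataclass
class OutputSetting:
    enabled: bool = True
    level: float = 1.0  # 0.0 (off) .. 1.0 (full intensity/volume/brightness)


def route_to_outputs(sensed_value: float, settings: dict) -> dict:
    """Return the normalized drive signal for each output mechanism."""
    return {
        name: (sensed_value * s.level if s.enabled else 0.0)
        for name, s in settings.items()
    }


# Example matching the text: haptic actuator disabled, speaker enabled at low volume.
settings = {
    "haptic": OutputSetting(enabled=False),
    "speaker": OutputSetting(enabled=True, level=0.2),
    "display": OutputSetting(enabled=True, level=1.0),
}
print(route_to_outputs(0.8, settings))  # {'haptic': 0.0, 'speaker': ~0.16, 'display': 0.8}
```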
By manipulating handle portion 12, an operator can insert distal portion 16 into the abdomen of the patient and control tip 18 of distal portion 16. When distal portion 16 is inserted, the surgeon can further manipulate handle portion 12 to control the location and orientation of tip 18 such that sensing device 20 is able to contact certain regions of the patient. Sensing device 20 may include one or more sensors, each configured to measure or test any desired parameter of the patient, such as pulse, for example. In some embodiments in which sensing device 20 does not need to contact a particular region of the patient, tip 18 can be controlled to position sensing device 20 to accomplish certain contactless sensing functions.
Sensing device 20 can be configured to sense any suitable property of the object under test. For instance, sensing device 20 can be configured as a pressure sensor using resistive or capacitive pressure sensing technologies. Alternatively, sensing device 20 can include strain gauges, piezoelectric sensors, stiffness sensors, etc. As a strain gauge, sensing device 20 can provide additional information about contact force to finely tune a generally coarse measurement of force. As a piezoelectric sensor, sensing device 20 can generate ultrasound signals that reflect off portions of the object. In this case, echo signals can be detected by sensing device 20 to determine the location of objects. Sensing device 20 can also be configured as a stiffness sensor that can detect nodules, e.g., tumors, or other stiff regions.
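For illustration only, the following sketch shows the standard pulse-echo time-of-flight relationship by which an echo delay detected by an ultrasound sensor can be converted to a distance; the function name and the assumed speed of sound in soft tissue are illustrative assumptions, not details taken from the disclosure.

```python
# Hypothetical sketch (not from the disclosure): converting a pulse-echo delay
# into an estimated distance. The pulse travels to the reflector and back,
# so distance = speed * round_trip_time / 2.
SPEED_OF_SOUND_SOFT_TISSUE_M_PER_S = 1540.0  # commonly assumed average value


def echo_distance_m(round_trip_time_s: float,
                    speed_m_per_s: float = SPEED_OF_SOUND_SOFT_TISSUE_M_PER_S) -> float:
    """Estimate the distance to a reflecting structure from the echo delay."""
    return speed_m_per_s * round_trip_time_s / 2.0


# A 26-microsecond echo delay corresponds to roughly 2 cm in soft tissue.
print(echo_distance_m(26e-6))  # ~0.02 m
```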
Level meter 46 may be an interactive control surface allowing the user to adjust the properties of the output mechanisms. Level meter 46 can receive touch input from the user to make adjustments. Also, level meter 46 can display status information for one or more output mechanisms mounted on surgical tool 40. In this regard, the status information can include whether a respective output mechanism is enabled or disabled, the level of intensity, strength, or magnitude of the signal supplied to the respective output mechanism, or other parameters of the respective output mechanism. For visual display devices, for example, certain picture quality parameters can be adjusted. For audio devices, frequency, fade time, and other auditory parameters can be adjusted. For haptic actuating devices, frequency, intensity, and other haptic parameters can be adjusted.
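Purely as an illustration of how such per-mechanism parameters could be represented and adjusted, the following sketch uses hypothetical parameter names, ranges, and values that are not taken from the disclosure.

```python
# Hypothetical sketch (not from the disclosure): parameter tables of the kind a
# level meter might display and allow the operator to adjust.
OUTPUT_PARAMETERS = {
    "visual": {"brightness": 0.8, "contrast": 0.5},
    "audio": {"volume": 0.3, "fade_time_s": 0.5},
    "haptic": {"frequency_hz": 175.0, "intensity": 0.6},
}


def adjust_parameter(mechanism: str, parameter: str, delta: float,
                     lo: float = 0.0, hi: float = 1.0) -> float:
    """Nudge a parameter by delta, clamp it to [lo, hi], and return the new value."""
    value = OUTPUT_PARAMETERS[mechanism][parameter] + delta
    value = max(lo, min(hi, value))
    OUTPUT_PARAMETERS[mechanism][parameter] = value
    return value


print(adjust_parameter("haptic", "intensity", +0.2))  # 0.8
```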
LCD screen 56 can include a touchscreen, which can be configured to present information, e.g., visual information, to the operator. The touchscreen can also be configured to sense when the operator presses certain portions of the touchscreen. In this way, the touchscreen can act as a touchable user interface with graphical presentation capabilities. In some embodiments, LCD screen 56 may be designed with a surface that changes shape or size based on the signals being sensed. For example, the surface of LCD screen 56 may be able to adjust its topology to provide an indication of the topology or other features of the object being sensed.
LCD screen 56 can also display supplemental information, which may depend on the context of the particular surgical procedure being performed. The information may include, for example, pre-operative information, intra-operative information, radiology information, etc. LCD screen 56 may include a graphical user interface that enables the surgeon to select different feedback profiles, adjust sensor behavior, modify supplemental information, etc.
LCD screen 56 may also include a control slider device 58, which can be designed to allow the operator to make adjustments to the image or contour information provided on LCD screen 56. In other embodiments, control slider device 58 may instead be configured using other types of control devices, such as buttons, switches, etc., for controlling image information.
Display device 66 can provide an indication of particular parameters by the use of bar indicators 68, where the length of the bar of each respective bar indicator 68 represents an intensity or level of the respective parameter. Bar indicators 68 of display device 66 can be configured to provide output information both visually and haptically. For example, since display device 66 is positioned on the side of handle portion 62, the operator may be able to look at the side of surgical tool 60 during use to see display device 66. In addition, the index finger of the operator can be placed on the surface of display device 66 to haptically sense display device 66. In this respect, each bar indicator 68 can include any suitable haptic output device, shape changing device, etc., to communicate the particular information to the operator using the sense of touch. Therefore, the operator can feel the length of bar indicators 68 to receive feedback as needed, even without visually observing display device 66.
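Purely for illustration, the following sketch shows one way a normalized parameter level might be mapped to the length of a bar indicator, both in display pixels for the visual channel and in raised segments for a tactile bar; the resolutions chosen are hypothetical and not taken from the disclosure.

```python
# Hypothetical sketch (not from the disclosure): scaling a normalized level to a
# visual bar length (pixels) and a tactile bar length (raised segments).
def bar_lengths(level: float, max_pixels: int = 120, max_segments: int = 8) -> tuple:
    """Clamp the level to [0, 1] and scale it to both output resolutions."""
    level = max(0.0, min(1.0, level))
    return round(level * max_pixels), round(level * max_segments)


pixels, segments = bar_lengths(0.75)
print(pixels, segments)  # 90 6
```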
In this embodiment, vibrotactile, haptic, kinesthetic, and/or resistive feedback can be incorporated into rotary device 76 to indicate certain information directly through rotary device 76 itself. According to one example, a vibrotactile actuator may vibrate rotary device 76 at an amplitude based on the stiffness of the region of the patient being sensed. In another example, a sensor can be mounted at the distal end on a face perpendicular to the shaft axis and oriented toward the side. Using the roll control, the sensor sweeps around the axis to observe the surrounding tissue and identify vasculature, tumor masses, etc., using a visual sensor, a stiffness sensor, or other suitable sensors.
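As an illustration of the first example above (a vibration amplitude based on sensed stiffness), the following sketch uses a simple clamped linear mapping; the stiffness range and amplitude limits are arbitrary assumptions, not values from the disclosure.

```python
# Hypothetical sketch (not from the disclosure): mapping sensed tissue stiffness
# to a normalized vibrotactile drive amplitude with a clamped linear interpolation.
def vibration_amplitude(stiffness: float,
                        stiff_min: float = 0.0, stiff_max: float = 100.0,
                        amp_min: float = 0.0, amp_max: float = 1.0) -> float:
    """Interpolate stiffness (arbitrary units) to a drive amplitude in [amp_min, amp_max]."""
    t = (stiffness - stiff_min) / (stiff_max - stiff_min)
    t = max(0.0, min(1.0, t))  # clamp readings outside the calibrated range
    return amp_min + t * (amp_max - amp_min)


print(vibration_amplitude(65.0))  # 0.65: stiffer tissue -> stronger vibration
```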
Haptic actuating devices 86 may be able to sense pressure applied by the user. In response to the pressure, haptic actuating devices 86 may be configured to provide an alert that the user may be squeezing handle portion 82 too hard or too softly. Haptic actuating devices 86 may also be configured to communicate sensed information to the user.
The sensed information from each of the one or more sensors 96 is communicated to processing device 98, which is configured to process the information according to specific algorithms and operator selections. Processing device 98 may, for example, correspond to processing device 24 described above.
Processing device 98 may be a general-purpose or specific-purpose processor or microcontroller for processing the signals detected by sensor 96. In some embodiments, processing device 98 may include a plurality of processors for performing different functions with respect to system 94. In some embodiments, processing device 98 may be associated with a memory device (not shown) for storing data and/or instructions. In this regard, the memory may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units, and the various storage units may include any combination of volatile memory and non-volatile memory. Logical instructions, commands, and/or code can be implemented in software, firmware, or both, and stored in memory. In this respect, the logic code may be implemented as one or more computer programs that can be executed by processing device 98.
In other embodiments, logical instructions, commands, and/or code can be implemented in hardware and incorporated in processing device 98 using discrete logic circuitry, an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc., or any combination thereof. In yet other embodiments, logical instructions, commands, and/or code can be implemented in both hardware in processing device 98 and software/firmware stored in the memory.
Each output device represented by haptic actuator 100, visual display 102, and audio device 104 may include one or more of the respective output devices in any suitable form for providing haptic, visual, or audio outputs to the operator. Also, some output devices may be capable of providing multiple modes of communication in any combination. The output mechanisms may include any number of feedback mechanisms in any number of modes for providing any type of visual, audible, and/or tactile output to the operator. In the embodiments regarding surgical tools, the output mechanisms may be set up to provide feedback according to the surgeon's preferences. With feedback tailored in this way, the tool can supplement the operator's experience for better operation and performance.
As indicated in block 112, one or more properties of an object are sensed at a distal end of the tool. In particular, the property or properties may be sensed by one or more sensing devices. The sensors may be positioned on or near the controlled portion of the tool. According to block 114, the sensed properties are processed to obtain output signals. Based on the properties sensed and the types of output mechanisms incorporated in the tool, output signals can be obtained for each particular output mechanism.
As indicated in block 116, operator-selected settings are retrieved. In particular, the settings define how the output mechanisms are to be used in response to the sensed signals. For example, the operator-selected settings may include whether each respective output mechanism is enabled or disabled, or turned on or off. Also, the settings may include level adjustments for factors associated with the different modes of communication, such as a haptic mode, visual mode, auditory mode, etc.
As indicated in block 118, the output signals are provided to the one or more output mechanisms. Thus, the sensed signals are communicated to the operator in one or more different output modes, depending on the modes selected by the operator. The outputs may be haptic outputs, vibrotactile effect outputs, visual outputs, auditory outputs, or any combination of these or other outputs.
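For illustration only, the following sketch strings the steps of blocks 112 through 118 together in one pass; the stand-in sensor, settings, and output functions are hypothetical placeholders rather than device drivers or code from the disclosure.

```python
# Hypothetical sketch (not from the disclosure): one pass through blocks 112-118.
def sense_property() -> dict:
    """Block 112: read the sensing device(s) at the distal end (stubbed value here)."""
    return {"stiffness": 0.7}


def process(sensed: dict) -> dict:
    """Block 114: derive a per-mechanism output signal from the sensed property."""
    s = sensed["stiffness"]
    return {"haptic": s, "visual": s, "audio": s}


def retrieve_settings() -> dict:
    """Block 116: operator-selected enable flags and levels for each mechanism."""
    return {"haptic": (True, 1.0), "visual": (True, 0.8), "audio": (False, 0.0)}


def apply_outputs(signals: dict, settings: dict) -> None:
    """Block 118: drive only the enabled output mechanisms at the selected levels."""
    for mechanism, value in signals.items():
        enabled, level = settings[mechanism]
        if enabled:
            print(f"{mechanism}: drive at {value * level:.2f}")


apply_outputs(process(sense_property()), retrieve_settings())
```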
It should be understood that the routines, steps, processes, or operations described herein may represent any module or code sequence that can be implemented in software or firmware. In this regard, these modules and code sequences can include commands or instructions for executing the specific logical routines, steps, processes, or operations within physical components. It should further be understood that two or more of the routines, steps, processes, and/or operations described herein may be executed substantially simultaneously or in a different order than explicitly described, as would be understood by one of ordinary skill in the art.
The embodiments described herein represent a number of possible implementations and examples and are not intended necessarily to limit the present disclosure to any specific embodiments. Instead, various modifications can be made to these embodiments as would be understood by one of ordinary skill in the art. Any such modifications are intended to be included within the spirit and scope of the present disclosure and protected by the following claims.
Claims
1. A tool comprising:
- a handle portion and a distal portion;
- a sensor mounted on the distal portion, the sensor configured to generate sensor signals representing one or more characteristics of an object;
- a processing device configured to process the sensor signals to generate one or more output signals; and
- an output system mounted on the handle portion, the output system including one or more output mechanisms selected from the group consisting of a haptic device, a visual device, and an audio device;
- wherein the processing device is further configured to apply the one or more output signals to the one or more output mechanisms.
2. The tool of claim 1, further comprising a shaft connecting the handle portion with the distal portion.
3. The tool of claim 2, further comprising a rotary device mounted on the handle portion, the rotary device configured to rotate the sensor about an axis formed by the shaft.
4. The tool of claim 1, further comprising a set of switches.
5. The tool of claim 4, wherein one or more of the switches is configured to enable an operator to select which ones of the output mechanisms are enabled.
6. The tool of claim 5, wherein one or more of the switches is configured to enable the operator to adjust output parameters of the enabled output mechanisms.
7. The tool of claim 6, further comprising a level meter mounted on the handle portion, the level meter being configured to indicate how the parameters of the enabled output mechanisms are adjusted.
8. The tool of claim 1, wherein one of the output mechanisms is a visual device, the visual device comprising a liquid crystal display (LCD) screen.
9. The tool of claim 1, wherein one of the output mechanisms includes a device configured to provide both visual and haptic outputs.
10. The tool of claim 1, wherein the tool is a surgical tool and the object is a patient.
11. The tool of claim 10, wherein the surgical tool is a laparoscopic tool.
12. A method comprising:
- allowing an operator to manipulate a tool having a plurality of output mechanisms mounted thereon;
- sensing a property of an object located near or adjacent to a distal portion of the tool;
- processing the sensed property to obtain one or more output signals; and
- applying the one or more output signals to one or more of the output mechanisms.
13. The method of claim 12, wherein the method further comprises allowing the operator to enable or disable each of the one or more output mechanisms.
14. The method of claim 12, wherein the method further comprises allowing the operator to adjust parameters associated with the operation of the output mechanisms.
15. The method of claim 12, wherein applying the one or more output signals further comprises applying at least one visual signal to a video display device, applying at least one audio signal to an audio device, and applying at least one haptic signal to a haptic actuator.
16. The method of claim 12, wherein applying the one or more output signals further comprises applying output signals related to both visual and haptic parameters to a single output mechanism configured to provide both visual and haptic outputs.
17. A surgical tool comprising:
- means for obtaining a sensory signal indicative of a characteristic of a patient under observation, the obtaining means configured to obtain the sensory signal at a distal end of the surgical tool;
- means for processing the characteristic to generate one or more output signals; and
- means for providing the one or more output signals to one or more output devices configured to provide feedback information in multiple modes, wherein the multiple modes include at least a visual mode and a haptic mode.
18. The surgical tool of claim 17, wherein the means for obtaining the sensory signal is further configured to obtain multiple sensory signals, a first set of sensory signals comprising visual sensory signals and a second set of sensory signals comprising haptic sensory signals.
19. The surgical tool of claim 17, wherein the means for providing is further configured to provide the output signals in at least a visual mode, a haptic mode, and an auditory mode.
20. The surgical tool of claim 19, wherein the one or more output devices include at least one visual display device, at least one haptic actuator, and at least one audio device.
Type: Application
Filed: Dec 3, 2008
Publication Date: Jun 3, 2010
Applicant: Immersion Corporation (San Jose, CA)
Inventors: Christophe Ramstein (San Francisco, CA), Christopher J. Ullrich (Santa Cruz, CA), Danny A. Grant (Laval)
Application Number: 12/327,104