User interface devices for electrophysiology lab diagnostic and therapeutic equipment
In an electrophysiology (EP) lab, a bedside interface device allows an EP physician to directly control various diagnostic and therapeutic systems, including an electro-anatomic mapping system. The bedside interface device can include a computer with wireless communication capability as well as a touch-responsive display panel and voice recognition. The bedside interface device can also be a hand-graspable wireless remote control device that is configured to detect motions or gestures made with the remote control by the physician, allowing the physician to directly interact with the mapping system. The bedside interface device can also be a motion capture camera configured to determine motion patterns of the physician's arms, legs, trunk, face and the like, which are defined in advance to correspond to commands for the mapping system. The bedside interface device may also include voice recognition capabilities to allow a physician to directly issue verbal commands to the mapping system.
This application is a continuation patent application of U.S. patent application Ser. No. 13/208,924 (the '924 application), filed 12 Aug. 2011. The '924 application is hereby incorporated by reference in its entirety as though fully set forth herein.
BACKGROUND OF THE INVENTION
a. Field of the Invention
The instant disclosure relates generally to electrophysiology lab integration, and more particularly to user interfaces and devices therefor for electrophysiology lab diagnostic and therapeutic equipment.
b. Background Art
It is known to provide an electrophysiology lab in a medical facility. Such a lab may have use of a wide variety of diagnostic and therapeutic equipment useful in rendering medical service to a patient, such as imaging systems (e.g., fluoroscopy, intracardiac echocardiography, etc.), an electro-anatomic visualization, mapping and navigation system, ablation energy sources (e.g., a radio frequency (RF) ablation generator), a recording system (e.g., for ECG, cardiac signals, etc.), a cardiac stimulator and the like.
In conventional practice, an electrophysiology (EP) physician 16 is scrubbed into a sterile procedure and typically manipulates one or more catheters (not shown) in the sterile-draped body of the patient 18. The physician's sterile gloved hands are typically engaged with the catheter handle and shaft next to the patient, and he or she is therefore unable to directly make changes to any of the EP systems. The procedure room 10 typically includes one or more monitors (e.g., an integrated multi-display monitor 20 is shown) arranged so that the physician 16 can see the monitor 20, on which is displayed various patient information being produced by the diagnostic and therapeutic equipment mentioned above.
For example, the EP physician 16 can verbally communicate (i.e., to the control technician—a mapping system operator) the desired view of the map to be displayed, when to collect points, when to separate anatomic locations, and other details of creating and viewing an anatomic map. The EP physician 16 can also communicate which signal traces to show, the desired amplitude, when to drop a lesion marker, and when to record a segment, to name a few. Where the technician is in a separate room, communication can be facilitated using radio.
While some commands are straightforward, for example, “LAO View”, “record that” and “stop pacing”, other commands are not as easy to communicate clearly. For example, how much rotation the command “rotate a little to the right” calls for can differ between the physician and the technician. This type of command therefore involves a question of degree. Also, depending on the physician-technician relationship, other requests related to the mapping system views and setup can be misinterpreted. For example, a request to “rotate right” may mean rotate the model right (i.e., rotate the view left) when originating from one physician, but can alternatively mean rotate the view right (i.e., rotate the model left) when coming from another physician. This type of command therefore involves physician-technician agreement as to convention. Furthermore, implementation of requests for event markers, segment recordings, lesion markers and the like can be delayed by the time it takes the technician to hear, understand and act on the physician's command. Ambient discussions and/or equipment noise in and around the EP lab can increase this delay.
There is therefore a need for improvements in EP lab integration that minimize or eliminate one or more of the problems set forth above.
BRIEF SUMMARY OF THE INVENTION
One advantage of the methods and apparatuses described, depicted and claimed herein is that they provide an EP physician with the capability of directly controlling an EP diagnostic or therapeutic system, such as an electro-anatomic mapping system. This capability eliminates the need for the physician to first communicate his/her wishes to a control technician, who in turn must hear, interpret and act on the physician's command. The improved control paradigm results in reduced times for medical procedures.
A device for allowing a user to control an electro-anatomic mapping system includes an electronic control unit (ECU) and input means, using the ECU, for acquiring a user input with respect to a view of an anatomical model of at least a portion of a body of a patient. The user input is selected from the group comprising a user touch, a user multi-touch, a user gesture, a verbal command, a motion pattern of a user-controlled object, a user motion pattern and a user electroencephalogram. The ECU is configured to communicate the acquired input to the mapping system for further processing.
In an embodiment, the acquired user input can correspond to any of a variety of mapping system commands, for example only, at least one of: (1) creating a map with respect to the view; (2) collecting points with respect to the view; (3) segmenting regions by anatomy with respect to the view; (4) rotating the view; (5) enlarging or reducing a portion of the view; (6) panning the view; (7) selecting one of a plurality of maps for the view; (8) selecting a signal trace; (9) adjusting a signal amplitude; (10) adjusting a sweep speed; (11) recording a segment; (12) placing an event marker; (13) placing a lesion marker with respect to the view; (14) activating a replay feature of a stored, temporally varying physiologic parameter; and (15) activating a replay of a stored video clip.
In an embodiment, the input means includes a touch-responsive display panel coupled to the ECU. The input means also includes user interface logic (executed by the ECU) configured to display a user interface on the touch-responsive display panel. The user interface logic is further configured to allow a user to interact with the touch-responsive panel for acquiring the above-mentioned user input with respect to the anatomical model. The user interface in combination with the touch-panel allows the user to provide input by way of touch, multi-touch, and gesture. In a further embodiment, the device further includes voice recognition logic configured to recognize a set of predefined verbal commands spoken by the user (e.g., the physician). In a still further embodiment, the device includes wireless communications functionality, improving portability of the device within a procedure room or the control room. In a still further embodiment, the user interface logic is configured to present a plurality of application-specific user interfaces associated with a plurality of different diagnostic or therapeutic systems. Through this capability, the user can rapidly switch between application-specific user interfaces (e.g., such as that for an electro-anatomic mapping system, an EP recording system, an ultrasound imaging system, a cardiac stimulator, etc.), while remaining bedside of the patient, and without needing to communicate via a control technician.
In another embodiment, the input means includes a remote control having a handle configured to be grasped by the user. The remote control includes logic configured to acquire the above-mentioned user input. The user input may include user-controlled motion patterns of the remote control, as well as user key-presses on the remote control. The device is also configured to communicate the acquired user input to the mapping system.
In yet another embodiment, the input means includes a motion capture apparatus configured to acquire imaging of movements of the user. The device includes logic configured to identify a motion pattern using the acquired imaging from the motion capture apparatus. The logic is further configured to produce a command, based on the identified motion pattern, and communicate the command to the electro-anatomic mapping system for further processing. The motion capture apparatus provides the capability of receiving input by way of physician gestures (e.g., hand, arm, leg, trunk, facial, etc.). In a further embodiment, the device further includes voice recognition logic configured to identify verbal commands spoken by the user.
Corresponding methods are also presented.
The foregoing and other aspects, features, details, utilities, and advantages of the present disclosure will be apparent from reading the following description and claims, and from reviewing the accompanying drawings.
Referring now to the drawings, wherein like reference numerals are used to identify identical or similar components in the various views, the various embodiments will now be described.
The base interface 28 is configured to interpret and/or facilitate directing the input acquired by the bedside interface device 26 to the appropriate one or more diagnostic and/or therapeutic systems (e.g., an electro-anatomic mapping system). In an embodiment, base interface 28 is centralized (as shown), wherein all communications with bedside device 26 occur through base interface 28. In a further embodiment, base interface 28 may be functionally distributed, wherein interface functions are located within each diagnostic or therapeutic system. In a still further embodiment, communications between bedside interface 26 and certain ones of the diagnostic/therapeutic systems can be centralized, while communications with other ones of the diagnostic/therapeutic systems can occur directly (i.e., separately).
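By way of a non-limiting illustration, the following minimal sketch shows one way a centralized base interface might route inputs acquired at the bedside to registered diagnostic and therapeutic systems. The class, method and system names are hypothetical; the disclosure does not specify an implementation.

```python
# Hypothetical sketch of a centralized base interface (cf. base interface 28)
# that routes commands from a bedside interface device to registered EP
# systems. All names are illustrative only.
from typing import Callable, Dict


class BaseInterface:
    """Routes commands from the bedside device to diagnostic/therapeutic systems."""

    def __init__(self) -> None:
        self._systems: Dict[str, Callable[[str], None]] = {}

    def register(self, name: str, handler: Callable[[str], None]) -> None:
        # Each EP system (mapping, recording, stimulator, ...) registers a handler.
        self._systems[name] = handler

    def dispatch(self, target: str, command: str) -> None:
        # Centralized variant: all traffic flows through this interface.
        handler = self._systems.get(target)
        if handler is None:
            raise KeyError(f"no such system registered: {target}")
        handler(command)


if __name__ == "__main__":
    hub = BaseInterface()
    hub.register("mapping", lambda cmd: print(f"mapping system <- {cmd}"))
    hub.register("recording", lambda cmd: print(f"recording system <- {cmd}"))
    hub.dispatch("mapping", "rotate_view LAO")   # e.g., from the touch panel
    hub.dispatch("recording", "record_segment")  # e.g., from a voice command
```

In the functionally distributed variant described above, the same dispatch table would instead reside within each diagnostic or therapeutic system.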
The means or apparatus 24 addresses a number of the shortcomings of the conventional practice as described in the Background. For example, means or apparatus 24 allows the EP physician 16 to directly input levels of degree, for example, how much to rotate a view, as opposed to trying to verbally communicate “how much” to a control technician. Further, the use of means or apparatus 24 avoids the potential confusion that can sometimes occur between the EP physician and the control technician as to convention (i.e., does “rotate right” mean rotate the view or the model?). In addition, the use of means or apparatus 24 reduces or eliminates the inherent time delay between the time when the EP physician verbally issues a command and the time when the command is understood and acted upon by the technician.
The fluoroscopic imaging system 30 may comprise conventional apparatus known in the art, for example, single plane or bi-plane configurations. A display area 48 that is shown on monitor 20 corresponds to the display output of fluoroscopic imaging system 30.
The intracardiac ultrasound and/or intracardiac echocardiography (ICE) imaging system 32 may also comprise conventional apparatus known in the art. For example, in one embodiment, the system 32 may comprise a commercial system available under the trade designation ViewMate™ Z intracardiac ultrasound system compatible with a ViewFlex™ PLUS intracardiac echocardiography (ICE) catheter, from St. Jude Medical, Inc. of St. Paul, Minn., USA. The system 32 is configured to provide real-time image guidance and visualization, for example, of the cardiac anatomy. Such high fidelity images can be used to help direct diagnosis or therapy during complex electrophysiology procedures. A display area 50 that is shown on monitor 20 corresponds to the display output of the ultrasound imaging system 32.
The system 34 is configured to provide many advanced features, such as visualization, mapping, navigation support and positioning (i.e., determine a position and orientation (P&O) of a sensor-equipped medical device, for example, a P&O of a distal tip portion of a catheter). Such functionality can be provided as part of a larger visualization, mapping and navigation system, for example, an ENSITE VELOCITY™ cardiac electro-anatomic mapping system running a version of EnSite NavX™ navigation and visualization technology software commercially available from St. Jude Medical, Inc., of St. Paul, Minn. and as also seen generally by reference to U.S. Pat. No. 7,263,397 entitled “METHOD AND APPARATUS FOR CATHETER NAVIGATION AND LOCATION AND MAPPING IN THE HEART” to Hauck et al., or U.S. Patent Publication No. 2007/0060833 A1 to Hauck entitled “METHOD OF SCALING NAVIGATION SIGNALS TO ACCOUNT FOR IMPEDANCE DRIFT IN TISSUE”, both owned by the common assignee of the present invention, and both hereby incorporated by reference in their entireties as though fully set forth herein. System 34 can be configured to perform further advanced functions, such as motion compensation and adjustment functions. Motion compensation may include, for example, compensation for respiration-induced patient body movement, as described in copending U.S. patent application Ser. No. 12/980,515, entitled “DYNAMIC ADAPTIVE RESPIRATION COMPENSATION WITH AUTOMATIC GAIN CONTROL”, which is hereby incorporated by reference in its entirety as though fully set forth herein. System 34 can be used in connection with or for various medical procedures, for example, EP studies or cardiac ablation procedures.
System 34 is further configured to generate and display three dimensional (3D) cardiac chamber geometries or models, display activation timing and voltage data to identify arrhythmias, and to generally facilitate guidance of catheter movement in the body of the patient. For example, a display area 52 shown on monitor 20 corresponds to the display output of system 34 and can be viewed by physician 16 during a procedure, visually communicating information of interest or need to the physician.
System 36 is configured to provide positioning information with respect to suitably configured medical devices (i.e., those including a positioning sensor). System 36 may use, at least in part, a magnetic field based localization technology, comprising conventional apparatus known in the art, for example, as seen by reference to U.S. Pat. No. 7,386,339 entitled “MEDICAL IMAGING AND NAVIGATION SYSTEM”, U.S. Pat. No. 6,233,476 entitled “MEDICAL POSITIONING SYSTEM”, and U.S. Pat. No. 7,197,354 entitled “SYSTEM FOR DETERMINING THE POSITION AND ORIENTATION OF A CATHETER”, all of which are hereby incorporated by reference in their entirety as though fully set forth herein. System 36 may comprise a gMPS™ medical positioning system commercially offered by MediGuide Ltd. of Haifa, Israel and now owned by St. Jude Medical, Inc. of St. Paul, Minn., USA. System 36 may alternatively comprise variants that employ, at least in part, magnetic field generator operation, such as a combination magnetic field and current field-based system like the CARTO™ 3 System available from Biosense Webster, and as generally shown with reference to one or more of U.S. Pat. No. 6,498,944 entitled “Intrabody Measurement,” U.S. Pat. No. 6,788,967 entitled “Medical Diagnosis, Treatment and Imaging Systems,” and U.S. Pat. No. 6,690,963 entitled “System and Method for Determining the Location and Orientation of an Invasive Medical Instrument,” the entire disclosures of which are incorporated herein by reference as though fully set forth herein.
EP monitoring and recording system 38 is configured to receive, digitize, display and store electrocardiograms, invasive blood pressure waveforms, marker channels, and ablation data. System 38 may comprise conventional apparatus known in the art. In one embodiment, system 38 may comprise a commercially available product sold under the trade designation EP-WorkMate™ from St. Jude Medical, Inc. of St. Paul, Minn., USA. The system 38 can be configured to record a large number of intracardiac channels and may be further configured with an integrated cardiac stimulator.
Cardiac stimulator 40 is configured to provide electrical stimulation of the heart during EP studies. Stimulator 40 can be provided either in a stand-alone configuration or integrated with EP monitoring and recording system 38.
EP data editing/monitoring system 42 is configured to allow editing and monitoring of patient data (EP data), as well as charting, analysis, and other functions. System 42 can be configured for connection to EP data recording system 38 for real-time patient charting, physiological monitoring, and data analysis during EP studies/procedures. System 42 may comprise conventional apparatus known in the art. In an embodiment, system 42 may comprise a commercially available product sold under the trade designation EP-NurseMate™ available from St. Jude Medical, Inc. of St. Paul, Minn., USA.
To the extent the medical procedure involves tissue ablation (e.g., cardiac tissue ablation), ablation system 44 can be provided. The ablation system 44 may be configured with various types of ablation energy sources that can be used in or by a catheter, such as radio-frequency (RF), ultrasound (e.g., acoustic/ultrasound or HIFU), laser, microwave, cryogenic, chemical, photo-chemical or other energy (or combinations and/or hybrids thereof) for performing ablative procedures. RF ablation embodiments may, and typically will, include other supporting structures not shown.
In the illustrated embodiment, the UI logic 64 is configured to present a plurality of application-specific user interfaces, each configured to allow a user (e.g., the EP physician 16) to interact with a respective one of a plurality of diagnostic and/or therapeutic systems (and their unique interface or control applications). As shown, the user interface includes a first group 70 of buttons, each corresponding to a respective one of the EP diagnostic and therapeutic systems.
When a user selects one of the buttons in group 70, the UI logic 64 configures the screen display of computer 26a with an application-specific user interface tailored for control of and interface with the particular EP system selected by the user.
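The following sketch illustrates, under stated assumptions, how UI logic might keep the system-selection buttons (group 70) persistently displayed while swapping in an application-specific interface for the selected system; the system names and rendering scheme are illustrative, not taken from the disclosure.

```python
# Illustrative sketch: a persistent row of system-selection buttons plus an
# application-specific body for whichever EP system is currently selected.
# The selected button is bracketed so the active system is visually
# distinguishable (compare claims 27-28). Names are hypothetical.
SYSTEMS = ["EnSite", "EP-WorkMate", "Stimulator", "Ultrasound"]


def render(selected: str) -> str:
    row = " | ".join(f"[{s}]" if s == selected else s for s in SYSTEMS)
    body = f"--- {selected} application-specific controls ---"
    return f"{row}\n{body}"


print(render("EnSite"))
print(render("Stimulator"))  # switching systems preserves the selection row
```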
The second group 72 of buttons includes a listing of common tasks performed by an EP physician when interacting with system 34. Each of the buttons in group 72 is associated with a respective task (and resulting action). For example, the five buttons in group 72 are labeled “Zoom In”, “Zoom Out”, “Add Lesion”, “Freeze Point”, and “Save Point”. The “Zoom In” and “Zoom Out” buttons allow the user to adjust the apparent size of the 3D model displayed on monitor 20 (i.e., enlarging or reducing the 3D model on the monitor).
Each of the buttons in group 74 is associated with a respective display mode, which alters the display output of system 34 to suit the wishes of the physician. For example, the three selectable buttons labeled “Dual View”, “Right View”, and “Map View” re-configure the display output of system 34, as will appear on monitor 20.
Each of the buttons in group 76 is associated with a respective viewpoint from which the 3D electro-anatomic model is “viewed” (i.e., as shown in window 52 on monitor 20). Three of the five selectable buttons, namely those labeled “LAO”, “AP”, and “RAO”, allow the user to reconfigure the viewpoint from which the 3D electro-anatomic model is viewed (i.e., left anterior oblique, anterior-posterior, and right anterior oblique, respectively). The remaining two buttons, namely those labeled “Center at Surface” and “Center at Electrode”, allow the user to invoke, respectively, the following functions: (1) center the anatomy shape in the middle of the viewing area; and (2) center the current mapping electrode or electrodes in the middle of the viewing area.
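A minimal sketch of such viewpoint presets follows. The specific angle values are illustrative assumptions only (clinical viewing-angle conventions vary), intended simply to show each button mapping to a stored camera orientation.

```python
# Hypothetical view-preset table: each button maps to a camera orientation
# for the 3D model. Angles are illustrative, not clinical calibration values.
VIEW_PRESETS = {
    "LAO": {"azimuth_deg": -30.0, "elevation_deg": 0.0},  # left anterior oblique
    "AP":  {"azimuth_deg": 0.0,   "elevation_deg": 0.0},  # anterior-posterior
    "RAO": {"azimuth_deg": 30.0,  "elevation_deg": 0.0},  # right anterior oblique
}


def apply_view(name: str) -> None:
    # In a real system this would re-orient the rendered model in window 52.
    p = VIEW_PRESETS[name]
    print(f"set camera: azimuth={p['azimuth_deg']}, elevation={p['elevation_deg']}")


apply_view("LAO")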
The flattened joystick 78 is a screen object that allows the user to rotate the 3D model displayed in the window 52. In addition, as the point of contact (i.e., the physician's finger) with the joystick object 78 moves from the center or neutral position, for example at point 83, towards the outer perimeter (e.g., through point 84 to point 86), the magnitude of the input action increases; for example, the rotation of the model or cursor will accelerate.
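A minimal sketch of this radial-magnitude behavior, assuming the joystick maps the displacement of the touch point from the neutral center to a rotation rate (the maximum rate constant is a hypothetical value):

```python
# Sketch: on-screen joystick where distance of the touch point from the
# neutral center sets rotation speed; the direction of displacement sets
# the rotation axis/direction. Constants are illustrative only.
import math

MAX_DEG_PER_SEC = 90.0  # hypothetical full-deflection rotation rate


def rotation_rate(cx: float, cy: float, tx: float, ty: float, radius: float):
    """Return (deg/sec, direction unit vector) for a touch at (tx, ty)."""
    dx, dy = tx - cx, ty - cy
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, (0.0, 0.0)              # neutral position: no rotation
    frac = min(dist / radius, 1.0)          # clamp to the joystick perimeter
    return MAX_DEG_PER_SEC * frac, (dx / dist, dy / dist)


# A touch midway to the perimeter rotates at half the full-deflection rate.
print(rotation_rate(0, 0, 0.5, 0.0, 1.0))   # (45.0, (1.0, 0.0))
```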
In a further embodiment, UI logic 64 can be further configured to present an additional button labeled “Follow Me” (not shown), which, when selected by the user, configures the electro-anatomic mapping system 34 for “follow me” control. This style of control is not currently available using a conventional keyboard and mouse interface. For “follow me” control, UI logic 64 is configured to receive a rotation input from the user via the touch panel (e.g., joystick 78); however, the received input is interpreted by system 34 as a request to rotate the endocardial surface rendering (the “map”) while maintaining the mapping catheter still or stationary on the display. In an embodiment, the physician can set the position and orientation of the mapping catheter, where it will remain stationary after the “Follow Me” button is selected.
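The geometric effect of “follow me” control can be sketched as follows, under the assumption that the requested rotation is applied to the rendered map about the stationary catheter position rather than about the model origin. This is purely illustrative 2D geometry, not the disclosed implementation.

```python
# Sketch of "follow me" style control: the map is rotated about the mapping
# catheter tip, so map geometry moves while the tip stays fixed on screen.
import math


def rotate_about(point, pivot, deg):
    """Rotate a 2D point about a pivot; the pivot itself is unmoved."""
    t = math.radians(deg)
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + px * math.cos(t) - py * math.sin(t),
            pivot[1] + px * math.sin(t) + py * math.cos(t))


tip = (2.0, 1.0)                       # catheter tip stays fixed on display
vertex = (3.0, 1.0)                    # a vertex of the endocardial surface
print(rotate_about(vertex, tip, 90))   # map vertex moves ...
print(rotate_about(tip, tip, 90))      # ... while the tip does not: (2.0, 1.0)
```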
Another feature of the touch panel computer 26a is that it incorporates, in an embodiment, voice recognition technology. As described above, computer 26a includes microphone 66 for capturing speech (audio) and voice recognition logic 68 for analyzing the captured speech to extract or identify spoken commands. The voice recognition feature can be used in combination with the touch panel functionality of computer 26a. The microphone 66 may comprise conventional apparatus known in the art, and can be a voice recognition optimized microphone particularly adapted for use in speech recognition applications (e.g., an echo-cancelling microphone). Voice recognition logic 68 may comprise conventional apparatus known in the art. In an embodiment, voice recognition logic 68 may be a commercially available component, such as software available under the trade designation DRAGON DICTATION™ speech recognition software.
In an embodiment, computer 26a is configured to recognize a defined set of words or phrases adapted to control various functions of the multiple applications that are accessible or controllable by computer 26a. The voice recognition feature can itself be configured to recognize unique words or phrases to selectively enable or disable the voice recognition feature. Alternatively (or in addition), a button, such as button 80, can be used to enable or disable the voice recognition feature.
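A minimal sketch of such gating follows. The enable/disable phrases and the command vocabulary are assumptions for illustration; the disclosure does not specify a phrase set or recognizer.

```python
# Hypothetical sketch: voice control gated by enable/disable phrases, with a
# defined command vocabulary. Phrase and command strings are illustrative.
COMMANDS = {"record that", "stop pacing", "lao view", "zoom in", "zoom out"}


class VoiceGate:
    def __init__(self) -> None:
        self.listening = False

    def handle(self, utterance: str):
        text = utterance.strip().lower()
        if text == "start listening":      # illustrative enable phrase
            self.listening = True
            return None
        if text == "stop listening":       # illustrative disable phrase
            self.listening = False
            return None
        if self.listening and text in COMMANDS:
            return text                    # forward to the selected EP system
        return None                        # ignored: gated off or unrecognized


gate = VoiceGate()
print(gate.handle("record that"))      # None -- gate is off
gate.handle("start listening")
print(gate.handle("record that"))      # 'record that'
```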
Voice recognition logic 68 is configured to interact with the physician or other user to “train” the logic (e.g., by having the user speak known words) so as to improve word and/or phrase recognition. The particulars for each user so trained can be stored in a respective voice (user) profile in memory 62. For example, the currently active voice profile can be indicated on the display panel.
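A brief sketch of per-user voice profiles (compare claims 29 through 33): training particulars are stored per user, and the active profile can be echoed on the display panel. Field names and scores are illustrative assumptions.

```python
# Sketch: per-user voice profiles stored in memory; the active profile name
# can be shown on the display panel. All field names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class VoiceProfile:
    user: str
    trained_words: Dict[str, float] = field(default_factory=dict)  # word -> score


profiles = {"dr_smith": VoiceProfile("Dr. Smith", {"lao view": 0.97})}
active = profiles["dr_smith"]
print(f"Active profile: {active.user}")  # could be echoed on the display panel
```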
It should be understood that variations in UI logic 64 are possible. For example, certain applications can be linked (in software) so that multiple applications can be controlled with a single command (e.g., the Record command). In another embodiment, UI logic 64 can be configured to provide additional and/or substitute functions, such as, without limitation, (1) creating a map; (2) collecting points; (3) segmenting regions by anatomy; (4) manipulating the map view (rotate and zoom); (5) selecting and manipulating a number of maps and viewing each; (6) selecting which signal traces to display; (7) adjusting EP signal amplitude; (8) adjusting sweep speed; and (9) providing a single button (or touch, multi-touch, gesture) for recording a segment, placing an event marker, and/or placing a lesion marker.
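One way such application linking could be realized is a fan-out table, sketched below; the pairing of the Record command with both the recording and mapping systems is an assumption for illustration.

```python
# Sketch of linking applications in software so that one command fans out to
# several systems (e.g., "record" notifying both the recording system and the
# mapping system). The linkage table is hypothetical.
LINKS = {
    "record": ["recording_system", "mapping_system"],  # assumed pairing
    "zoom_in": ["mapping_system"],
}


def issue(command: str) -> None:
    # Fan a single user action out to every application linked to it.
    for system in LINKS.get(command, []):
        print(f"{system} <- {command}")


issue("record")   # one button press, two systems notified
```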
It should be further understood that the screen layouts in the illustrative embodiment are exemplary only and not limiting in nature. The UI logic 64 can thus implement alternative screen layouts for interaction by the user.
In a still further embodiment, UI logic 64 can be configured for bi-directional display of information, for example, on the touch-responsive display panel. As one example, the “EnSite” user interface can also present information received back from the electro-anatomic mapping system 34.
Since the wand system 26b is contemplated as being used in the sterile procedure room, multiple embodiments are contemplated for avoiding contamination. In this regard, wand system 26b may be configured with a disposable remote control portion 102, with a reusable remote control portion 102 that is contained within an enclosure compatible with sterilization procedures, with a reusable remote control portion 102 adapted to be secured in a sterilization-compatible wrapper, or with a reusable remote control portion 102 that is encased in a sterile but disposable wrapper.
Either the remote 102 or the base interface 28b (or both, potentially in some division of computing labor) is configured to identify a command applicable to one of the EP diagnostic/therapeutic systems, such as electro-anatomic mapping system 34, based on the detected motion of the remote 102. Alternatively, the command may be identified based on a key press, or a predetermined motion/key-press combination. Once the remote 102 and/or interface 28b identifies the command, it is transmitted to the appropriate EP system. In an electro-anatomic mapping system embodiment, the wireless remote control 102 is configured to allow an EP physician to issue a wide variety of commands, for example only, any of the commands (e.g., 3D model rotation, manipulation, etc.) described above in connection with touch panel computer 26a. By encoding at least some of the control through the wireless remote control 102 that the EP physician controls, one or more of the shortcomings of conventional EP labs, as described in the Background, can be minimized or eliminated. As with touch panel computer 26a, electronic wand system 26b can reduce procedure times, as the EP physician will spend less time playing “hot or cold” with the mapping system operator (i.e., the control technician) and instead can set the display to his/her needs throughout the medical procedure.
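The motion/key-press lookup could be sketched as a simple table, as below. The motion vocabulary (twist, thrust) and the key name are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative mapping from detected remote-control motions and key presses
# to mapping-system commands. The motion vocabulary here is assumed.
from typing import Optional

MOTION_COMMANDS = {
    ("twist_right", None): "rotate_view_right",
    ("twist_left", None): "rotate_view_left",
    ("thrust_forward", None): "zoom_in",
    ("twist_right", "trigger"): "rotate_model_right",  # motion+key combination
}


def identify(motion: str, key: Optional[str] = None) -> Optional[str]:
    # Either the remote or base interface 28b (or both) may run this lookup.
    return MOTION_COMMANDS.get((motion, key))


print(identify("thrust_forward"))          # 'zoom_in'
print(identify("twist_right", "trigger"))  # 'rotate_model_right'
```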
The motion capture apparatus 26d includes the capability to detect hand/arm/leg/trunk/facial motions (e.g., gestures) of the EP physician or other user and translate the detected patterns into a desired command. Apparatus 26d also includes audio capture and processing capability and thus also has the capability to detect speech and translate the same into desired commands. In an embodiment, apparatus 26d is configured to detect and interpret combinations and sequences of gestures and speech into desired commands. The base interface 28b is configured to communicate the commands (e.g., rotation, zoom, pan of a 3D anatomical model) to the appropriate EP diagnostic or therapeutic system (e.g., the electro-anatomic mapping system 34). In an embodiment, the motion capture apparatus 26d may comprise commercially available components, for example, the Kinect™ game control system, available from Microsoft, Redmond, Wash., USA. A so-called Kinect™ software development kit (SDK) is available, which includes drivers and rich application programming interfaces (APIs), among other things, that enable access to the capabilities of the Kinect™ device. In particular, the SDK allows access to raw sensor streams (e.g., depth sensor, color camera sensor, and four-element microphone array), skeletal tracking, and advanced audio (i.e., integration with Windows speech recognition), as well as other features.
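A simplified sketch of turning tracked hand positions into commands follows; real skeletal-tracking streams (such as those exposed by the Kinect™ SDK) are far richer, and the swipe threshold and gesture-to-command mapping below are assumptions for illustration.

```python
# Simplified sketch: classify a short window of tracked right-hand positions
# as a swipe gesture and map it to a mapping-system command. The threshold
# and gesture definitions are hypothetical.
from typing import List, Optional, Tuple

SWIPE_THRESHOLD = 0.3  # hypothetical: meters of lateral hand travel


def classify(hand_positions: List[Tuple[float, float]]) -> Optional[str]:
    """Classify a window of (x, y) hand positions as a gesture command."""
    if len(hand_positions) < 2:
        return None
    dx = hand_positions[-1][0] - hand_positions[0][0]
    if dx > SWIPE_THRESHOLD:
        return "rotate_view_right"   # swipe right -> rotate right
    if dx < -SWIPE_THRESHOLD:
        return "rotate_view_left"
    return None


frames = [(0.00, 1.1), (0.15, 1.1), (0.40, 1.1)]  # hand moving right
print(classify(frames))  # 'rotate_view_right'
```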
Since no physical contact with the motion capture apparatus 26d by the EP physician 16 is contemplated during use, contamination and subsequent sterilization issues are eliminated or reduced. In addition, the lack of contact with apparatus 26d for control purposes allows the EP physician to keep his hands on the catheter or other medical device(s) being manipulated during an EP procedure. By encoding at least some of the control through the motion capture apparatus 26d, with which the EP physician interacts, one or more of the shortcomings of conventional EP labs, as described in the Background, can be minimized or eliminated. As with the previous embodiments, the motion capture apparatus 26d can reduce procedure times.
It should be understood that variations are possible. For example, the motion capture apparatus 26d can be used in concert with sensors and/or emitters in a sterile glove to assist the apparatus 26d to discriminate commands intended to be directed to one of the EP systems, versus EP physician hand movements that result from his/her manipulation of the catheter or medical device, versus other movement in the EP lab in general. In another embodiment, the motion capture apparatus 26d may discriminate such commands by being “activated” by a user when a specific verbal command is issued (e.g., “motion capture on”) and then “deactivated” by the user when another specific verbal command is issued (e.g., “motion capture off”).
It should be understood that variations are possible. For example, in a further embodiment, primary control by the physician in manipulating or interacting with the mapping system may be through use of voice control alone (i.e., a microphone coupled with voice recognition logic), apart from its inclusion with other modes or devices for user interaction described above. In a still further embodiment, the physician can be equipped with headgear that monitors head movements to determine at what location on the screen/monitor the physician is looking. In effect, such headgear can act as a trackball to move or otherwise manipulate an image (or view of a model) on the monitor in accordance with the physician's head movements. In a yet further embodiment, the physician can be equipped with headgear that monitors head movements and/or also monitors brainwave patterns (e.g., to record a user electroencephalogram (EEG)). Such monitored data can be analyzed to derive or infer user input or commands for controlling an image (or view of a model), as described above. An EEG-based embodiment may comprise conventional apparatus known in the art, for example, commercially available products respectively sold under the trade designation MindWave™ headset from NeuroSky, Inc., San Jose, Calif., USA, or the Emotiv EPOC™ personal interface neuroheadset from Emotiv, Kwun Tong, Hong Kong. In a still further embodiment, the physician can be equipped with an eye tracking apparatus, wherein monitored eye movements constitute the user input to be interpreted by the system (e.g., the eye movements can be interpreted as a cursor movement or other command).
It should also be appreciated that while the foregoing description pertains to an EP physician manually controlling a catheter through the use of a manually-actuated handle or the like, other configurations are possible, such as robotically-actuated embodiments. For example, a catheter movement controller (not shown) may be incorporated into a larger robotic catheter guidance and control system, for example, as seen by reference to U.S. application Ser. No. 12/751,843, filed Mar. 31, 2010, entitled ROBOTIC CATHETER SYSTEM (published as U.S. patent application publication no. 2010/0256558), owned by the common assignee of the present invention and hereby incorporated by reference in its entirety as though fully set forth herein. Such a robotic catheter system may be configured to manipulate and maneuver catheters within a lumen or a cavity of a human body, while the bedside interface devices described herein can be used to access and control the EP diagnostic and/or therapeutic systems. In at least one embodiment, a bedside interface device as described herein may also be used to access and control the robotic catheter system.
In accordance with another embodiment, an article of manufacture includes a computer storage medium having a computer program encoded thereon, where the computer program includes code for acquiring user input based on at least one of a plurality of input modes, such as touch, multi-touch, gesture, motion pattern, voice recognition and the like, and identifying one or more commands or requests for an EP diagnostic and/or therapeutic system. Such embodiments may be configured to execute on one or more processors, including multiple processors that are integrated into a single system or are distributed over and connected together through a communications network, where the network may be wired or wireless.
It should be understood that while the foregoing description describes various embodiments of a bedside interface device in the context of the practice of electrophysiology, and specifically catheterization, the teachings are not so limited and can be applied to other clinical settings.
It should be understood that the electronic control unit as described above may include conventional processing apparatus known in the art, capable of executing preprogrammed instructions stored in an associated memory, all performing in accordance with the functionality described herein. It is contemplated that the methods described herein may be programmed, with the resulting software being stored in an associated memory, and, where so described, may also constitute the means for performing such methods. Implementation of an embodiment of the invention, in software, in view of the foregoing enabling description, would require no more than routine application of programming skills by one of ordinary skill in the art. Such a system may further be of the type having ROM, RAM, and a combination of non-volatile and volatile (modifiable) memory, so that the software can be stored and yet allow storage and processing of dynamically produced data and/or signals.
Although numerous embodiments of this invention have been described above with a certain degree of particularity, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this invention. All directional references (e.g., plus, minus, upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise, and counterclockwise) are only used for identification purposes to aid the reader's understanding of the present invention, and do not create limitations, particularly as to the position, orientation, or use of the invention. Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily infer that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the spirit of the invention as defined in the appended claims.
Claims
1.-20. (canceled)
21. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
- an electronic control unit, a display panel, and a microphone;
- user interface logic stored in a memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems; and
- voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
- wherein said user interface logic is configured to allow the user to select one button from said first group of buttons according to said identified command to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said first group of buttons;
- wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface, said voice recognition logic being configured to identify a further spoken command with respect to said application-specific user interface wherein said electronic control unit is configured to communicate said further spoken command to the user-selected one EP system.
22. The device of claim 21 wherein said application-specific user interface of said one user-selected EP system comprises at least a second group of buttons displayed on said display panel that is different from said first group of buttons.
23. The device of claim 21 wherein said user interface logic is configured to present on said display panel, for each one of said plurality of EP systems when selected by the user, a respective application-specific user interface that enables access to and control of said user-selected EP system.
24. The device of claim 21 wherein said display panel comprises a touch-responsive display panel, and wherein said user interface logic is configured to receive from the user a user touch input from said touch-responsive display panel, said user interface logic being further configured to allow the user to select one button from said first group of buttons according to said user touch input.
25. The device of claim 21 wherein said user interface logic is further configured to switch between respective application-specific user interfaces via a common interface displayed on said display panel, wherein said common interface includes said first group of buttons.
26. The device of claim 21 wherein said electronic control unit communicates said identified command wirelessly.
27. The device of claim 21 wherein the user interface logic is further configured to alter an appearance of said one user-selected button of said first group of buttons so as to be visually distinguishable from remaining, non-selected buttons of said first group of buttons, thereby visually indicating to the user which corresponding EP system has been selected.
28. The device of claim 27 wherein said one user-selected button is altered so as to have one of a depressed appearance and shaded appearance.
29. The device of claim 21 further comprising a user profile stored in said memory and associated with the user, wherein said voice recognition logic is further configured to identify said command using said user profile.
30. The device of claim 29 wherein the user is a first user and the user profile is a first user profile, further comprising a second user and a second user profile associated with the second user wherein said second user profile is stored in said memory.
31. The device of claim 30 wherein said voice recognition logic is configured to identify a spoken command of the second user by using said second profile.
32. The device of claim 30 wherein each of the first and second users has unique commands associated therewith stored in respective first and second user profiles.
33. The device of claim 30 wherein the currently active user profile is displayed on said display panel.
34. The device of claim 21 wherein said user interface logic is further configured to allow the user to enable or disable the operation of said voice recognition logic.
35. The device of claim 21 further comprising a sterile drape configured to protect said display panel from contamination.
36. The device of claim 21 wherein said plurality of EP systems includes an electro-anatomic mapping system, an EP monitoring and recording system, a cardiac stimulator, an EP data editing system, a medical positioning system, and an imaging system.
37. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
- an electronic control unit, a touch-responsive display panel, and a microphone;
- user interface logic stored in a memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a common interface comprising a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems, said user interface logic being further configured to receive from the user a user touch input from said touch-responsive display panel;
- voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
- wherein said user interface logic is configured to allow the user to select one button from said common interface using one of (i) said user interface logic according to said user touch, and (ii) said voice recognition logic according to said identified user-spoken command, to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said common interface;
- wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface, said voice recognition logic being configured to identify a further spoken command with respect to said application-specific user interface wherein said electronic control unit is configured to communicate said further spoken command to the user-selected one EP system.
38. A device for allowing a user to communicate with a plurality of electrophysiological systems, comprising:
- an electronic control unit coupled to a memory;
- a microphone;
- an input means for acquiring a user input comprising a display panel;
- user interface logic stored in said memory configured to be executed by said electronic control unit and configured to display on said display panel a user interface which includes a first group of buttons corresponding to a plurality of electrophysiological (EP) diagnostic and therapeutic systems; and
- voice recognition logic stored in said memory configured to be executed by said electronic control unit and configured to analyze user speech input captured by said microphone to identify a user-spoken command;
- wherein said user interface logic is configured to allow the user to select one button from said first group of buttons according to said identified command to thereby select a corresponding one of said plurality of EP systems and to present, in response to said user selection, an application-specific user interface on said display panel that enables access to and control of said one user-selected EP system while maintaining said display of said first group of buttons;
- wherein said user interface logic is further configured to allow the user to interact with said application-specific user interface and obtain a further command taken with respect to said application-specific user interface, and wherein said electronic control unit is configured to communicate said further command to said user-selected one EP system.
Type: Application
Filed: May 2, 2016
Publication Date: Nov 3, 2016
Inventors: Charles Bryan Byrd (Oakdale, MN), Eric Betzler (Andover, MN), Sandeep Dani (Eden Prairie, MN), Israel A. Byrd (Richfield, MN), Eric S. Olson (Maplewood, MN)
Application Number: 15/144,135