System and Technique To Document A Patient Encounter
A technique includes displaying an anatomical model of a human on a display of a processor-based machine; and, in the processor-based machine, processing input data acquired from a user interacting with the displayed anatomical model to provide or update an electronic health record in response to an encounter between a clinician and a patient. The processing includes selecting an interface from a plurality of interfaces based on user interaction with the displayed anatomical model as a result of the encounter, displaying the selected interface, and gathering input data in response to the user interaction with the selected interface. The technique includes determining content to be included in the electronic health record based at least in part on the input data.
A clinician typically documents an encounter with a patient so that the documented encounter becomes part of the patient's medical record. One method of documenting a patient encounter involves the creation of a SOAP note, an acronym for subjective, objective, assessment and plan. In this regard, the subjective component of the SOAP note documents the patient's chief complaint, i.e., a brief statement by the patient of the purpose of the visit to the clinician; the objective component pertains to the clinician's observations of the patient; the assessment component is the clinician's assessment of the main symptoms and/or diagnosis for the patient; and the plan refers to the treatment plan that is provided by the clinician.
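For illustration only, the four SOAP components can be captured in a simple record. The following Python sketch is not taken from any disclosed implementation; the field names and example values are invented.

```python
from dataclasses import dataclass

@dataclass
class SoapNote:
    """One possible representation of a SOAP note (illustrative fields only)."""
    subjective: str   # patient's chief complaint, in the patient's own words
    objective: str    # clinician's observations (vitals, exam findings)
    assessment: str   # clinician's assessment of main symptoms and/or diagnosis
    plan: str         # treatment plan provided by the clinician

# Invented example values, for illustration only.
note = SoapNote(
    subjective="Sharp pain in left front rib when breathing deeply",
    objective="Tenderness on palpation of left costal margin; vitals normal",
    assessment="Suspected costochondritis",
    plan="NSAIDs for two weeks; follow up if pain persists",
)
```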
Systems and techniques are disclosed herein for purposes of enhancing and facilitating the documentation of an encounter between a patient and a healthcare provider (physicians, physician assistants (PAs), family nurse practitioners (FNPs), and so forth). In the context of this patent application, the healthcare provider may be considered a clinician in that the provider has direct contact with and treats the patient. Moreover, in the context of this application, a clinician may, in general, refer to any healthcare worker who interacts with and treats a patient, such as a physician, physician assistant (PA), family nurse practitioner (FNP), surgeon, and so forth.
More specifically, in accordance with example implementations, the clinician uses a tool (called “the conduit tool 110” herein) that presents a graphical user interface (GUI) 111 that is constructed to efficiently document a patient encounter as well as efficiently provide access to the patient's medical history as part of the documentation process. As described herein, the conduit tool 110 accesses a three-dimensional (3-D) anatomical human model 112 to 1.) display images corresponding to 3-D anatomical human models and aspects of the models related to documenting a patient encounter; 2.) gather input from the clinician in response to the clinician's interaction with the displayed images; and 3.) update/generate an electronic health record (EHR) for the patient in response to this interaction. In this manner, the clinician may use the conduit tool 110 for such purposes as efficiently gathering subjective, objective, assessment, and plan (SOAP) notes; selecting the affected human body system(s) relevant to a given patient encounter; marking regions of the human body associated with patient complaint(s); tracking multiple complaints from a given patient associated with multiple body systems/regions; and tracking patient complaints being treated over time, as just a few examples.
For the example implementation of
The conduit tool 110 may be, as examples, a portable or hand-held electronic device, such as a tablet, a portable computer or a smartphone; a thin client; a desktop computer; and so forth. Thus, many variations are contemplated, which are within the scope of the appended claims.
As illustrated in
As depicted in
In general, the EHR interfacing application 116 may communicate with the EHR database 120 over network fabric 118 that contains various gateways, routers, switches and so forth, as can be appreciated by one of ordinary skill in the art. Communication between the EHR interfacing application 116 and the EHR database 120 may thus involve one or more of the following (as examples): Wi-Fi communication links, wired Ethernet communication links, local area network (LAN) communication links, cellular data communication links, Bluetooth communication links, wide area network (WAN) communication links, Internet-based communication links, and so forth.
As will become apparent in the following discussion, the local system 104 may have additional components that are not specifically illustrated in
A primary use of the conduit tool 110, in accordance with example implementations, is to chart a patient's visit during an initial or follow-up consultation. Specifically, in accordance with example implementations, the conduit tool 110 meets one or more of the following goals: supports charting without losing face time with the patient; allows for quick charting relative to a patient's symptoms; allows for quick charting relative to the indicated area of complaint; allows for partial charting of a visit with the ability to edit the chart later after the patient leaves the clinician's office; enhances the patient's visit by using the conduit tool 110 as a visualization tool while discussing the patient's complaint; and provides a tool that works in alignment with the way the clinician works.
Other and different goals may be met using the conduit tool 110, depending on the particular implementation. For example, in accordance with further example implementations, the conduit tool 110 may be distributed in a variety of versions, each customized to specifically address one particular medical field. Each version, in these example implementations, leverages the GUI and 3-D anatomical view of the human body to enhance one or more of the work flows that are particular to the field. For example, a surgeon may have different needs than a family practitioner, but each might gain from the view and streamlined approach provided by the conduit tool 110, albeit at different levels of resolution and with different user interface demands than are discussed in the examples herein.
In accordance with example implementations, the conduit tool 110 may execute, or run, as an extension to an EHR by allowing the clinician to chart a patient's visit from the point of view of the anatomy most closely associated with the patient's complaint. When working with the conduit tool 110, the clinician first identifies the part of the body affected and charts from that reference point. However, the clinician is not obligated to identify that part explicitly, as the clinician may identify a given part implicitly by the symptom or suspected diagnostic condition of the patient.
Thus, referring to
The conduit tool 110 provides a list of symptoms that can be used to quickly reference a part of the body and start the charting process. Each symptom in the list is associated with a particular part or system of the body as represented in the model. A symptom may be identified by the patient or by the attending clinician. When a symptom is selected, the associated body part or system is highlighted in the model and the charting process begins.
The conduit tool 110, via the GUI 111, provides a list of diagnostic conditions that may be used to quickly reference a part of the body and begin the charting process. Each condition in the list is associated with a particular part or system of the body as represented in the model. A condition may be speculated by either the patient or by the attending clinician as a possible match to the symptoms seen. When a condition is selected, the associated body part or system is highlighted in the model and the charting process begins.
As an example,
In accordance with example implementations, the conduit tool 110 provides a tree view of the anatomy within the model. This tree view may be used to select systems or individual body parts within a system. In this manner, in response to the clinician selecting a piece, or portion, of anatomy in the tree view, the GUI 111 highlights that part in the model. Body systems may also be selected. In response to the clinician selecting a system in the tree view, the GUI 111 highlights all anatomy associated with that system. After the tree selection is made, the charting process may then begin, in accordance with example implementations.
To support the goal of providing clinicians with a tool that aligns with the way they work, the conduit tool 110 divides the skin of the model into a set of selectable areas, which represent a high level dissection of the body. As can be appreciated by the clinician, these selectable areas are generally referred to by patients when the patients discuss their complaints.
In accordance with example implementations, the conduit tool 110 breaks up the skin of the model into areas that are depicted in the GUI 111 as anterior and posterior body map images for purposes of allowing the clinician to select the appropriate area in response to corresponding patient complaint(s). These skin body map images are also referred to herein as “skin maps.” In accordance with example implementations, the GUI 111 is constructed to display an anterior skin map 400 (
In accordance with example implementations, there may be 156 specific areas of the skin, which are represented in the anterior 400 and posterior 420 skin maps. Moreover, the anterior 400 and posterior 420 skin maps may be 3-D images, in accordance with example implementations.
In accordance with example implementations, via the GUI 111, the conduit tool 110 graphically color codes the skin breakdown according to the symptoms generally associated with each area. In this manner, referring to
In accordance with example implementations, the skin maps that are used by the conduit tool 110 may be created using a modeling application like Maya® from Autodesk. Adding intelligence to the model can be achieved, in accordance with example implementations, using an application such as Unity 3D. The result is a set of data that is stored in the local system 104 (
As a more specific example, in accordance with some implementations, the skin maps that are used by the conduit tool 110 may be created as follows. First, two skin-only models (one male and one female) are created. In accordance with some implementations, the GUI 111 may display child skin maps as well. Therefore, creating the skin maps may involve creating male and female adult skin models, as well as male and female child skin models. The models may be created using Maya, in accordance with example implementations.
Next, groups of polygons within each model are assigned to respective areas of the model, such as the areas depicted in
A “callback” is then created for the conduit tool 110, serving as the notification that a polygon, and more specifically an area of the skin map, has been selected. When a polygon within the model is selected, the callback determines which area has been selected by querying the texture map on the polygon. A callback notification may be accomplished, for example, using a mesh collider.
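As a rough sketch of this mechanism (not the disclosed implementation), the lookup from a selected polygon's texture region to a named skin-map area, together with the notification callback, might look as follows. The texture IDs, area names, and function names are all invented for illustration.

```python
# Hypothetical table mapping texture-map regions to named skin-map areas.
# In the tool described above, this association would come from the texture
# map painted onto the model's polygons; these entries are invented.
AREA_BY_TEXTURE_ID = {
    17: "left front rib",
    18: "right front rib",
    42: "abdomen, upper left quadrant",
}

def on_polygon_selected(texture_id, notify):
    """Callback fired when a polygon of the skin model is selected.

    Queries the texture lookup to determine which skin-map area the
    polygon belongs to, then notifies the tool of the selection.
    """
    area = AREA_BY_TEXTURE_ID.get(texture_id)
    if area is not None:
        notify(area)  # tell the tool which area was chosen
    return area

# Usage: collect notifications in a list standing in for the tool's handler.
selected = []
on_polygon_selected(17, selected.append)
```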
The conduit tool 110 is next configured to highlight a given area of the skin map when the area is selected. Configuring the conduit tool 110 to highlight may be accomplished by changing the texture coloration of the mesh, in accordance with example implementations.
A mapping may then be created between the areas defined by the skin map and the symptoms in the symptom list that are common to each area. In accordance with example implementations, the conduit tool 110 applies a filter that restricts the list of possible symptoms that can be associated with a complaint based on the area of the skin map that has been selected. The conduit tool 110 may perform this filtering using database schema, as further described below.
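The area-to-symptom filter can be sketched as a simple lookup keyed by the selected skin-map area. The area names and symptom entries below are invented for the sketch; the actual mapping would live in the tool's database schema.

```python
# Illustrative area -> symptom schema (entries invented for this sketch).
SYMPTOMS_BY_AREA = {
    "left front rib": ["sharp pain", "tenderness", "bruising"],
    "lower back": ["dull ache", "stiffness", "radiating pain"],
}

def filter_symptoms(selected_area):
    """Restrict the selectable symptom list to those common to the area."""
    return SYMPTOMS_BY_AREA.get(selected_area, [])
```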
A mapping is created between the conditions in the conditions list and the symptoms in the symptoms list. This mapping identifies a condition by the symptoms that are associated with it. The conduit tool 110 may perform this mapping using database schema, as further described below.
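Because each condition is identified by its associated symptoms, matching can be sketched as a set comparison: a condition becomes a candidate when all of its associated symptoms appear among the charted symptoms. The condition and symptom names below are invented; the real mapping would be held in database schema as described above.

```python
# Illustrative condition -> symptom mapping (entries invented).
SYMPTOMS_BY_CONDITION = {
    "costochondritis": {"sharp pain", "tenderness"},
    "rib fracture": {"sharp pain", "bruising", "tenderness"},
}

def candidate_conditions(charted_symptoms):
    """Return conditions whose associated symptoms are all charted."""
    charted = set(charted_symptoms)
    return sorted(
        cond for cond, required in SYMPTOMS_BY_CONDITION.items()
        if required <= charted  # all required symptoms are present
    )
```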
Finally, the skin map may be associated with markers called “PINs” herein (and further described below), which are opened in the GUI 111 when an area (a 3-D area, for example) of the skin map is selected. In accordance with example implementations, the creation of the PINs for the conduit tool 110 may be accomplished using, for example, a popover window.
When working with a skin map in the GUI 111, the clinician may follow these steps (in accordance with an example implementation). The clinician selects the patient's complaint area on the skin map. The clinician may then create a PIN for the selected complaint area, with a list of common symptoms/complaints being associated with the area (as highlighted by the GUI 111).
As described further below, a selection from the PIN menu opens a view in the detail area of the screen that will allow creation of a SOAP note from the genre selected in the PIN list of associated symptoms/complaints. In accordance with example implementations, when a sufficient number of symptoms have been charted, the conduit tool 110 posts a potential condition under a diagnosis tab of the PIN.
The conduit tool 110 allows the clinician to work in any of several modes. In a first mode, the clinician may explicitly identify and select the complaint location at a very detailed and anatomically correct level. This mode takes advantage of the model precision and navigability provided by the conduit tool 110. Both the 3-D anatomical human model image and the master tree view image provided by the GUI 111 support this mode, in accordance with example implementations.
In another mode, the clinician may work from a symptom or suspected condition to identify a location and then use the conduit tool 110 to begin the charting process.
In another use of the conduit tool 110, the clinician may work in a mode that is appropriate when the initial conversation with a patient is at a general level that does not directly point to some specific anatomy, but instead gives a general indication of the location of the complaint's origin. This represents the place where the patient hands off his complaint to the clinician for professional assessment. The skin map supports this mode of operation.
Often, when the clinician first begins talking with a patient, he/she may want to work with the model from this generalized segmented skin map and dive down from there as needed for more precision. This is because patients generally discuss their complaints from this vantage point to begin with.
As noted above, the PIN is a marker that is displayed as part of the GUI 111 and identifies a point of interest on the anatomical human model. In this manner, the conduit tool 110 uses the PIN to identify a point of interest, i.e., the location of a complaint.
The purpose of the PIN in the conduit tool 110 is to provide a relative location for the clinician to initiate the charting of a complaint. The PIN provides a set of complaints/symptoms common for the area selected on the model. Selecting one of these opens the detail area of the screen for entering the charting data associated with the history, physical exam, diagnosis and plan of a complaint. In accordance with example implementations, a PIN may be created, deleted, edited or relocated to another part of the human anatomical model.
In accordance with example implementations, a PIN has two display modes: an expanded mode and a collapsed mode. In the expanded mode, the entire PIN (i.e., the entire dialog window) is displayed and is active in the GUI 111 for purposes of soliciting clinician input. The expanded mode is used when selecting a symptom/complaint, as further described herein. The expanded display mode of the PIN is illustrated in example PINs 7A, 7B, 7C, 7D and 7E, which are further described below. In the collapsed display mode of the PIN, only the title bar of the PIN is displayed, in accordance with example implementations. In the collapsed display mode, the PIN is not being used for charting.
In accordance with example implementations, the conduit tool 110 supports two pinning modes. In the first pinning mode, the conduit tool 110 creates the PIN in response to the clinician explicitly clicking or touching a part of the anatomical human model that is displayed in the GUI 111 to create and place the PIN. In another pinning mode, the conduit tool 110 implicitly creates and places a PIN in response to the clinician identifying a part of the anatomical human model by one of the following actions: choosing the part through the master tree view; selecting a body area in the skin map; selecting a symptom in the symptom list (which automatically selects the location associated with the symptom); and selecting a potential condition in the condition list (which automatically selects the location associated with the condition). This second, implicit pinning mode for generating a PIN may be the mode most likely to be used by a clinician when charting a patient, in accordance with example implementations.
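The two pinning modes can be sketched as two entry points that produce the same PIN record: one from an explicit click on a body part, and one that resolves the body part from a symptom-list selection. The data structure, lookup table, and function names below are all invented for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical lookup used for implicit pinning: a symptom selection
# carries the body part associated with it (entries invented).
BODY_PART_BY_SYMPTOM = {"sharp pain": "left front rib"}

@dataclass
class Pin:
    """Illustrative PIN record: location of a complaint plus its chart data."""
    body_part: str
    expanded: bool = True                     # new PINs open in expanded mode
    chart: dict = field(default_factory=dict) # charted data, empty at creation

def create_pin_explicit(clicked_body_part):
    """First pinning mode: clinician clicks/touches a part of the model."""
    return Pin(body_part=clicked_body_part)

def create_pin_implicit(symptom):
    """Second pinning mode: PIN placed from a symptom-list selection,
    which automatically selects the location associated with the symptom."""
    return Pin(body_part=BODY_PART_BY_SYMPTOM[symptom])
```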
The title bar of the PIN window 712-1 displays the location 715 of the complaint (here, the “left front rib”). The title bar serves to identify the PIN 712-1 (and its associated complaint) for future reference when the PIN has been collapsed and only the title bar is visible. The end of the association arrow 701 closest to the body part may be used to move the PIN from one body part to another. This function may be achieved, in accordance with some implementations, by using a relocation button 738 of the window 712-2.
The PIN window 712 further includes an OK button 734 that collapses the PIN window 712 down to just the title bar and an edit button 736, in accordance with example implementations. The PIN window 712 further includes a cancel button 740, which allows the PIN to be closed without saving any edits made after the PIN window 712 was opened. The cancel button 740 may also be used to reverse the creation of a PIN that was made in error. As noted, the edit button 736 may be visible, in accordance with example implementations, when the PIN is collapsed. Pushing the edit button 736 expands the PIN window 712 for charting.
The relocate button 738 is used to place the PIN in relocation mode, which allows movement of the PIN. In this manner, after depressing the relocate button 738, the clinician is free to move the end of the association arrow 701 closest to the body part currently associated with the PIN. The clinician may then drag and drop that point onto the new body part to be associated with the PIN.
Beneath the title bar of the PIN window 712, there are two tab menu rows, in accordance with example implementations. These rows are used to specify the chart to be edited by the clinician. In accordance with example implementations, the first row contains four tabs identifying four areas of charting for the complaint: a history tab 704, which is selected to chart, or enter, information describing the history of the complaint from the patient's point of view; a physical exam tab 706, which is selected to enter information describing the physical examination that is conducted by the clinician; a diagnosis tab 708, which is selected to enter information describing the diagnosis that is made by the clinician; and a plan tab 710, which is selected to enter information describing the treatment plan that is prepared by the clinician.
The conduit tool 110 dynamically adjusts which menu tabs are displayed on the second row based on the selection made in the first row. As illustrated in
The selection of the menu tabs of the second row controls which charting panels are displayed by the conduit tool 110 in the PIN window 712 below the second row. For the example PIN window 712-1 that is depicted in
For
For
For
For
As noted above, the conduit tool 110 displays other sets of second row menu tabs based on whether the history tab 704, physical exam tab 706, diagnosis tab 708 or plan tab 710 in the first row is selected. As a further example, in response to the physical exam tab 706 being selected, the conduit tool 110 may display the following tabs for the second row menu: a location tab, which is selected to display panels to chart elements particular to the body part or system associated with the PIN; an inspection tab, which is selected to display panels to allow the clinician to chart the complaint from the visual aspects seen by the clinician; a palpation tab, which is selected to display panels to allow the clinician to chart the complaint from the skin sensory aspects that are felt by the clinician; and an “other evaluations” tab, which is selected by the clinician to display panels to chart the complaint from the other aspects that are perceived or collected by the clinician.
In a similar manner, the conduit tool 110 is constructed to display other sets of second row menu tabs and corresponding panels in response to selection of the diagnosis tab 708 or plan tab 710 of the first menu row.
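The dynamic adjustment of the second menu row can be sketched as a lookup keyed by the first-row selection. The history entries below follow the left-to-right charting sequence described herein (location through movement) and the physical-exam entries follow the list above; the tab sets for diagnosis and plan are not enumerated in this description, so they are left as placeholders.

```python
# Second-row tab sets keyed by the first-row tab selection.
SECOND_ROW_TABS = {
    "history": ["location", "sensation", "inspection", "palpation", "movement"],
    "physical exam": ["location", "inspection", "palpation", "other evaluations"],
    "diagnosis": [],  # placeholder: tabs not enumerated in the text
    "plan": [],       # placeholder: tabs not enumerated in the text
}

def second_row_for(first_row_tab):
    """Return the second-row menu tabs to display for a first-row selection."""
    return SECOND_ROW_TABS[first_row_tab]
```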
As examples, the conduit tool 110 may support one or more of the following use cases for purposes of documenting a patient encounter and generating/modifying the corresponding EHR. A patient, waiting in the lobby, is called, enters the visitation room, and the nurse initiates the encounter. In this manner, the nurse walks into the visitation room to see the patient. The nurse picks up a tablet computer that serves as a platform (for this example) for the conduit tool 110, which is connected to the practice's EHR for the patient. The nurse interacts with the conduit tool 110 via its GUI 111. The nurse logs into the conduit tool 110 and therefore into the practice's EHR. The nurse identifies the patient in the conduit tool 110 and starts the encounter. The conduit tool 110 syncs with the practice's EHR and imports the patient identification (ID). The nurse enters the date of visit. The nurse asks the patient what he/she is being seen for and enters that as a text note into the conduit tool 110. The nurse takes the patient's vitals and enters the vitals into the conduit tool 110. The nurse locks the conduit tool 110. The patient encounter remains intact.
Next, the clinician enters the visitation room. The clinician picks up the tablet computer platform for the conduit tool 110, as left by the nurse, and unlocks the conduit tool 110. The clinician resets the conduit tool 110 so that the skin map is displayed on the GUI 111. The clinician reviews the note left by the nurse and reviews the vitals taken by the nurse.
The complaint location is next identified. The patient identifies his complaint relative to a general area of the body. In this manner, the clinician identifies the general area of the complaint using the conduit tool 110. The identification of the general area of the complaint may be accomplished in one of two ways, in accordance with example implementations. In the first way, the clinician may touch the general area on the skin map that is displayed on the GUI 111. Alternatively, the clinician may select the general area from the master tree view that is displayed in the GUI 111 by the conduit tool 110.
Next, the patient identifies his complaint relative to a specific part or system of the body. The clinician identifies that part or system in the conduit tool 110. This can be done in one of two ways, in accordance with example implementations. In the first way, the clinician touches that part or system on the more detailed anatomical human model that appears below the skin map in the GUI 111. Alternatively, the clinician selects that part or system from the master tree view in the GUI 111.
The patient then identifies his complaint relative to a symptom. In this manner, the patient identifies the main symptom of his/her complaint. The clinician selects the symptom from the symptom list in the GUI 111 if the symptom is in the list. The conduit tool 110 is constructed to highlight a specific system or part of the body as associated with the symptom and therefore the complaint.
The patient then identifies his/her complaint relative to a condition. The clinician or the patient makes a pre-diagnosis and identifies the suspected condition underlying the patient's complaint. The clinician selects the condition from the condition list that is displayed in the GUI 111 if the condition is in the list. The conduit tool 110 is constructed to highlight a specific system or part of the body as associated with the condition and therefore the complaint.
A PIN is then created for the complaint, in either of two ways. Following any of the above use cases, a PIN is implicitly and automatically created and associated with the part of the body identified by that use case. This implicit creation may occur, for example, through selection of an area of the skin map or through selection of part of the body's underlying anatomy. Using the pinning tool that is provided by the conduit tool 110, the clinician may alternatively explicitly create a PIN.
After its creation, the PIN is displayed so that information pertaining to the complaint may be charted by the clinician. Upon creation, the PIN's dialog box, or window, defaults to the charting panels associated with the history and location tabs being selected (see, for example,
Next, the clinician charts the patient visit using the PIN window. In the case where the clinician has decided to enter data in the PIN, he does so now. The clinician is free to move between any of the history tabs while collecting data, but usually goes from left to right through the tabs in the following sequence: location tab; sensation tab; inspection tab; palpation tab; and movement tab. The clinician completes charting and closes the PIN by selecting the OK button on the PIN interface.
The clinician may edit a given PIN to update a patient's chart at a later time after the patient leaves. In this manner, the clinician is not obligated to enter all documentation about a complaint at any given time, and the clinician may create a PIN, close it, then come back to it at another time to chart the patient information. Thus, the conduit tool 110 supports editing of an existing PIN. To perform the editing, the clinician identifies the PIN that holds the chart the clinician desires to modify. The clinician selects that PIN in the GUI 111 by clicking, tapping or pushing on it. The clinician pushes the edit button on the PIN window, and in accordance with example implementations, the PIN window opens to the charting panel with the data that was entered in that panel. The clinician completes his/her updates and closes the PIN by selecting the OK button on the PIN window.
The clinician may relocate a given PIN to a different body part. In this manner, the clinician is not obligated to identify the exact location of a complaint at any given time. The clinician may create a PIN at a location, close the PIN, and then wish to refine the location associated with the PIN at another time. In accordance with example implementations, the conduit tool 110 supports relocation of an existing PIN. To perform the relocation, the clinician identifies the PIN to be relocated and selects that PIN in the GUI 111 by clicking, tapping or pushing on it. The PIN window then expands. One way for the clinician to complete the relocation is to push the relocate button and drag the PIN window to the new location. Another way is for the clinician to drag the “anatomy end” of the association arrow for the PIN to the piece of anatomy with which it is to be associated. After relocating the PIN, the clinician may close the PIN by selecting the OK button on the PIN window.
Referring to
Each of the PIN tab schema may be connected to corresponding charting panel schema. For example, referring to
The diagnosis tab schema 832 and plan tab schema 834 also have associated charting panel schema, in accordance with example implementations.
In accordance with example implementations, the conduit tool 110, via the GUI 111, may further display a timeline, which is an easily scrollable mechanism for finding a past visit. Each node on the timeline represents a visit made by the same patient that is being seen currently. When the clinician slides over to a node, the clinician is presented with all the PINs of that visit. Some useful ramifications may be one or more of the following. The clinician may see how a complaint or condition has progressed over time. Frequency of complaints or conditions can be identified. Frequency of drug prescriptions can be identified. Other and different advantages are contemplated, in accordance with the many potential implementations.
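One of the ramifications above, identifying the frequency of complaints across visits, can be sketched directly from the timeline structure: each node holds the PINs of one visit, and a count over all nodes gives the frequency. The visit records below are invented for the sketch.

```python
from collections import Counter

# Illustrative timeline: each node holds the PINs of one past visit
# by the same patient (dates and complaints invented).
visits = [
    {"date": "2023-01-10", "pins": ["left knee pain"]},
    {"date": "2023-06-02", "pins": ["left knee pain", "headache"]},
    {"date": "2024-02-14", "pins": ["left knee pain"]},
]

def complaint_frequency(visits):
    """Count how often each complaint appears across timeline nodes."""
    return Counter(pin for visit in visits for pin in visit["pins"])
```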
Further implementations are contemplated, which are within the scope of the appended claims. For example, in accordance with further example implementations, the PIN window (a charting panel, for example) may include a button for initiating the taking of photographs by the camera of the platform (a camera of a tablet, for example) that hosts the conduit tool 110. In this manner, using this mechanism, the conduit tool 110, via the GUI 111, may allow the clinician to take photographs and store the photographs as part of the chart data for later viewing.
In accordance with example implementations, the conduit tool 110 may display an age and gender appropriate 3-D anatomical model for the patient.
In accordance with example implementations, the conduit tool 110 may select and provide a clinician with a candidate set of symptoms based on interaction of the clinician with a 3-D anatomical human model that is displayed by the conduit tool 110. For example, in accordance with example implementations, the clinician may touch, press or otherwise select a portion of the 3-D anatomical human model, and in response to the selection, the conduit tool 110 may display a set of candidate symptoms for purposes of allowing the clinician to further document the appropriate symptoms.
In accordance with example implementations, the 3-D anatomical human model allows the clinician to select certain parts of the model and expand the parts. For example, in accordance with example implementations, in response to a clinician selecting a particular part of the 3-D anatomical human model using the GUI 111, the conduit tool 110 displays a more detailed and/or internal 3-D model (also called a “child model” herein) of the original 3-D parent model in the GUI 111. The child model may be more anatomically detailed than the parent model. As a more specific example, the clinician may select the left eye of a full body 3-D anatomical model using the GUI 111, and in response thereto, the conduit tool 110 may, via the GUI 111, display a more detailed 3-D anatomical model of the eye, allowing the clinician to further chart information that is specifically directed to the left eye.
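The repeated parent-to-child "drill down" described here and below can be sketched as a stack of models, where each long press on a part pushes the corresponding child model. The model names and class below are invented for illustration.

```python
class ModelViewer:
    """Sketch of repeated drill-down into child models (names invented)."""

    # Hypothetical (parent model, selected part) -> child model table.
    CHILD_MODELS = {
        ("full body", "left eye"): "detailed left eye",
        ("detailed left eye", "retina"): "retina cross-section",
    }

    def __init__(self):
        self.stack = ["full body"]  # current model is the top of the stack

    def long_press(self, part):
        """Open the child model for the pressed part, if one exists."""
        child = self.CHILD_MODELS.get((self.stack[-1], part))
        if child:
            self.stack.append(child)
        return self.stack[-1]
```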
As depicted in
As also depicted in
The clinician may select left button 902 to pull up separate menus. The clinician may select a button to add (or aid in adding) filtering for the timeline GUI. For example, the clinician may select imaging button 908 and then click on the left knee of the model 901 to filter the timeline for purposes of causing the conduit tool 110 to display records of left knee imaging during the selected time (from March 2010 until the present, for example). As another example, the clinician may click on the shot button 914, another left button, to pull up a procedure menu, which then allows for selection of a procedure (e.g., a flu shot); and then the clinician may click on a specific body area of the model 901 to identify (and thus, also document) which part of the body was involved in the procedure (e.g., show where a flu shot was given, for example).
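The timeline filtering described above (record type, body part, and time range) can be sketched as a simple predicate over timeline records. The record fields, dates, and function name are invented for the sketch.

```python
# Illustrative timeline records (fields and values invented).
records = [
    {"date": "2011-03-05", "type": "imaging", "body_part": "left knee"},
    {"date": "2012-07-19", "type": "shot", "body_part": "left arm"},
    {"date": "2015-11-02", "type": "imaging", "body_part": "left knee"},
]

def filter_timeline(records, record_type, body_part, since):
    """Filter timeline records by type, body part, and start date.

    ISO-formatted date strings compare correctly as plain strings.
    """
    return [
        r for r in records
        if r["type"] == record_type
        and r["body_part"] == body_part
        and r["date"] >= since
    ]
```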
In accordance with example implementations, the clinician may long press (and/or double click) a body area to pull up a more detailed and/or internal body model (the child model). This process may be repeated to allow the clinician to “drill down” deeper into a given body area with repeated long presses.
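The repeated drill-down can be modeled as a navigation stack, each long press pushing a more detailed model and a back action popping it. This is a sketch only; the class and model names are hypothetical.

```python
class DrillDownNavigator:
    """Tracks repeated long-press drill-downs into nested body models.

    Each long press pushes a more detailed (child) model name onto a
    stack; backing out pops it. Model names here are illustrative.
    """
    def __init__(self, root: str):
        self._stack = [root]

    def long_press(self, child: str) -> str:
        """Drill down into the given child model and return it."""
        self._stack.append(child)
        return child

    def current(self) -> str:
        """Return the model currently displayed."""
        return self._stack[-1]

    def back(self) -> str:
        """Return to the previous (parent) model, if any."""
        if len(self._stack) > 1:
            self._stack.pop()
        return self._stack[-1]
```
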
In accordance with example implementations, the model 901 may be a 3-D model that may be rotated (as can the “children”) in all axes (all three axes of an orthogonal coordinate system, for example) and resized via touch gestures.
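Rotation about each axis and resizing can be expressed as standard 3-D transforms applied to the model's vertices in response to touch gestures. The following minimal sketch shows one axis of rotation and uniform scaling; it is not the disclosed rendering code, which the application notes may instead rely on an engine such as SceneKit or Unity.

```python
import math

def rotate_y(point, angle_rad):
    """Rotate a 3-D point about the y axis by the given angle,
    as a drag gesture across the model might request."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

def scale(point, factor):
    """Uniformly resize the model about the origin,
    as a pinch gesture might request."""
    return tuple(factor * v for v in point)
```
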
Selecting the virtual desk button 910 causes the conduit tool 110 to display a virtual desk GUI image 1110.
In accordance with example implementations, the GUI image 1110 has a 3-D space above the timeline tool for viewing/manipulating the documents that are depicted in the timeline tool. Using the timeline tool, the clinician may pull the selected documents into the 3-D space above the timeline tool, move the documents around independently, resize the documents independently, rotate the documents independently, move the documents on top of one another, change transparencies of the documents, and so forth. The clinician may further rotate the entire 3-D field (or world) and resize the entire 3-D field as well, in accordance with example implementations.
As can be appreciated by the clinician, the above-described interaction that the virtual desk provides affords tremendous flexibility in viewing documents, especially in comparing several documents at once. The virtual desk allows for simultaneous viewing of 2-D and 3-D images on one screen (such as viewing 3-D PET scan renderings, for example).
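The per-document manipulations described above (independent position, size and transparency in the 3-D space) can be captured by a small state object per document. This is an illustrative sketch; the class and field names are hypothetical, not the disclosed data model.

```python
from dataclasses import dataclass

@dataclass
class DeskDocument:
    """State of one document placed in the virtual desk's 3-D space.
    Field names are illustrative placeholders."""
    name: str
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    scale: float = 1.0
    opacity: float = 1.0

    def move(self, dx, dy, dz=0.0):
        """Translate the document independently of the others."""
        self.x += dx
        self.y += dy
        self.z += dz

    def resize(self, factor):
        """Resize the document independently of the others."""
        self.scale *= factor

    def set_opacity(self, value):
        """Change transparency, clamped so an overlaid document
        never becomes more than fully opaque or fully invisible."""
        self.opacity = max(0.0, min(1.0, value))
```

Stacking two documents and lowering the opacity of the top one would give the overlay comparison described for the virtual desk.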
In accordance with example implementations, one or more of the 3-D anatomical models, whether the parent 3-D anatomical model, a child model, and so forth, may be modeled using one or more publicly available 3-D models, such as 3-D models available from Turbo Squid or another provider. In accordance with example implementations, Conduit may use a graphics rendering engine, such as the SceneKit application programming interface (API) provided by Apple Inc.; Swift provided by Apple Inc.; Unity3-D provided by Unity Technologies; Cinema4D provided by Maxon Computer, Inc.; or Cheetah3-D provided by MW3D-Solutions. Other rendering engines/3-D programming platforms may be used by Conduit, in accordance with further example implementations.
In accordance with example implementations, the conduit tool 110 resides on a computer, or processor-based machine, such as the physical machine 1210.
The physical machine 1210 is an actual machine that is made up of actual hardware 1220 and actual machine executable instructions 1240, or “software.” In accordance with some example implementations, the conduit application, as well as possibly other applications, may be hosted by a virtual machine (which executes on a physical platform provided by the physical machine 1210).
In addition to the CPU(s) 1222, the hardware 1220 may further include a display panel 1226 or a monitor to display the GUI images described herein; and at least one input device 1224, such as a stylus, a keypad, input sensors on a display (surface acoustic wave (SAW), capacitive or resistive techniques, for example); and so forth.
The hardware 1220 further includes a non-transitory storage medium, such as a memory 1230. In this manner, the memory 1230 may be formed from non-transitory semiconductor storage devices, optical storage devices, magnetic storage devices, phase change storage devices, resistive storage devices, a combination of any of these devices, and so forth, depending on the particular implementations.
In accordance with example implementations, the conduit tool 110 performs at least some of the techniques that are disclosed herein, such as techniques directed to displaying an anatomical model of a human on a display and using user interaction with the model to gather data representing documentation of an encounter between a healthcare provider and a patient. In addition to the conduit tool 110, the physical machine 1210 may contain other machine executable instructions 1240, such as an operating system 1260, device drivers 1262, databases 1254 used by the conduit tool 110, and possibly other applications, such as the above-described EHR interfacing application 116.
Other implementations are contemplated, which are within the scope of the appended claims.
While a limited number of examples have been disclosed herein, those skilled in the art, having the benefit of this disclosure, will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations.
Claims
1. A method comprising:
- displaying an anatomical model of a human on a display of a processor-based machine; and
- in the processor-based machine, processing input data acquired from a user interacting with the displayed anatomical model to provide or update an electronic health record in response to an encounter between a clinician and a patient, the processing comprising: selecting an interface from a plurality of interfaces based on user interaction with the displayed anatomical model as a result of the encounter; displaying the selected interface; gathering input data in response to the user interaction with the selected interface; and determining content to be included in the electronic health record based at least in part on the input data.
2. The method of claim 1, wherein:
- displaying the anatomical model comprises displaying a skin map; and
- selecting the interface comprises: detecting selection of an area of the skin map from a plurality of areas of the skin map; and selecting the interface based at least in part on the detected selection of the area of the skin map.
3. The method of claim 2, wherein:
- selecting the interface comprises: selecting a symptom list from a plurality of symptom lists based at least in part on the detected selection of the area of the skin map; and
- displaying the selected interface comprises: displaying the selected symptom list.
4. The method of claim 3, further comprising:
- detecting selection of a symptom from a plurality of symptoms of the selected symptom list; and
- selecting a condition list from a plurality of condition lists based at least in part on the selected symptom list.
5. The method of claim 1, further comprising:
- displaying a marker on the displayed anatomical model based at least in part on the user interaction with the anatomical model, the marker being associated with documentation about a chief complaint associated with a location on the model identified with the marker;
- wherein gathering the input data comprises: filtering candidate information to be included in the documentation based at least in part on a location of the marker on the anatomical model.
6. The method of claim 5, wherein the documentation comprises a subjective, objective, assessment and plan (SOAP) note.
7. The method of claim 5, wherein the filtering comprises filtering out candidate information associated with symptoms or conditions not associated with a body part identified by the user interaction with the anatomical model.
8. The method of claim 1, wherein displaying the anatomical model comprises:
- displaying a first image of at least part of a human body on the display;
- in response to the user interaction with the first image, displaying a second image of a more detailed subpart of said at least part of the human body; and
- selecting the interface based at least in part on user interaction with the second image.
9. The method of claim 1, wherein displaying the anatomical model comprises displaying a human body image on a first portion of the display, the method further comprising:
- displaying an image of table view content in a second portion of the display separate from the first portion; and
- manipulating at least one of the table view content and the human body image in response to selections made by the user to the table view content.
10. An apparatus comprising:
- a display to form an image of an anatomical model of a human;
- an input device to indicate interaction of a user with the image; and
- a processor to respond to the interaction indicated by the input device to provide or update an electronic health record for a patient in response to an encounter between a clinician and the patient, the processor to: select an interface from a plurality of interfaces based on user interaction with the displayed anatomical model as a result of the encounter; display the selected interface; gather input data in response to the user interaction with the selected interface; and determine content to be included in the electronic health record based at least in part on the input data.
11. The apparatus of claim 10, wherein the processor is adapted to:
- provide a time line graphical user interface;
- provide markers, each marker being associated with clinician documentation pertaining to a chief complaint and being associated with an area of the body associated with the chief complaint; and
- selectively display the markers based on user interaction with the time line graphical user interface.
12. The apparatus of claim 10, wherein the processor is adapted to:
- display a skin map;
- detect selection of an area of the skin map from a plurality of areas of the skin map; and
- select the interface based at least in part on the detected selection of the area of the skin map.
13. The apparatus of claim 12, wherein the processor is adapted to:
- select a symptom list from a plurality of symptom lists based at least in part on the detected selection of the area of the skin map;
- display the selected symptom list;
- detect selection of a symptom of the selected symptom list from a plurality of symptoms of the selected symptom list;
- select a condition list from a plurality of condition lists based at least in part on the detected selection of the symptom;
- display the selected condition list;
- detect selection of a condition of the selected condition list from a plurality of conditions of the selected condition list; and
- generate data documenting a chief complaint for the patient based at least in part on the selected condition.
14. The apparatus of claim 10, further comprising a handheld processing unit comprising the display, input device and processor.
15. An article comprising a non-transitory computer readable storage medium storing instructions that when executed by a computer cause the computer to:
- form an image of an anatomical model of a human on a display of the computer;
- receive input data indicating interaction of a user with the image; and
- respond to the interaction indicated by the input data to gather data documenting an encounter between a clinician and a patient to provide or update an electronic health record for the patient, wherein the instructions cause the computer to: select an interface from a plurality of interfaces based on user interaction with the displayed anatomical model as a result of the encounter; display the selected interface; gather input data in response to the user interaction with the selected interface; and determine content to be included in the electronic health record based at least in part on the input data.
16. The article of claim 15, the storage medium storing instructions that when executed by the computer cause the computer to:
- provide a user interface to allow selection of a plurality of documents associated with the patient; and
- provide a virtual desk to allow viewing and image manipulation of the selected documents.
17. The article of claim 16, the storage medium storing instructions that when executed by the computer cause the computer to:
- provide a user interface to allow overlaying of at least two of the documents to allow a time lapse view of medical history.
18. The article of claim 17, wherein the at least two documents comprise image scans of the patient.
19. The article of claim 15, the storage medium storing instructions that when executed by the computer cause the computer to:
- display a marker on the displayed anatomical model based at least in part on the user interaction with the anatomical model, the marker being associated with documentation about a chief complaint associated with a location on the model identified with the marker; and
- filter candidate information to be included in the documentation based at least in part on a location of the marker on the anatomical model.
20. The article of claim 19, wherein the documentation comprises a subjective, objective, assessment and plan (SOAP) note.
Type: Application
Filed: Mar 6, 2015
Publication Date: Oct 1, 2015
Inventor: Mark A. Pruitt (Hood River, OR)
Application Number: 14/640,128