Customizable Extended Reality Patient Simulator and Method Thereof for Healthcare Education
An educational system and method to provide various extended reality environments that are easily adaptable to a user's educational needs. The system includes a controller, a display, one or more sensors, a feedback unit and a rapid case creation tool. The rapid case creation tool is formed on the display to provide a graphical representation of at least a portion of the executable program elements to the user for manipulation in order to create and customize logic for a particular case or related learning, training or educational experience. In one form, the rapid case creation tool includes a case data module and a case logic module cooperative with one another such that upon manipulation by the user of at least a portion of the graphical representation of the executable program elements, the system performs at least one of creation, modification and operation of an extended reality patient case.
This application claims priority to U.S. Provisional Application 62/968,417 that was filed on Jan. 31, 2020.
The present disclosure relates generally to healthcare education technologies, and more particularly to a customizable extended reality patient simulator and method for use in healthcare education and training.
BACKGROUND

Virtual reality (VR) and its variants provide immersive user simulation experiences that appear to a user to place him or her into a virtual environment that supersedes the real-world environment, thereby helping to create a suspension of disbelief and convincing the user's brain to perceive the experience as real, irrespective of the nature of the environment that is being simulated. Likewise, augmented reality (AR) provides computer-generated information that may be superimposed on a view being portrayed within a user's physical environment in order to provide contextually relevant information to the user. Furthermore, mixed reality (MR)—much like AR—presents the image information as an augmentation to the user, while additionally integrating any virtual content in a contextually meaningful way. Lastly, when VR, AR or MR are combined with sensor-based human-machine interactive systems to include one or more of closed-loop control, data-based machine learning or related software-based analytics, they may be subsumed under a larger class known as extended reality (XR) or cross reality to present to the user the fullest of the immersion experiences. Within the present disclosure, the acronym XR will be used, with the understanding that the device and methods disclosed herein may be equally applicable to allow information to flow between one or more of the particular VR, AR and MR variants, and that any distinctions or particular applicability of a particular one of which will be apparent from the context.
The use of spatial computing to take a user's physical body movements as commands or input into an interactive digital operating system such that a perceived three-dimensional physical space created by the operating system provides audio, visual, brain wave and haptic-based feedback to such user has numerous applications, particularly as a way to train a user in a particular form of immersive environment, including those for gaming, military, police and tactical, medical or other scenarios. Although these existing systems are suitable for their intended purposes, they lack the functionality to adapt a virtual environment rapidly and easily in response to particular healthcare educational user needs.
SUMMARY

According to one aspect of the disclosure, a customizable XR patient simulator system includes a controller, numerous sensors, a display, a feedback unit and a rapid case creation tool that is formed on the display to provide a graphical representation of at least a portion of program instructions in the form of executable program elements to the user. The rapid case creation tool includes a case data module and a case logic module that are cooperative with one another such that upon manipulation by the user of one or more graphical representations of the executable program elements from each of the case logic module and the case data module, the system performs at least one of: creation of an extended reality patient case, modification of the extended reality patient case and operation of the extended reality patient case.
According to another aspect of the disclosure, a method of customizing an XR patient for use in an XR environment for healthcare education includes operating a customizable XR patient simulator system. In one form, the method includes one or more of the steps set forth in
According to another aspect of the present disclosure, a method of customizing an XR patient for use in an XR environment for healthcare education is disclosed. The method includes presenting, on a display, a rapid case creation tool comprising a graphical representation of executable program elements. Thus, upon receipt of input from a user to select and graphically manipulate the graphical representation, case logic is formed on the display. Such case logic includes selected ones of the graphical representation that can be manipulated in order to instruct a system upon which the XR environment is presented to perform one or more of creation, modification and operation of an XR patient case. In one form, the case logic is displayed as a plurality of interconnected and editable nodes on the display, where such nodes may be made up of actions, checks, effects and effect chains. Likewise, the method may include performing one or more of the steps of the previous aspect.
Reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
The present disclosure is directed to devices, systems and methods that implement a technological solution for providing various XR scenarios that are easily adaptable to a user's educational needs in order to make the user's training experience more realistic. Within the present context, such user experience includes one or more of a location, environment, patient and additional factors which add to a user's immersion and retention.
Various features and advantageous details are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known materials, processing techniques, components, and equipment are omitted so as not to unnecessarily obscure the disclosure in detail. It should be understood, however, that the detailed description and the specific examples, while indicating embodiments of the disclosure, are given by way of illustration only, and not by way of limitation. Various substitutions, modifications, additions, and/or rearrangements within the spirit and/or scope of the underlying inventive concept will become apparent to those skilled in the art from this disclosure.
Referring first to
A rapid case creation tool 145 enables on-screen creation and modification of fully dynamic configurations through various modules. In particular, an extended reality patient case (XRPC) may be accessed, built and customized by a user based upon data that is contained in a case data module 139 using case logic from a case logic module 135. With the rapid case creation tool 145, a user may quickly (typically, within minutes) create or customize the case logic for a case that describes a scenario of one or more simulated (that is to say, XR) patients 158 and that can be played immediately thereafter in an XR environment 121 or (if desired) on a web browser. As such, a customized training scenario for an XRPC may be quickly and easily created, modified or operated on. In one form, the creation or modification of an XRPC may be performed by a teacher, instructor or related educator for the purpose of training students, while the operation of the XRPC may be performed by a trainee, student or other individual whose performance is being evaluated. By presenting a customized training scenario, the system 100 determines how run-time interactions between a user and the XR patient 158 impact at least one simulated medical condition of the XR patient 158.
Various user-defined interconnected nodes (which will be described in more detail as follows) form part of the entered case logic in order to dictate how the actions of a trainee, student or related user impact the XR patient 158 where such impact may be any action that elicits a response from—or produces an effect upon—such patient, including changes in a health condition. In one form, the information generated within the rapid case creation tool 145 may be stored in the database 150 (which may be any of the same as, cooperative with or independent of, memory 112). In one form, the system 100 is configured such that some or all of the aforementioned components cooperate to present the rapid case creation tool 145 as a Graphical User Interface (GUI, which in one form may be more simply referred to as a user interface (UI)) with visual representations of executable program elements (that is to say, program instructions, data or the like that are part of a programming language and that have been reduced to machine code or related form for operation upon by the processor 111) such that a user may graphically select and manipulate such visual representations in order to form the case. In one form, such visual representation may exist as human-readable syntax corresponding to a code command, snippet, instruction or piece of data, while in others as an icon corresponding to such command, snippet, instruction or data. By including such GUI functionality, the feedback unit 160 acts as a translator for converting the various visual representations and related strings of command (such as those that will be discussed in more detail in conjunction
Within the present disclosure, the term “case” is the description within the XRPC of a scenario that may be played out within the XR simulation on system 100. Further within the present disclosure, terms such as “case model” and “medical case training scenario” or the like are deemed to be the equivalent of a case, particularly when placed within the context of a particular medical case training exercise as implemented by system 100. Within the present disclosure, the term “case logic” represents the customizable and dynamic set of rules that is stored in the case logic module 135, while the term “case data” represents various forms of data pertaining to (among others) one or more of the XR patient 158, the XR environment 121 and available actions for use by the user that is stored in the case data module 139.
Multiple XR patients 158 may be presented as part of a given case, where an object of each XR patient 158 may be thought of as a snapshot of the patient's state, as well as available user actions (such as will be discussed in more detail in conjunction with an actions sub-module 1391) that may be taken on the XR patient 158 in order to transition from one state to another. Within the present disclosure, the term “snapshot” includes various information pertaining to the XR patient 158 at a given moment in time, while the term “state” corresponds to the particulars (such as will be discussed in more detail in conjunction with a state sub-module 1390) of the XR patient 158 contained within the snapshot. In one form, such a construct can be thought of as allowing the XR patient 158 to be modeled with sequential behavior such as that depicted by a state machine. Both the state and actions sub-modules 1390, 1391 will be discussed in more detail in conjunction with
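By way of a non-limiting illustration, the following C# sketch shows one way in which the snapshot, state and action constructs described above could be modeled as a simple state machine; the class and member names are hypothetical and are not taken from the actual implementation.

using System;
using System.Collections.Generic;

// Illustrative only: an XR patient modeled as a state machine in which each
// snapshot captures the patient's state at a moment in time and each available
// user action transitions the patient from one snapshot to the next.
public class PatientSnapshot
{
    public string StateName { get; set; }                  // e.g. "Stable", "Deteriorating"
    public Dictionary<string, double> Vitals { get; } = new Dictionary<string, double>();
    public DateTime CapturedAt { get; set; }
}

public class PatientAction
{
    public string Name { get; set; }                       // e.g. "AdministerMedication"
    // Given the current snapshot, produce the snapshot the patient transitions to.
    public Func<PatientSnapshot, PatientSnapshot> Transition { get; set; }
}

public class XrPatient
{
    public PatientSnapshot Current { get; private set; }
    public List<PatientAction> AvailableActions { get; } = new List<PatientAction>();

    public XrPatient(PatientSnapshot initial) => Current = initial;

    // Applying one of the available actions advances the patient to its next state.
    public void Apply(PatientAction action) => Current = action.Transition(Current);
}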
Referring with particularity to
In one form of operation, the XR environment 121 is rendered and presented by the controller 110 on the XR display 120 as a game simulation (or more particularly, a medical simulation or healthcare simulation) using a conventional virtual reality operation/presentation application, such as the Unity Virtual Reality application from Unity Technologies of San Francisco, Calif. The rapid case creation tool 145 is written in a conventional software language, such as C and its object-oriented variants C++ or C#, JavaScript or other known approaches, where cooperating objects are instantiated from classes based on abstraction, encapsulation, polymorphism and inheritance. By selecting from graphical menus within the case data module 139 or one of its sub-modules, the rapid case creation tool 145 can create a customizable XR environment 121 for the associated XRPC and output the resulting logic in a form (such as a scripting language format) that is executable by the system 100 in order to implement the XR environment 121 on the XR display 120. It is to be appreciated that other conventionally available VR, AR and MR applications and their associated scripting language, as well as other software programming languages, may also be used in order to implement the rapid case creation tool 145 on system 100.
Referring with particularity to
As can be seen in
As will be discussed in further detail as follows, sensors 130 may collect sensor data from the user, and the sensor data may be provided to controller 110 to indicate that the user has interacted with the XR patient 158. For example, in various embodiments, the controller 110 may be configured to determine that the user has reached, or is reaching, for the XR patient 158 or a virtual object such as the stethoscope 147, based on the location of the user with respect to the XR patient 158 or such virtual object. In one particular implementation, the determination that the user has reached, or is reaching, for the XR patient 158 or virtual object may be based on the location of the user's virtual arm or hand 149 with respect to such patient or object. For example, in embodiments, an area of interactivity around the XR patient 158 or virtual object may be established. In other examples, the XR patient 158 or virtual object may be determined to be within a virtual area, referred to herein as a virtual target area. In some aspects, this virtual target area may refer to a three-dimensional space, area, or threshold within the XR environment 121 within which the virtual object may be located. In these embodiments, reaching the virtual object may require reaching at least the virtual target area.
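By way of a non-limiting illustration, the following C# sketch shows one way a virtual target area of the kind described above could be represented as an axis-aligned three-dimensional volume that is tested against the tracked position of the user's virtual hand or a held virtual object; the type and member names are hypothetical, and the use of System.Numerics is an assumption made for illustration.

using System.Numerics;

// Illustrative only: an axis-aligned volume around a virtual object (or around a
// location on the XR patient) that the tracked position of the user's virtual
// hand must enter before an interaction is registered.
public readonly struct VirtualTargetArea
{
    private readonly Vector3 _min;
    private readonly Vector3 _max;

    public VirtualTargetArea(Vector3 center, Vector3 halfExtents)
    {
        _min = center - halfExtents;
        _max = center + halfExtents;
    }

    // True when the supplied position (for example, the virtual hand or the tip of
    // the virtual stethoscope) lies within the three-dimensional target area.
    public bool Contains(Vector3 position) =>
        position.X >= _min.X && position.X <= _max.X &&
        position.Y >= _min.Y && position.Y <= _max.Y &&
        position.Z >= _min.Z && position.Z <= _max.Z;
}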
In one form, the user-patient interaction between the user and the XR patient 158 within the XR environment 121 may include having the controller 110 determine that the user is reaching for or touching a particular location on the XR patient 158 by determining that the user's gaze within the XR environment 121 is placed upon the particular location of the XR patient 158. In this case, when the user is virtually looking at the XR patient 158, the controller 110 may determine that a movement of the user's hand 149 or head (not shown), in combination with the user's gaze, toward a particular area of the XR patient 158, indicates that the user may be virtually touching the XR patient 158 in the particular place. For example, the user may desire to place the virtual stethoscope 147 at a particular location on the XR patient 158 in order to detect sound differences presented at that location, as well as at other locations on the XR patient 158. In one form, the touching may be done directly through the virtual hand 149, while in another form done indirectly through the virtual stethoscope 147 or a related virtual instrument. Such other instruments may include one or more medical devices such as, and not limited thereto, a needle, a tongue depressor or a penlight or related devices for inspecting the mouth, throat, ears, eyes or the like, all of which are within the scope of the present disclosure.
As can be seen in
Referring again to
User data that may be captured by the sensors 130 includes conformation, location, movement, speed, velocity, tilt, position, force, acceleration or the like in the XR environment 121, as well as locations on the XR patient 158 based on the location, movement or position of the user in the real world. In some aspects, the sensors 130 may be configured to be placed upon the user's body, such as on one or more arms, hands, legs, torso or the like. The captured sensor data may be related to a particular action 213 needed to be tracked for determining a reaction of the XR patient 158 in the XR environment 121. For example, the sensors 130 may be configured to take measurements with respect to the user's actual hand locations, including whether the hand has moved or may be moving, the speed of the movement, the extent or range of the movement, the location of the user's hand with respect to the XR environment 121 or the like. In this manner, the measurements captured by the sensors 130 may indicate whether the user's virtual hand 149 or a virtual object such as stethoscope 147 held in the user's virtual hand 149 is contacting the XR patient 158 and if so, where it is contacting the XR patient 158 with respect to the XR environment 121. This sensor data and information may be used to determine the status of the user's interaction with the XR patient 158 with respect to a particular response according to the case logic module 135. In other aspects, the sensors 130 may be configured to capture user data without being placed upon the user. For example, the sensors 130 may be mounted external to the user, such as in the form of motion detectors, microphones or the like that are placed around the real-world environment or mounted (such as on stands, other devices or the like) around the area where the user may be expected to move.
In one form, the sensors 130 may comprise a sensor array that may be made up of similar sensors configured to capture a particular type of data. For example, the sensors 130 in the sensor array may be similarly configured to capture acceleration information. In other aspects, the sensors 130 in the sensor array may be configured to capture different types of data; in such a configuration, one sensor 130 in the sensor array may be configured to capture acceleration information, while another sensor 130 may be configured to capture location information. Likewise, another sensor 130 may be configured to capture verbal inquiries and responses of the user. In one form, the sensor data that is captured by the sensors 130 may be provided to controller 110 for processing the acquired data, as well as to determine if any modifications need to be made to the XR environment 121, as previously discussed. Lastly, motors, haptic devices, microphones, speakers or the like that are responsive to the feedback unit 160 may be cooperative with (or form a part of) some of the sensors 130 in order to correlate movement or other dynamic-based ones of the interactions between the user and one or both of the XR environment 121 and the XR patient 158. In this way, the controller 110, the XR display 120, the sensors 130 and the feedback unit 160 cooperate to provide the user with the necessary audio, visual, brain-computer interface and haptic responses within the XR environment. Thus, generated feedback and the corresponding cooperation among at least these components help to correlate any interaction between the user and the XR environment 121 and the XR patient 158 that is presented therein to the user's sensory-based immersion within such environment.
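By way of a non-limiting illustration, the following C# sketch shows one way a sensor array made up of sensors that capture different types of data might be represented and polled by the controller; the interface and class names are hypothetical.

using System.Collections.Generic;

// Illustrative only: each sensor reports one kind of reading (acceleration,
// location, audio, ...) and the array is polled for every sensor's latest sample.
public interface ISensor
{
    string Kind { get; }          // e.g. "acceleration", "location", "audio"
    double[] Read();              // most recent sample from this sensor
}

public class SensorArray
{
    private readonly List<ISensor> _sensors = new List<ISensor>();

    public void Add(ISensor sensor) => _sensors.Add(sensor);

    // One (kind, values) pair per sensor, in the order the sensors were added.
    public List<KeyValuePair<string, double[]>> Sample()
    {
        var readings = new List<KeyValuePair<string, double[]>>();
        foreach (var sensor in _sensors)
            readings.Add(new KeyValuePair<string, double[]>(sensor.Kind, sensor.Read()));
        return readings;
    }
}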
In some aspects, not all of the data that is acquired by the sensors 130 may be used to modify the XR environment 121. For example, some of the data may be used to determine whether and how the user is progressing in achieving the stated training objectives of the XRPC. This progress determination may be made by comparing sensor data related to a particular action metric. For example, sensor data associated with particular actions 213 of the user may be stored in the database 150 for a first educational session of the XRPC. During a subsequent educational session of the XRPC, sensor data associated with the particular actions 213 of the user may be collected and compared with the sensor data collected during the first educational session to determine if there has been an improvement with respect to accomplishing the stated training objectives of the XRPC. A progress report may be made available to the user, such as through I/O unit 140.
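By way of a non-limiting illustration, the following C# sketch shows one way sensor-derived action metrics stored from a first educational session could be compared against those of a subsequent session to gauge progress toward the stated training objectives; the particular metric (time to complete each tracked action) and the method name are assumptions made for illustration.

using System.Collections.Generic;
using System.Linq;

// Illustrative only: each entry maps a tracked action to the time (in seconds)
// the user took to complete it; a positive average delta indicates that the
// later session was completed faster than the first session.
public static class ProgressReport
{
    public static double AverageImprovement(
        IReadOnlyDictionary<string, double> firstSession,
        IReadOnlyDictionary<string, double> laterSession)
    {
        var deltas = firstSession
            .Where(entry => laterSession.ContainsKey(entry.Key))
            .Select(entry => entry.Value - laterSession[entry.Key])
            .ToList();
        return deltas.Count > 0 ? deltas.Average() : 0.0;
    }
}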
In one form, the I/O unit 140 may include a display, keyboard, mouse or related device, and may be configured to display a GUI, such as the rapid case creation tool 145, structured to facilitate visual scripting-based input and output operations in accordance with aspects of the present disclosure. I/O unit 140 may be configured to accept input from one or more users, including input for the creation of, selection of, or editing of the various modules (and sub-modules) discussed herein through the rapid case creation tool 145 that in one form may be saved to and retrieved from the database 150. In one form, the I/O unit 140 may be configured to provide output which may present, display or reproduce the XR patient 158 within the XR environment 121. In these cases, an instructor may be able to monitor what the user is perceiving in the XR environment 121.
In one form, the database 150 (a common example of which is the open-source native multi-model database system developed by ArangoDB GmbH) may use a JSON-based storage format to facilitate storage operations. In one form, the database 150 may be running as a case application programming interface (API, that is to say, “app”) server with which to provide the JSON file for the associated case data over a network. In addition to data and logic associated with the XRPC and its associated case logic and data modules 135, 139, the database 150 may be configured to store previously measured sensor data, user actions 213, user profile information or the like. In some aspects, the database 150 may be integrated into the memory 112, or may be provided as a separate module. In yet other aspects, the database 150 may be a single database, or may be a distributed database implemented over a plurality of database modules. Relatedly, the database 150 may be configured to store information for a plurality of XR patients 158, as well as for one or more uniquely-identifiable users and various training and learning operations, exercises, scenarios or the like.
Feedback unit 160 may be communicatively coupled to the controller 110 to receive a feedback signal therefrom based on a virtual action 213 being performed in the XR environment 121. In this manner, feedback unit 160 provides a real-world response, such as sounds, vocal responses and tactile responses, in order to assist the user in the performance of the XRPC. The following sections describe the data model used for representing the XRPC in the game simulation.
Referring next to
Referring with particularity to
Referring with particularity to
Referring with particularity to
When the XRPC is loaded and run, the associated data from the case data module 139 (as well as its sub-modules 1390 and 1391) is placed into a JSON file and transferred to the controller 110 for implementation. It is to be appreciated that although the case data module 139 and code implementation acting thereon may adhere to a JSON format, any other format may be used in order to serve a particular implementation (such as an extensible markup language (XML) format, a hypertext markup language (HTML) format or the like). One manner in which the JSON or related data interchange format may be transferred is through a web service, such that a public or private network entity may employ cloud-based computing or storage that is accessible through the internet or related networks to a single client or a distributed set of clients. Such a web service may be made up of numerous data centers that in turn may include one or more actual or virtualized computer servers, storage devices, networking equipment or the like in order to implement and distribute the offered web services.
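By way of a non-limiting illustration, the following C# sketch shows how a simplified case-data payload could be serialized to, and deserialized from, the JSON interchange format discussed above using the standard System.Text.Json library; the payload fields shown are hypothetical simplifications rather than the actual case data schema.

using System.Collections.Generic;
using System.Text.Json;

// Illustrative only: a simplified case-data payload of the kind a case API server
// could return as JSON when the XRPC is loaded and run.
public class CaseDataPayload
{
    public string PatientName { get; set; }
    public string Environment { get; set; }
    public Dictionary<string, double> InitialVitals { get; set; }
    public List<string> AvailableActions { get; set; }
}

public static class CaseDataTransfer
{
    // Serialize the payload for transfer over a network or web service.
    public static string ToJson(CaseDataPayload payload) =>
        JsonSerializer.Serialize(payload, new JsonSerializerOptions { WriteIndented = true });

    // Rebuild the payload on the receiving side (for example, at the controller 110).
    public static CaseDataPayload FromJson(string json) =>
        JsonSerializer.Deserialize<CaseDataPayload>(json);
}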
Referring with particularity to
Within the present context, the following naming formats are generally employed. The [name] is the variable or field name, whereas the right side of the colon is the data type (sometimes an individual type and sometimes a list or array of a type). The formula then may follow one of two general forms as follows: fieldName: FieldTypeName; and fieldName: Array<FieldTypeName>. When the data is just a word in all caps, then those are enum values, such as FORMATIVE and SUMMATIVE. One exemplary form could be for names of the XR patient 158, such as PatientNames{AMY, BEN, HAWTHORN, MILITARY_AMY, MILITARY_BEN, MILITARY_HAWTHORN} or Environments{DISPATCH, AMBULANCE_TYPE III, AMBULANCE_TYPE II, HOTEL_BATHROOM, BEDROOM, CLINIC, DINING_ROOM, ARCADE, EMERGENCY_ROOM, HOSPITAL, LIVING_ROOM, ALLEY_DAY, ALLEY_NIGHT, LAUNDROMAT, HOTEL_ROOM, SUBWAY_PLATFORM, SUBWAY_TRAIN, CITY_PARK_DAY, CITY_PARK_NIGHT, GYMNASIUM, STREET_CORNER, STREET_CORNER_NIGHT, URBAN_RIVERBED_DAY, URBAN_RIVERBED_NIGHT, PUBLIC_POOL, MILITARY_POOL, MILITARY_TENT, POOL_HAL}. Certain entries, such as the patient effect history 1390O (shown at the bottom of
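By way of a non-limiting illustration, the following C# sketch shows how the “fieldName: FieldTypeName” and “fieldName: Array&lt;FieldTypeName&gt;” declarations and the all-caps enum values described above might map onto typed fields and enumerations; only a handful of the listed values are reproduced, and the containing class and its field names are hypothetical.

using System.Collections.Generic;

// Illustrative only: all-caps words become enum values, a single type becomes a
// plain field, and Array<FieldTypeName> becomes a list of that type.
public enum CaseType { FORMATIVE, SUMMATIVE }

public enum PatientNames { AMY, BEN, HAWTHORN, MILITARY_AMY, MILITARY_BEN, MILITARY_HAWTHORN }

public enum Environments { DISPATCH, CLINIC, EMERGENCY_ROOM, HOSPITAL, BEDROOM, LAUNDROMAT }

public class CaseHeader
{
    public CaseType caseType;                    // caseType: CaseType
    public PatientNames patientName;             // patientName: PatientNames
    public Environments environment;             // environment: Environments
    public List<string> patientEffectHistory;    // patientEffectHistory: Array<String>
}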
Referring next to
Referring with particularity to
Referring with particularity to
Within the present context, an effect node 1371C only changes one thing, whereas an effect chain node 1371D acts as a wrapper that has a whole subset of nodes within it, potentially changing many things. A blown-up version of the internals of a representative effect chain node 1371D can be seen in
In operation, the action node 1371A defines the user actions (such as those depicted in the actions sub-module 1391 of
As shown, the data nodes 1371 may be visually interconnected by a simple selection (such as through a mouse click) to present the interconnection line 1373 for actions that may involve two or more objects and that the user may drag to a desired interconnection point 1375. The case logic may be saved to the database 150 via a simple save button 1377, while an interconnection may be undone via a previous button 1379. It is to be appreciated that the data nodes 1371, when operatively joined via interconnections 1373 in the manner described, provide a simple, easily extendable and highly dynamic representation of the XRPC and the associated case logic module 135. Significantly, this modification and customization may be made to take place without the need for a user or instructor to perform the more laborious task of making modifications through the computer programming language that underlies and implements the rapid case creation tool 145 and the XR environment 121. As shown, the student, instructor or other interested user simply selects (such as through a mouse, keyboard or other I/O unit 140) which ACE system parameters are to take place through the placement of such data nodes 1371 in the logic design window 1370, along with the subsequent interconnection line 1373 placement, all as previously described. From there, the user may define one or more effects for any interconnected action via drop-down box selection of the effect node 1371C, as well as timing for such an effect via an associated timer node 1371E. As such, the interconnection lines 1373 promote customization of portions of the case logic (such as the ACE portion) in that when placed between a pair of the data nodes 1371, the interconnection lines 1373 provide a so-called logic bridge between the connected nodes 1371 to establish a truth function, conditional operation or related logical connection therebetween. In a similar way, the performance of a check with an action, as well as which effect chain should be followed as a result of the check, may be quickly set up.
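By way of a non-limiting illustration, the following C# sketch shows one way the interconnected data nodes and interconnection lines described above could be represented in memory as a directed graph; the enumeration and class names are hypothetical.

using System.Collections.Generic;

// Illustrative only: each data node is an action, check, effect, effect chain or
// timer, and each interconnection line becomes a directed edge (a "logic bridge")
// from one node's output to the next node's input.
public enum NodeKind { Action, Check, Effect, EffectChain, Timer }

public class CaseNode
{
    public int Id { get; set; }
    public NodeKind Kind { get; set; }
    public string Label { get; set; }                         // e.g. "Administer oxygen"
    public List<CaseNode> Outputs { get; } = new List<CaseNode>();
}

public class CaseLogicGraph
{
    public List<CaseNode> Nodes { get; } = new List<CaseNode>();

    // Placing an interconnection line between two nodes establishes the logical
    // connection that is followed when the case is run.
    public void Connect(CaseNode from, CaseNode to) => from.Outputs.Add(to);
}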
An example case logic in general (and check node branching logic as shown with particularity in
Referring next to
As discussed elsewhere, the timer node 1371E may be made to begin on the start of a case, or as a triggered effect node 1371C in the case logic module 135. The timer node 1371E serves also as a continuation point for the case logic module 135. In the rapid case creation tool 145, if a displayed timer node 1371E does not have a connection (interconnection 1373) from the left side, it is an “initial timer”. These will start counting down when the game becomes active. If the node does have a connection from the left side, it is an “effect timer”. These will start counting down when the previous node output triggers into the timer. Effects with an effectCompleteDuration parameter 201B that is greater than zero, an expireDuration parameter 201A that is greater than zero, or an absoluteVitalDelay parameter (not shown) that is greater than zero can also create timers. It is to be appreciated that placement of the data nodes 1371 within the logic design window 1370 of
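By way of a non-limiting illustration, the following C# sketch captures the two timer behaviors described above, in which an “initial timer” (no connection on its left side) begins counting down as soon as the case becomes active while an “effect timer” begins only when the preceding node triggers it; the class and member names are hypothetical.

using System;

// Illustrative only: a timer node that either starts with the case ("initial
// timer") or starts when triggered by the preceding node ("effect timer").
public class TimerNode
{
    public bool HasIncomingConnection { get; set; }
    public double SecondsRemaining { get; private set; }
    public bool Running { get; private set; }

    public TimerNode(double duration) => SecondsRemaining = duration;

    public void OnCaseStarted()
    {
        if (!HasIncomingConnection) Running = true;           // initial timer
    }

    public void OnPreviousNodeTriggered()
    {
        if (HasIncomingConnection) Running = true;            // effect timer
    }

    // Called each simulation tick; returns true when the countdown completes.
    public bool Tick(double deltaSeconds)
    {
        if (!Running) return false;
        SecondsRemaining = Math.Max(0.0, SecondsRemaining - deltaSeconds);
        return SecondsRemaining <= 0.0;
    }
}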
Referring next to
Referring with particularity to
The actions group 330 may include a UI 330A for access to settings, controllers, voice commands and the simulated tools (such as the previously-discussed stethoscope, penlight or the like); all of these may be used to feed into the user action 330B (which in one form may be used to control the previously-discussed actions 1371A of
Referring with particularity to
Once the training exercise has been set up, steps associated with the use of the started training exercise are followed. As a threshold matter, if at any point during the exercise the timer reaches zero or the XR patient 158 is deemed to have “died”, a fail event is called, which has the effect of disabling any timers and any further input. All action and completed events are sent across the network through the network manager 310B and put in a queue in a network instigator buffer (not shown) so they can be processed in the appropriate order. Any timestamp differences due to lag are reconciled and passed throughout the case logic module 135 until they are fully accounted for. First, in step 405, a student interacts with a menu in the UI; this has the effect of having the student interact with a virtual object in XR environment 121 while the case logic from the case logic module 135 of another user (such as an instructor) includes an initial timer (that is to say, a timer node 1371E that is not related to any predecessor nodes) that is set to zero and that has the effect of doing the same to the student user's timer. Next, in step 406, the action or timer nodes 1371A, 1371E create a network event in the network manager 310B and call the function NetworkRequestPatientAction (CustomEventCode, patientIndex, object[ ]) to request that an action be taken via the action node 1371A, while the function NetworkLogicTimerComplete (historyOutputTime, key) is called via the timer node 1371E. The timestamp for any action, as well as any complete event time from the timer node 1371E, are compared within the network manager 310B upon receipt, and if a timestamp is prior to the last received timestamp, that timestamp is pushed forward to the last timestamp received plus 1 millisecond as part of a type punning approach to ensure that the chronology matches the receipt order in order to be consistent with the programming language's type system. Each timer node 1371E has a key such that the data is created and stored locally and is passed from all users, although only the first one received will be used to unlock and retrieve its local case logic data. In essence, the code may handle multiple different clients handling a timer event in a reliable and decentralized fashion. In this way, each client will store the same data and send the exact same message out over the network, even though the certainty of when a particular client's message is received might not be initially ascertainable; thus, the system 100 may handle the first message and ignore the rest. The reason this is done over the network instead of just processing it locally on each client is so that the timer event is in proper chronological order with whatever user actions might be taking place over the network. In such a case, the key needs to be consistently unique across all of the users. The key format is: TTTTTTTCCCEEEELLLL (one digit unused, followed by a seven-digit timestamp “T”, a three-digit timestamp count “C” and an eight-digit node identification where the first four are the effect chain's internal node's identification “E” and the second four are the case logic's node identification “L”). Next, in step 407, the network event is received by the user and put into the network instigator buffer so it can be processed in the appropriate order by the case logic 340A from the core manager group 340. Within the present context, the case logic 340A of
- ProcessAssessmentAction(patient, . . . );
- ProcessProcedureAction(patient, . . . );
- ProcessOxygenAction(patient, . . . );
- ProcessLeadsAction(patient, . . . );
- ProcessPacingAction(patient, . . . );
- ProcessQuestionAction(patient, . . . );
- ProcessMedicationAction(patient, . . . );
- ProcessLabAssetOrdered(patient, . . . );
- ProcessLabAssetViewed(patient, . . . );
- ProcessLabResultOrdered(patient, . . . );
- ProcessLabResultViewed(patient, . . . );
- ProcessCompressionAction(patient, . . . );
- ProcessIvAction(patient, . . . ); and
- ProcessVitalAction(patient, . . . ).
The foregoing represent specific code functions that would be called when a user's input is identified. For example, if a user clicks the menu to insert an IV, ProcessIvAction(patient, . . . ) would be called in order to pass the corresponding input as parameters. The case logic 340A process is then run with the input compared to the case logic data to determine what would change in the simulation. It is understood that each of the functions corresponds to the list of actions previously identified in conjunction with
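By way of a non-limiting illustration, the following C# sketch shows the dispatch step described above, in which an identified user input is routed to the processing function for its action type with the corresponding input passed along as parameters; the dispatcher, the action-type strings and the placeholder bodies are assumptions made for illustration, with the function names echoing a few of those listed above.

// Illustrative only: route an identified user input to the processing function
// for its action type; only a few of the listed functions are shown.
public static class ActionDispatcher
{
    public static void Dispatch(object patient, string actionType, params object[] args)
    {
        switch (actionType)
        {
            case "Assessment": ProcessAssessmentAction(patient, args); break;
            case "Medication": ProcessMedicationAction(patient, args); break;
            case "Iv":         ProcessIvAction(patient, args);         break;
            case "Vital":      ProcessVitalAction(patient, args);      break;
            default:           /* remaining action types omitted */    break;
        }
    }

    static void ProcessAssessmentAction(object patient, object[] args) { /* placeholder */ }
    static void ProcessMedicationAction(object patient, object[] args) { /* placeholder */ }
    static void ProcessIvAction(object patient, object[] args)         { /* placeholder */ }
    static void ProcessVitalAction(object patient, object[] args)      { /* placeholder */ }
}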
The actions, checks, effects and effect chains that are associated with the chosen action are passed into the logic list 320G (which in one form may be embodied by logic list 135N of the case logic module of
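By way of a non-limiting illustration, the following C# sketch shows one pass over a logic list of the kind described above, in which each queued effect is applied, an effect flagged as the end effect raises the “win” event (after which timers and further input would be disabled), and a triggered effect with a non-zero duration has a timer created for its follow-on checks and effects; the types, flags and delegate parameters are hypothetical simplifications.

using System;
using System.Collections.Generic;

// Illustrative only: a queued effect carries a flag marking the end effect and a
// duration that, when greater than zero, causes a timer to be created for it.
public class QueuedEffect
{
    public string Name { get; set; }
    public bool IsEndEffect { get; set; }
    public double EffectCompleteDuration { get; set; }   // > 0 means a timer is needed
}

public class LogicListProcessor
{
    public Queue<QueuedEffect> LogicList { get; } = new Queue<QueuedEffect>();

    // Returns true when the end effect was reached (the "win" event), at which
    // point processing stops and remaining timers and input would be disabled.
    public bool ProcessEffects(Action<QueuedEffect> apply, Action<QueuedEffect> createTimer)
    {
        while (LogicList.Count > 0)
        {
            var effect = LogicList.Dequeue();
            if (effect.IsEndEffect) return true;
            apply(effect);
            if (effect.EffectCompleteDuration > 0) createTimer(effect);
        }
        return false;
    }
}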
Once the training exercise has been run, steps associated with ending a training exercise for the XR patient 158 are performed. In step 415, when the training session wraps up, the XR patient (or patients) 158 and menus disappear from the user's XR display 120. At this time, all student actions are displayed in a debriefing menu (not shown) with the appropriate messaging (such as “win”, “fail” or something comparable). In step 416, the student may click a button (such as a Return To Dispatch Exit Case button, not shown); this has the effect of teleporting the student or students back to the dispatch area, as well as unloading any extra scenes. At this time, any remaining case data is reset and the process may be repeated for a next case. It will be appreciated that although for the sake of brevity only certain exemplary function calls are shown, the present disclosure is not so limited. As such, the various function calls that correspond to any or all of the actions, checks, effects or the like are understood to be within the scope of the present disclosure.
It is noted that, in some implementations, the XR environment 121 may be available in situations in which disparate users may participate. For example, a particular XRPC may allow two or more users to participate in a multiplayer mode. In addition, modifications of the case logic module 135 for a particular XRPC may be made such that a skill level of the user(s) is taken into account. Thus, where a user is inexperienced or is having a difficult time with an XRPC, the case logic module 135 may be customized accordingly to ensure the user is progressing in their learning of the stated objectives for the reduced-complexity version of the XRPC. As will be appreciated, a system implemented in this way can provide consistent, objective grading in an industry that is currently largely subjectively graded across different skill levels.
In one form, details associated with the XR patient 158 within a particular case may correspond to activities and setups for beds, IVs, clothing, moulage (or related mock injuries or maladies), a three-dimensional patient mesh (that is to say, the representation of the XR patient 158), conditions, states, vitals, monitors, authoring tool, UI, voice commands, stethoscope, penlight and animator mechanism. Although not shown by arrows, it is understood that connectivity between the various blocks depicted as patient manager group 320, actions manager group 330 and core manager group 340 may be present in order to have moulage, sounds, clothing or the like, as these things are (or can be) customizable.
Referring next to
Those skilled in the related art would further appreciate that the various illustrative logical blocks, modules, circuits and algorithm steps described in connection with the disclosure may be implemented as electronic hardware, computer software or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Skilled artisans will also readily recognize that the order or combination of components, methods or interactions that are described herein are merely examples and that the components, methods or interactions of the various aspects of the present disclosure may be combined or performed in ways other than those illustrated and described herein.
Functional blocks and modules in the drawings may comprise processors, electronic devices, hardware devices, electronic components, logical circuits, memories, software codes, firmware codes or any combination thereof. Consistent with the foregoing, various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with the processor 111 (as depicted in
An exemplary form of the memory 112 is either coupled to or integral with the processor 111 such that the processor 111 can read information from, and write information to, the memory 112, as well as operate upon instructions that, when executed by the processor 111, cause the processor 111 to perform operations, such as part of the aforementioned ASIC which in turn may reside in a user terminal, base station, sensor or any other communication device. In the alternative, the processor 111 and the memory 112 may reside as discrete components in a user terminal. As discussed elsewhere, the memory 112 may be either cooperative with or part of the database 150.
In situations where the method or algorithm is at least partially embodied in a software module, the module may reside in memory 112 or related computer-readable media that can exist in the form of random access memory (RAM), flash memory, read-only memory (ROM), EPROM memory, EEPROM memory, registers, hard disk, removable disk, CD-ROM, solid state drives (SSDs), non-transitory computer readable medium or any other form of storage medium configured to store data in a persistent or non-persistent state as known in the art. Regardless of its form, the memory 112 may be configured to store program instructions, including parts thereof in the form of machine code or related executable program elements that implement the desired steps of a method or algorithm consistent with a given case. Within the present disclosure, the machine code forms one or more pieces of program structure that may be arranged as a set or related ordered sequence (such as depicted graphically in
The combination of structural and functional features of various components and corresponding case software in order to produce a relationship necessary to constrain implementation of the system 100 is described in more detail as follows. In one form, the software provides functional attributes in the form of instructions to the system 100 such that the structural attributes that are provided by the hardware of the processor 111—which is preconfigured to interpret executable forms of such instructions by virtue of its particular instruction set architecture (ISA)—impart specific meaning to such instructions. As will be understood, this ISA is responsible for organization of memory and registers, data structures and types, what operations are to be specified, modes of addressing instructions and data items, as well as instruction encoding and formatting. Thus, the ISA acts as an interface between the purely structural attributes of the processor 111 and the functional attributes of the system or application software through the implementation of ISA-specific machine code. It is this interrelationship that constrains the way in which the processor 111 is controlled in order to achieve the desired functionality. In one form, the software includes application software and system software where the former acts as an interface between the user and the latter, while the latter acts as an interface between the former and the computer hardware of system 100.
More particularly, the interrelationship between the system software and the hardware is established by virtue of a native instruction set that in turn is made up of an executable form of the system software under the particular ISA of the processor 111 and ancillary components within system 100. This platform-specific native instruction set includes executable program element portions that make up machine code or machine code sets that in turn allow the processor 111 to become particularly configured to perform a predefined set of operations in response to receiving a corresponding instruction from a separate set of machine codes, such as those associated with the application software and that are configured to effect the logic of a particular XRPC.
In a generally similar way, the application software that becomes a corresponding piece of machine code is predefined to perform a specific task; one or more such pieces may be arranged as a larger machine code set in order to achieve the functionality set forth in one or more steps that are associated with a particular case. In this way, source code created by a programmer (such as that corresponding to the data, states, actions, effects and other logic of
Significantly, the machine code, native instruction set and other portions of executable program instructions are understood as a physical manifestation of their underlying logic or data and as such become structural elements within the system 100 in much the same way as the processor 111, memory 112 and other components (such as those depicted in
In one form, the I/O 140 may be configured to coordinate I/O traffic between processor 111, memory 112 and any peripherals in the system 100, and may include performing any necessary protocol, timing or other data transformations to convert data signals from one component (such as memory 112) into a format suitable for use by another component (such as the processor 111), as well as support for devices attached through various types of peripheral buses, such as the Universal Serial Bus (USB) or the Peripheral Component Interconnect (PCI) bus standard. In one form, some or all of the functionality of I/O 140 may be incorporated directly into processor 111. In one form, the I/O 140 may be configured to allow data to be exchanged between the system 100 and other device or devices attached to a network or networks. In one form, the I/O 140 may support communication via any suitable wired or wireless general data networks, such as types of Ethernet networks, wireless networks or the like.
Within the present disclosure, the terms used to identify the various modules (such as the case logic module 135 and the case data module 139) all recite (in the form of compound nouns) self-sufficient pieces of structure that are identified by the function they perform. In this way, these terms have a sufficiently definite meaning, as the name for the module as structure is identified within the context of the corresponding function. Accordingly, the elements that make up these modules—regardless of being in the form of various computer software, firmware and hardware features—are described structurally to provide a tangible, definite interface between the user, the software and the computer hardware of the system 100 as a way to provide the functionality as discussed herein.
It is noted that the terms “substantially” and “about” may be utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. These terms are also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
Within the present disclosure, the use of the prepositional phrase “at least one of” is deemed to be an open-ended expression that has both conjunctive and disjunctive attributes. For example, a claim that states “at least one of A, B and C” (where A, B and C are definite or indefinite articles that are the referents of the prepositional phrase) means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. By way of example within the present context, if a claim recites that the system performs at least one of creation, modification and operation of an extended reality patient case, then performing any one of these alone, or any combination of them, satisfies the claim.
Within the present disclosure, the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 USC 112(f) unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
While particular embodiments have been illustrated and described herein, it should be understood that various other changes and modifications may be made without departing from the spirit and scope of the claimed subject matter. Moreover, although various aspects of the claimed subject matter have been described herein, such aspects need not be utilized in combination. It is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the claimed subject matter.
Claims
1. A customizable extended reality patient simulator system comprising:
- a controller configured to operate upon executable program elements that are in the form of case-specific information pertaining to an extended reality environment and case-specific interactions between a user of the system and an extended reality patient;
- a display signally cooperative with the controller and configured to depict to the user the extended reality environment with the extended reality patient situated therein;
- a plurality of sensors signally cooperative with the controller;
- a feedback unit signally cooperative with the controller, the display and the plurality of sensors to present to the user a sensory-based immersion within the extended reality environment; and
- a rapid case creation tool formed on the display to provide a graphical representation of at least a portion of the executable program elements to the user, the rapid case creation tool comprising: a case data module; and a case logic module cooperative with the case data module such that upon manipulation by the user of at least a portion of the graphical representation of the executable program elements from each of the case logic module and the case data module, the system performs at least one of: creation of an extended reality patient case; modification of the extended reality patient case; and operation of the extended reality patient case.
2. The customizable extended reality patient simulator system of claim 1, wherein the controller comprises at least one processor and memory cooperative with one another to respectively operate upon and store the machine code.
3. The customizable extended reality patient simulator system of claim 1, wherein the graphical representation comprises a logic design window configured to display editable case nodes that are selected from the case logic module.
4. The customizable extended reality patient simulator system of claim 3, wherein the editable case nodes comprise actions, checks, effects and effect chains.
5. The customizable extended reality patient simulator system of claim 4, wherein the editable case nodes further comprise a timer node.
6. The customizable extended reality patient simulator system of claim 4, wherein the actions comprise at least one of conducting an assessment, administering medication, placing an intravenous line, placing at least one lead, performing a compression, checking vital signs, reviewing orders and reviewing laboratory results.
7. The customizable extended reality patient simulator system of claim 4, wherein the logic design window is further configured to display interconnection lines that upon placement between a pair of the editable case nodes establishes a logical connection therebetween.
8. The customizable extended reality patient simulator system of claim 1, wherein the graphical representation comprises a plurality of sub-modules of the case data module, the sub-modules comprising a state sub-module and an actions sub-module.
9. The customizable extended reality patient simulator system of claim 1, further comprising an overlay that upon use superimposes a visual grid pattern on the extended reality patient within the extended reality environment.
10. A method of operating a customizable extended reality patient simulator system, the method comprising:
- retrieving case data for an extended reality patient from a database;
- converting the case data into data objects containing values corresponding to at least one of initial vitals, initial states, initial conditions, environments, placements, patient and case logic that includes effect chains;
- upon user interaction with a menu within a user interface to ascertain a virtual object in the extended reality environment, having an initial timer of the case logic from a case logic module commence counting;
- having at least one of an action or timer event create a network event in a network manager such that it calls at least one function;
- presenting the network event to the user;
- determining if there is an identifiable action such that if so, the case logic is called into the appropriate processing function for that action type with the appropriate action type objects;
- evaluating at least one check and adding any resulting effect and effect chain to a logic list;
- processing the logic list effect chain by calling a function;
- calling a function and scanning it for an end effect such that if found, a “win” event is called such that any timers and additional input are disabled, whereas if it is not found, the effects are sent so that any triggered effect will have a timer created with the effect's resulting checks and effects;
- passing the effect for the case along with its corresponding component, along with the patient data; and
- updating event listeners corresponding to updated ones of the data objects of respective vitals, conditions and states.
11. The method of claim 10, wherein the presenting the network event to the user comprises putting the presented network event into the network instigator buffer so it can be processed in order by the case logic from the core manager group.
12. The method of claim 10, wherein if the timer is being used, a case logic manager function configured to handle a HandleTimerComplete function is called.
13. The method of claim 10, wherein evaluating at least one check and adding any resulting effect and effect chain to a logic list is performed through a CaseLogicManager.ProcessChecks(patient, patient.logicList) function.
14. The method of claim 10, wherein the function called for processing the logic list effect comprises a ProcessEffectChains(patient) function that is contained in a logic case manager that forms part of the case logic module.
15. The method of claim 14, wherein internal connections of the effect chain are added to a standalone logic list and set to a Boolean “true” value.
16. The method of claim 10, wherein the function called and scanned for the end effect comprises ProcessEffects(patient, patient.logicList) function.
17. The method of claim 10, wherein passing the effect for the case and its corresponding component comprises at least one of a PatientVitalsManager function for vitals and a PatientAnimationManager function for animations.
Type: Application
Filed: Jan 29, 2021
Publication Date: Feb 16, 2023
Applicant: Ves, LLC (Wilmington, OH)
Inventors: Steven Karl Westhoff (Wilmington, OH), Nathanael Alan Anderson (Wilmington, OH), Serigne Saalihou Mbacké Ndiaye (Dakar)
Application Number: 17/796,798