METHOD AND SYSTEM FOR ANALYSING A VIRTUAL REHABILITATION ACTIVITY/EXERCISE

A method for analysing one of a rehabilitation activity and a performance of a user during a rehabilitation exercise, comprising: receiving one of a rehabilitation activity and executed movements performed by the user during the rehabilitation exercise, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the rehabilitation exercise; determining movement rules corresponding to the one of the rehabilitation activity and the rehabilitation exercise; determining a sequence of movement events corresponding to the one of the rehabilitation activity and the rehabilitation exercise, each one of the movement events corresponding to a given state of a property of a virtual user-controlled object in the interactive environment, the given state corresponding to one of a beginning and an end of a movement; and determining a movement sequence of elementary movements using the movement rules and the movement events.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority of U.S. Provisional Patent Application having Ser. No. 61/783,504, which was filed on Mar. 14, 2013, and is entitled “Method and system for creating and analysing a virtual rehabilitation activity”, the specification of which is hereby incorporated by reference.

TECHNICAL FIELD

The present invention relates to the field of physical rehabilitation for patients, and more particularly to virtual rehabilitation systems and methods.

BACKGROUND

There currently exist a number of software-based solutions to support physical rehabilitation. They include various activities and games that harness motion tracking systems such as the Nintendo Wii™ and Microsoft Kinect™. In certain cases, they allow clinicians to customize parameters that have a direct effect on the interactive environment. For example, by means of the software, clinicians may be able to control the location at which virtual objects are positioned in the interactive environment, or the speed at which they move. However, as clinicians are not accustomed to controlling interactive environments, they experience an intuitive disconnect between the interactive controls available to them and the clinical objectives they wish to accomplish with their rehabilitation activity program. This hampers the clinicians' ability to use the software tools with speed and effectiveness, and undermines the potential value to clinicians of any software-based rehabilitation system.

Therefore, there is a need for an improved method and system for analysing rehabilitation activities.

SUMMARY

In accordance with a first broad aspect, there is provided a computer-implemented method for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise, comprising: receiving one of a rehabilitation activity and executed movements performed by the user during the virtual rehabilitation exercise, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters; determining movement rules corresponding to the one of the rehabilitation activity and the rehabilitation exercise, each one of the movement rules comprising a correlation between a given group consisting of at least a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement; determining a sequence of movement events corresponding to the one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled element in the interactive environment, the given state corresponding to one of a beginning and an end of a movement; determining a movement sequence comprising at least one of elementary movements and a task-oriented movement using the movement rules and the movement events; and outputting the movement sequence.

In one embodiment, the step of receiving comprises receiving the rehabilitation activity.

In one embodiment, the step of determining movement rules comprises: determining a rehabilitation scenario that corresponds to the received rehabilitation activity; and retrieving from a database the movement rules that correspond to the determined rehabilitation scenario.

In one embodiment, the step of determining a sequence of movement events comprises determining the movement events from at least one of the input parameters.

In another embodiment, the step of determining a sequence of movement events comprises retrieving predefined movement events from a storing unit.

In one embodiment, the step of determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises: determining movement segments from the movement events; assigning a respective one of the movement rules to each one of the movement segments; and assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

In one embodiment, the method further comprises a step of determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.

In one embodiment, the step of determining the at least one clinical objective comprises: comparing a given one of the input parameters to a challenge threshold; and when the given one of the input parameters is greater than the challenge threshold, identifying a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.

In one embodiment, the method further comprises a step of generating and outputting an alert.

In one embodiment, the step of generating the alert comprises: comparing a given one of the input parameters to a danger threshold; and when the given one of the input parameters is greater than the danger threshold, identifying a potential danger for the patient.

In another embodiment, the step of receiving comprises receiving the executed movements performed by a user during a rehabilitation exercise.

In one embodiment, the step of determining movement rules comprises retrieving movement rules corresponding to the rehabilitation exercise.

In one embodiment, the step of determining a sequence of movement events comprises retrieving a sequence of ordered movement events corresponding to the rehabilitation exercise.

In another embodiment, the step of determining a sequence of movement events comprises receiving unordered movement event triggers corresponding to the rehabilitation exercise, and ordering the unordered movement event triggers, thereby obtaining ordered movement events.

In one embodiment, the step of determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises: determining movement segments from the movement events; assigning a respective one of the movement rules to each one of the movement segments; and assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

In accordance with a second broad aspect, there is provided a system for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise during which the user performs executed movements, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters, the system comprising: a movement rules determining module for determining movement rules corresponding to one of the rehabilitation activity and the rehabilitation exercise, each one of the movement rules comprising a correlation between a given group consisting of a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement; a movement events determining module for determining a sequence of movement events corresponding to one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled element in the interactive environment, the given state corresponding to one of a beginning and an end of a movement; and an elementary movement determining module for determining a movement sequence comprising at least one of elementary movements and a task-oriented movement using the movement rules and the movement events, and outputting the movement sequence.

In one embodiment, the movement rules determining module is adapted to receive the rehabilitation activity.

In one embodiment, the movement rules determining module is adapted to determine a rehabilitation scenario that corresponds to the received rehabilitation activity, and retrieve from a database the movement rules that correspond to the determined rehabilitation scenario.

In one embodiment, the movement events determining module is adapted to determine the movement events from at least one of the input parameters.

In another embodiment, the movement events determining module is adapted to retrieve predefined movement events from a storing unit.

In one embodiment, the elementary movement determining module is adapted to: determine movement segments from the movement events; assign a respective one of the movement rules to each one of the movement segments; and assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

In one embodiment, the system further comprises a clinical objective module for determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.

In one embodiment, the clinical objective module is adapted to: compare a given one of the input parameters to a challenge threshold; and when the given one of the input parameters is greater than the challenge threshold, identify a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.

In one embodiment, the system further comprises an alert module for generating and outputting an alert.

In one embodiment, the alert module is adapted to: compare a given one of the input parameters to a danger threshold; when the given one of the input parameters is greater than the danger threshold, identify a potential danger for the patient.

In another embodiment, the movement rules determining module is adapted to receive the executed movements performed by a user during a rehabilitation exercise.

In one embodiment, the movement rules determining module is adapted to retrieve movement rules corresponding to the rehabilitation exercise.

In one embodiment, the movement events determining module is adapted to retrieve a sequence of ordered movement events corresponding to the rehabilitation exercise.

In another embodiment, the movement events determining module is adapted to receive unordered movement event triggers corresponding to the rehabilitation exercise, and order the unordered movement event triggers in order to obtain ordered movement events.

In one embodiment, the elementary movement determining module is adapted to: determine movement segments from the movement events; assign a respective one of the movement rules to each one of the movement segments; and assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

In one embodiment, the given group further comprises a direction of change for the property.

In accordance with another broad aspect, there is provided a computer-implemented method for creating a rehabilitation activity, comprising: receiving clinical objectives; determining a rehabilitation scenario and corresponding movement rules adapted to the clinical objectives; determining movement events using the movement rules; determining customizable parameters for the rehabilitation scenario, thereby obtaining the rehabilitation activity; and outputting the rehabilitation activity.

In accordance with a further broad aspect, there is provided a system for creating a rehabilitation activity, comprising: a scenario determining module for receiving clinical objectives and determining a rehabilitation scenario and corresponding movement rules adapted to the clinical objectives; a movement events determining module for determining movement events using the movement rules; and a scenario parameter determining module for determining customizable parameters for the rehabilitation scenario in order to obtain the rehabilitation activity, and output the rehabilitation activity.

Features and advantages of the subject matter hereof will become more apparent in light of the following detailed description of selected embodiments, as illustrated in the accompanying figures. As will be realized, the subject matter disclosed and claimed is capable of modifications in various respects, all without departing from the scope of the claims. Accordingly, the drawings and the description are to be regarded as illustrative in nature, and not as restrictive and the full scope of the subject matter is set forth in the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

Further features and advantages of the present invention will become apparent from the following detailed description, taken in combination with the appended drawings, in which:

FIG. 1 is a flow chart of a method for analysing a rehabilitation activity, in accordance with an embodiment;

FIG. 2 illustrates a right-hand Cartesian coordinate system, in accordance with the prior art;

FIG. 3 is a block diagram of a system for analysing a rehabilitation activity, in accordance with an embodiment;

FIG. 4 is a flow chart of a method for determining the performance of a patient during a virtual rehabilitation exercise, in accordance with an embodiment; and

FIG. 5 is a flow chart of a method for generating a rehabilitation activity, in accordance with an embodiment.

It will be noted that throughout the appended drawings, like features are identified by like reference numerals.

DETAILED DESCRIPTION

The present methods and systems are described in the context of virtual rehabilitation. A virtual rehabilitation system is adapted to display an interactive simulation with which a patient may interact to execute a rehabilitation exercise. The virtual rehabilitation system comprises at least a simulation generator for generating the interactive simulation, a display unit for displaying the simulation to the patient, and a movement tracking unit for tracking a movement of at least a body part of the patient. For example, a given element of the rehabilitation simulation may be controlled by a given body part of the patient. While the simulation is displayed on the display unit to the patient, the patient executes a rehabilitation exercise during which he moves his given body part. The movement tracking unit tracks the movement of the given body part and transmits the position of the given body part to the simulation generator. The simulation generator modifies a characteristic/property of the given element of the simulation in substantially real time as a function of the new position of the given body part received from the movement tracking unit, thereby rendering the simulation interactive. For example, the position of the given element may be changed in the simulation according to the change of position of the tracked body part.
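By way of illustration only, the following Python sketch outlines the interactive loop described above, in which the tracking unit reports the position of a body part and the simulation generator updates the mapped virtual element in substantially real time. All names (FakeTracker, run_simulation_loop, etc.) are illustrative assumptions and do not form part of the described system.

import time

class FakeTracker:
    """Stand-in for a movement tracking unit such as a depth camera."""
    def get_position(self, body_part):
        # A real tracking unit would return the tracked 3D position of the body part.
        return (0.0, 0.0, 0.0)

def run_simulation_loop(tracker, element, body_part, steps=90, rate_hz=30.0):
    """Update the user-controlled element's position from the tracked body part."""
    for _ in range(steps):
        element["position"] = tracker.get_position(body_part)  # property updated from movement
        time.sleep(1.0 / rate_hz)  # refresh at the tracking/display rate

fish = {"position": (0.0, 0.0, 0.0)}  # user-controlled virtual element
run_simulation_loop(FakeTracker(), fish, "right_wrist", steps=3)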

Usually, a medical professional such as a therapist or a clinician creates the virtual rehabilitation exercise that is adapted to the needs and/or condition of a patient. In one embodiment, the medical professional is presented with a user interface for creating the virtual rehabilitation exercise. For example, the medical professional may access a database of rehabilitation exercises through the interface and select a given rehabilitation exercise that is adequate for the patient. The medical professional may further specify some parameters for the rehabilitation exercise.

The present description presents a method and system for analysing a rehabilitation exercise created by a medical professional in order to provide the medical professional with a list of movements to be executed by the patient according to the created rehabilitation exercise. The medical professional may then verify whether the rehabilitation exercise that he has created/selected is adequate for the patient. The method may also be used to analyse the movements performed by the patient during the rehabilitation exercise. The present description further presents a method and system for creating a virtual rehabilitation exercise.

In the context of the present description, a rehabilitation activity refers to an interactive environment to be used for generating a simulation that corresponds to a virtual rehabilitation exercise. In other words, a rehabilitation activity comprises all required information to create a virtual rehabilitation exercise. A rehabilitation scenario refers to an incomplete interactive environment, and comprises at least one virtual user-controlled element which is mapped to at least one body part to be exercised, and at least one virtual reference element of which a characteristic is not specified. The rehabilitation scenario may further comprise a background scene. In one embodiment, a rehabilitation scenario comprises a background scene and a set of one or more virtual elements with properties/parameters that may be configured by the medical professional. A property or parameter refers to any property of the background scene and the virtual elements, such as the color, size or position of an object, and/or any state or change in the interactive environment. A rehabilitation scenario may be seen as an incomplete rehabilitation activity of which some properties/parameters such as a property or characteristic of the virtual user-controlled element have not been specified. The unspecified properties/parameters may help to define the type of involved movement required, the movement characteristic (quality of movement) required, and/or the like.

Each rehabilitation scenario covers at least one corresponding general rehabilitation focus. Examples of general focuses comprise executive functioning, bi-lateral coordination, posture, balance, and the like.

For example, certain scenarios may relate to a general focus around bi-lateral shoulder/elbow movement or head movement, while other scenarios may relate to a general focus around trunk movement or hip movement. In one embodiment, a rehabilitation scenario further comprises its corresponding general focus(es). For example, the general focus may be included in the metadata of the rehabilitation scenario.

In the following, four examples of rehabilitation scenarios/activities are described.

Rehabilitation Scenario/Activity No. 1

The first scenario is directed to unilateral shoulder and elbow movement with a focus on continuous movement precision which may be expressed as an angle, a unit of distance, a percentage, or the like. The continuous movement precision represents the patient's deviation from a target movement axis or path. The present scenario mandates the performance of elementary movements with a certain required degree of continuous movement precision.

The first scenario comprises a 3D underwater background scene, a fish whose 3D position within the background scene is to be controlled by the shoulder and elbow movements of the patient, a sequence of food objects of which the position within the background scene determines a path to be followed by the fish to guide the movements of the patient, an optional piranha that may chase the fish, and optional obstacles for the fish. During the execution of the rehabilitation exercise corresponding to this scenario, an interactive simulation is displayed to the patient. The interactive simulation comprises the fish, the food objects, the piranha, and the obstacles inserted into the 3D underwater background scene. The patient moves his shoulder and elbow to move the fish so that the fish eats the food objects and avoids the obstacles. The patient should also control the speed of the fish so that it may not be caught by the piranha.

In one embodiment, the customizable properties/parameters for the first scenario may include:

the size of the food objects that controls the required precision of the patient's movement path;

the shape of the sequence of food objects or the location of the food objects within the 3D underwater background scene in order to control the shape of the user's required movement path. For example, the food objects may be arranged in the shape of a square, a triangle, a figure eight, etc.;

the speed of the piranha to control the minimum speed required by the patient's movements; and/or

the number of movement repetitions required.

It should be understood that the customizable properties/parameters for the first scenario may comprise additional properties/parameters such as the location and size of the obstacles.

When the customizable properties/parameters are set by the medical professional, the rehabilitation scenario is then referred to as a rehabilitation activity. For example, the following values may be assigned to the above-customizable properties/parameters:

size of food objects: 2 units in diameter;

shape of the sequence of food objects: square;

speed of the piranha: 5 cm per second;

number of movement repetitions required: 3.

In one embodiment, the rehabilitation scenario further comprises the following general rehabilitation focuses: continuous movement precision and unilateral shoulder and elbow movement.

While the first exemplary scenario refers to a continuous movement precision, it should be understood that scenarios may also comprise a discrete movement precision that may be expressed as an angle, a unit of distance, a percentage, or the like. A discrete movement precision represents the patient's ability to accurately reach an end-point of a movement within a desired area-threshold. For example, an activity may require a patient to hit a discrete target with their wrist with a certain degree of precision.

Rehabilitation Scenario/Activity No. 2

The second scenario is directed to bilateral shoulder and elbow movement with a focus on executive functioning and bi-lateral coordination. The second scenario features a 3D outdoor garden scene, and comprises a tree, fruits that fall from the tree, two basket elements or halves of a basket that form a basket when brought together, and a pail. During the rehabilitation exercise, the depth and horizontal coordinates of each basket element are controlled by the 3D position of a respective elbow, shoulder, and hand of the patient, and the patient forms the complete basket by bringing his hands together. The patient has to bring his hands together to catch the falling fruit in the basket, then move his hands while keeping them in contact to bring the basket containing the fruit over the pail, and finally move his hands apart so that the fruit falls from the basket into the pail.

In one embodiment, the customizable properties/parameters for the second scenario may comprise:

the number of falling fruit;

the size of the basket and pail to set the required precision of patient's movements;

the horizontal and depth positions of falling fruit;

the horizontal and depth position of the pail; and/or

the speed and frequency for the falling fruits.

In one embodiment, the second rehabilitation scenario further comprises the following general rehabilitation focuses: executive functioning and bi-lateral coordination.

Table 1 presents exemplary values for the customizable properties/parameters for the second exemplary scenario. Once the values are assigned to the customizable properties/parameters, the scenario is referred to as an activity.

TABLE 1 Exemplary values for the customizable properties/parameters for the second exemplary scenario.
Input parameter | Selection
Number of falling fruit | 5
Size of the basket and pail (diameter) | 5 cm
Possible depth and horizontal positions of the falling fruit | Far-left, far-right, near-right, near-left
Depth and horizontal position of the pail | Near-center
Speed of falling fruit | 10 cm/second

Rehabilitation Scenario/Activity No. 3

This scenario is directed to trunk movement while the patient is in a sitting position with an emphasis on posture and balance. During the corresponding rehabilitation exercise, the patient controls the position of a ball using the angle of his trunk in order to hit targets at various horizontal and depth positions with the ball. The third scenario comprises the following virtual elements: a background scene; a ball of which the position is controlled by the angle of tilt of the patient's trunk; and target objects.

In one embodiment, the customizable properties/parameters for the third scenario may comprise:

the size of the target objects which defines the precision requirement for the trunk movements;

the horizontal position of target objects which defines the lateral trunk tilt requirement;

the depth position of target objects which defines the forward trunk tilt requirement; and

the number of target objects.

In one embodiment, the third rehabilitation scenario further comprises the following general rehabilitation focuses: posture, trunk movement, and balance.

Rehabilitation Scenario/Activity No. 4

The fourth scenario is directed to discrete precision of the upper extremities (i.e. arms) to perform task-oriented movements. A task-oriented movement is defined as a movement that may require one or more elementary movements in order to achieve a basic functional movement purpose. Examples of task-oriented movements are reaching, crossing midline (moving from the right side to the left side or vice versa), raising, and diagonal movement. Some task-oriented movements may not have a descriptive label, and may be defined by the beginning position and end position of a certain body part, for example, the beginning and end position of the wrist. A task-oriented movement may also be defined in terms of the position of the user's hand relative to the user's body, for example "ipsilateral to contraproximal" (same side to opposite side and near).

The general interactive mechanic corresponding to the fourth scenario is the following: the user must control an object in the interactive environment by manipulating the position of his wrist in order to hit target objects. The position of the user-controlled object within the interactive environment is determined as a function of the 3D position of the patient's wrist relative to his trunk. During the rehabilitation exercise, the patient manipulates the position of his wrist relative to his trunk in order to collide the user-controlled object with at least one target object within the interactive environment. The fourth scenario comprises the following virtual elements: a background scene, a user-controlled object, and at least one target object. The virtual elements may optionally comprise obstacles that move at a given speed and that should be avoided by the user-controlled object.

In one embodiment, the customizable properties/parameters for the fourth scenario may comprise:

the vertical and horizontal position of target objects;

the depth position of target objects;

the number of target objects; and

the size, frequency, and speed of the target objects.

In one embodiment, the fourth rehabilitation scenario further comprises the following general rehabilitation focus: sequential movement.

Table 2 presents exemplary values for the customizable properties/parameters of the fourth scenario.

TABLE 2 Exemplary values for the parameters of the fourth exemplary scenario.
Input parameter | Selection
Number of target objects | 3
Vertical and horizontal position of target objects | object 1: top-right; object 2: bottom-left; object 3: bottom-right
Depth position of target objects | object 1: near; object 2: far; object 3: near
Size of target objects (diameter) | object 1: 2 cm; object 2: 2 cm; object 3: 4 cm

It should be understood that, once the customizable properties/parameters have been specified by a medical professional, the rehabilitation scenarios are referred to as rehabilitation activities.

FIG. 1 illustrates one embodiment of a computer-implemented method 10 for analysing a given rehabilitation activity by determining the movements to be executed by a patient that correspond to the given rehabilitation activity.

For example, a medical professional may create a rehabilitation activity for a patient who is unable, or is not advised, to perform an elementary movement such as a right shoulder flexion, or a task-oriented movement such as "right arm arm-raising". In this case, the method 10 is used by the medical professional to determine the elementary movements and/or the task-oriented movements required by the rehabilitation activity to be performed by the patient. If the results of the method 10 show that the rehabilitation activity created by the medical professional requires a right shoulder flexion to be performed by the patient, then the medical professional realizes that the rehabilitation exercise he created is not adapted to the patient. The use of the method 10 thereby allows the medical professional to avoid any injury to the patient or aggravation of the patient's condition. Referring back to FIG. 1, a rehabilitation activity that has been created or selected by a medical professional is received at step 12. The rehabilitation activity may have been created as described above.

In one embodiment, an identification of the rehabilitation scenario corresponding to the rehabilitation activity is received along with the customized properties/parameters set by the medical professional. The rehabilitation scenario and the customized properties/parameters then form a rehabilitation activity.

At step 14, movement rules that correspond to the received rehabilitation activity are retrieved. In one embodiment, corresponding movement rules are stored in a database for each rehabilitation scenario. The database is stored on a local or external storing unit.

In this case, the rehabilitation scenario corresponding to the received rehabilitation activity is first determined, and then the movement rules corresponding to the determined scenario are retrieved from the database.

Each rehabilitation scenario has a corresponding interactive mechanic which includes a body-part to virtual-object input mapping, the involved body parts, and customizable properties/parameters. Each interactive mechanic is expressed through its associated movement rules.

A movement rule is defined as the correlation between a user-controlled object property in the interactive environment, and a patient elementary movement or task-oriented movement. An elementary movement is defined as a body movement type for a given body part of the patient according to a single degree of freedom. In a sequence of elementary movements, each elementary movement may be characterized by a respective range of movement. Examples of body movement types comprise flexion, extension, adduction, abduction, rotation, pronation, supination, etc. Examples of body parts comprise the trunk, a joint such as the left knee, right shoulder, left elbow, left wrist, and/or the like, etc.

The following presents exemplary movement rules for each one of the four exemplary rehabilitation scenarios described above. It should be understood that the movement axes and directions described in the below tables are based on the right-hand Cartesian coordinate system illustrated in FIG. 2.

Exemplary movement rules for the first above-described rehabilitation scenario are presented in Table 3.

TABLE 3 Exemplary movement rules for the first exemplary rehabilitation scenario.
Property of User-controlled object | Direction | Body Part | Elementary movement(s) | Task-oriented movement
1. Horizontal (X) Movement Plane | Ascending | Right Shoulder | Right Shoulder Horizontal Abduction | N/A
2. Horizontal (X) Movement Plane | Descending | Right Shoulder | Right Shoulder Horizontal Adduction | N/A
3. Horizontal (X) Movement Plane | Ascending | Left Shoulder | Left Shoulder Horizontal Adduction | N/A
4. Horizontal (X) Movement Plane | Descending | Left Shoulder | Left Shoulder Horizontal Abduction | N/A
5. Vertical (Y) Movement Plane | Ascending | Right or Left Shoulder | Shoulder (Forward) Flexion | N/A
6. Vertical (Y) Movement Plane | Descending | Right or Left Shoulder | Shoulder (Forward) Extension | N/A
7. Depth (Z) Movement Plane | Ascending | Right or Left Elbow | Elbow Flexion | N/A
8. Depth (Z) Movement Plane | Descending | Right or Left Elbow | Elbow Extension | N/A

In Table 3, for example, if the position of the user-controlled object moves in the ascending direction of the horizontal (X) movement plane, and the body part controlling the user-controlled object is the right shoulder, then the elementary movement is “Right Shoulder Horizontal Abduction”.
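By way of illustration only, the movement rules of Table 3 may be stored as a lookup keyed by the object property, its direction of change, and the controlling body part. The following Python sketch uses an assumed storage format and naming, not a prescribed one.

# Movement rules of Table 3: (object property, direction, body part) -> elementary movement.
# (Rows 5-8 of Table 3 also apply to the left side; omitted here for brevity.)
MOVEMENT_RULES_SCENARIO_1 = {
    ("horizontal_x", "ascending", "right_shoulder"): "Right Shoulder Horizontal Abduction",
    ("horizontal_x", "descending", "right_shoulder"): "Right Shoulder Horizontal Adduction",
    ("horizontal_x", "ascending", "left_shoulder"): "Left Shoulder Horizontal Adduction",
    ("horizontal_x", "descending", "left_shoulder"): "Left Shoulder Horizontal Abduction",
    ("vertical_y", "ascending", "right_shoulder"): "Shoulder (Forward) Flexion",
    ("vertical_y", "descending", "right_shoulder"): "Shoulder (Forward) Extension",
    ("depth_z", "ascending", "right_elbow"): "Elbow Flexion",
    ("depth_z", "descending", "right_elbow"): "Elbow Extension",
}

def elementary_movement(prop, direction, body_part):
    """Return the elementary movement matching a movement rule, if any."""
    return MOVEMENT_RULES_SCENARIO_1.get((prop, direction, body_part))

# The example from the paragraph above: ascending X movement, right shoulder.
print(elementary_movement("horizontal_x", "ascending", "right_shoulder"))
# Right Shoulder Horizontal Abduction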

Exemplary movement rules for the second above-described rehabilitation scenario are presented in Table 4.

TABLE 4 Exemplary movement rules for the second exemplary rehabilitation scenario.
Property of User-controlled object | Direction | Body Part | Elementary movement(s) | Task-oriented movement
1. Horizontal (X) Movement Plane | Ascending | Right Arm | 1. Right Shoulder Horizontal Abduction; 2. Shoulder External Rotation | Undefined
2. Horizontal (X) Movement Plane | Descending | Right Arm | 1. Right Shoulder Horizontal Adduction; 2. Shoulder Internal Rotation | Undefined
3. Horizontal (X) Movement Plane | Ascending | Left Arm | 1. Left Shoulder Horizontal Adduction; 2. Left Shoulder Internal Rotation | Undefined
4. Horizontal (X) Movement Plane | Descending | Left Arm | 1. Left Shoulder Horizontal Abduction; 2. Left Shoulder External Rotation | Undefined
5. Depth (Z) Movement Plane | Descending | Elbow | Elbow Extension | Reaching
6. Depth (Z) Movement Plane | Ascending | Elbow | Elbow Flexion | Undefined
7. Depth (Z) + Horizontal (X) Movement Planes | No Change | Elbow and Shoulder | Stasis | Stability

In Table 4, for example, if the position of the user-controlled object moves in the ascending direction of the horizontal (X) movement plane, and the body part controlling the user-controlled object is the right arm, then the elementary movements are one or a combination of "Right Shoulder Horizontal Abduction" and "Shoulder External Rotation".

Exemplary movement rules for the third above-described rehabilitation scenario are presented in Table 5.

TABLE 5 Exemplary movement rules for the third exemplary rehabilitation scenario.
Property of User-controlled object | Direction | Body Part | Elementary movement(s) | Task-oriented movement
1. Horizontal (X+) Movement Plane | Ascending | Trunk | Trunk Lateral Flexion | N/A
2. Horizontal (X+) Movement Plane | Descending | Trunk | Trunk Lateral Extension | N/A
3. Horizontal (X−) Movement Plane | Ascending | Trunk | Trunk Lateral Extension | N/A
4. Horizontal (X−) Movement Plane | Descending | Trunk | Trunk Lateral Flexion | N/A
5. Depth (Z) Movement Plane | Ascending | Trunk | Trunk Forward Extension | N/A
6. Depth (Z) Movement Plane | Descending | Trunk | Trunk Forward Flexion | N/A

In the above Table 5, where the “Property of User-controlled object” indicates “X+” or “X−”, this refers to whether the interactivity is taking place on the positive or negative side of the zero value, respectively. In Table 5, for example, if the position of the user-controlled object moves in the ascending direction of the positive horizontal (X+) movement plane, and the body part controlling the user-controlled object is the trunk, then the elementary movement is “Trunk lateral flexion”.
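By way of illustration only, the following Python sketch shows how such side-dependent rules may be resolved: the applicable rule of Table 5 depends both on the direction of change and on the side of the zero value on which the interactivity takes place. The names and layout are illustrative assumptions.

# Trunk rules of Table 5, keyed by (side of the X axis, direction of change).
TRUNK_RULES = {
    ("X+", "ascending"): "Trunk Lateral Flexion",
    ("X+", "descending"): "Trunk Lateral Extension",
    ("X-", "ascending"): "Trunk Lateral Extension",
    ("X-", "descending"): "Trunk Lateral Flexion",
}

def trunk_movement(x_value, direction):
    """Pick the rule according to the side of the zero value where interactivity occurs."""
    side = "X+" if x_value >= 0 else "X-"
    return TRUNK_RULES[(side, direction)]

print(trunk_movement(0.5, "ascending"))   # Trunk Lateral Flexion
print(trunk_movement(-0.5, "ascending"))  # Trunk Lateral Extension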

Exemplary movement rules for the fourth above-described rehabilitation scenario are presented in Table 6.

TABLE 6 Exemplary movement rules for the fourth exemplary rehabilitation scenario.
Property of User-controlled object | Direction | Body Part | Elementary movement(s) | Task-oriented movement
1. Horizontal (X) Movement Plane | Ascending | Right Arm | 1. Right Shoulder Horizontal Abduction; 2. Right Shoulder Abduction; 3. Right Shoulder External Rotation | Arm crossing midline: contralateral to ipsilateral
2. Horizontal (X) Movement Plane | Descending | Right Arm | 1. Right Shoulder Horizontal Adduction; 2. Right Shoulder Adduction; 3. Right Shoulder Internal Rotation | Arm crossing midline: ipsilateral to contralateral
3. Horizontal (X) Movement Plane | Ascending | Left Arm | 1. Left Shoulder Horizontal Adduction; 2. Left Shoulder Adduction; 3. Left Shoulder Internal Rotation | Arm crossing midline: ipsilateral to contralateral
4. Horizontal (X) Movement Plane | Descending | Left Arm | 1. Left Shoulder Horizontal Abduction; 2. Left Shoulder Abduction; 3. Left Shoulder External Rotation | Arm crossing midline: contralateral to ipsilateral
5. Vertical (Y) Movement Plane | Ascending | Right or Left Arm | Shoulder forward flexion | Raising
6. Vertical (Y) Movement Plane | Descending | Right or Left Arm | Shoulder forward extension | Lowering
7. Depth (Z) Movement Plane | Ascending | Right or Left Arm | 1. Shoulder forward flexion; 2. Elbow extension | Reaching
8. Depth (Z) Movement Plane | Descending | Right or Left Arm | 1. Shoulder forward extension; 2. Elbow flexion | Undefined

In Table 6, for example, if the position of the user-controlled object moves in the ascending direction of the horizontal (X) movement plane, and the body part controlling the user-controlled object is the right arm, then the elementary movements are one or a combination of “Right Shoulder Horizontal Abduction”, “Right Shoulder Abduction”, and “Right Shoulder External Rotation”, and the task-oriented movement is “Arm crossing midline: contralateral to ipsilateral”.

It should be noted that, if more than one elementary movement is listed in the "Elementary movement(s)" column of a single row, the movement task may be composed of several combinations of elementary movements: it may use a single one of the listed elementary movements, a combination of several of them, or all of them.

Table 7a illustrates the movement rules for an interactive environment comprising a ball of which the size is to be controlled by the user's shoulder forward flexion movement during the virtual rehabilitation simulation.

TABLE 7a Movement rules for a rehabilitation scenario in which the size of a ball is to be controlled by a patient.
Property of User-controlled object | Direction | Body Part | Elementary movement(s)
1. Diameter of ball | Increasing | Shoulder | Shoulder Forward Flexion
2. Diameter of ball | Decreasing | Shoulder | Shoulder Forward Extension

In one embodiment, the movement rules contain only the property of a user-controlled object, a body part, and at least one of an elementary movement or task-oriented movement, as illustrated in Table 7b.

TABLE 7b Another embodiment of movement rules for a rehabilitation scenario in which the size of a ball is to be controlled by a patient.
Property of User-controlled object | Body Part | Elementary movement(s)
1. Diameter of ball | Shoulder | Shoulder Forward Flexion/Extension

In Table 7a, for example, if the diameter of the user-controlled object is increasing, and the body part controlling the diameter of the user-controlled object is the shoulder, then the elementary movement is “Shoulder forward flexion”.

Referring back to FIG. 1, step 16 consists in determining a sequence of movement events for the received rehabilitation activity. A movement event is defined as a state in the interactive environment, which prompts the beginning or end of a movement required to be performed by the patient in order for the activity to be successfully completed. The required movement may comprise at least one elementary movement, and/or at least one task-oriented movement. A movement event may be triggered by the state of a user-controlled object property, or by any state or change in the interactive environment.

A movement segment is defined as the movement taking place in between two successive movement events. The movement events are used to define the movement segments that the patient should perform in order to complete a task in the virtual environment. A movement event marks the point when there is a change in a movement segment, such as the beginning or the end of a movement segment.

The following illustrates how movement events are determined for the received rehabilitation activity.

In one embodiment, a movement event is defined by the state of a user-controlled object property, the achievement of which is required in order for the activity to be successfully completed. A movement event may be defined by any property of the user-controlled object such as a position, a size, and a color, or by any state or change in the interactive environment.

If the position of the user-controlled object is controlled by the user's physical movements, a movement event may correspond to a respective position of the user-controlled object.

If the color of the user-controlled object is controlled by the user's physical movements, a movement event may correspond to a respective color of the user-controlled object.

If the size of the user-controlled object is controlled by the user's physical movements, a movement event may correspond to a respective size of the user-controlled object.

In one embodiment, input scenario parameters translate directly into movement events. For example, movement events may each correspond to a respective position within the interactive environment. Referring back to the exemplary scenario no. 4, the vertical, horizontal, and depth positions of the target objects may translate into movement events. Table 8 presents the possible vertical and horizontal positions for the target objects while Table 9 presents the possible depth positions for the target objects.

TABLE 8 Vertical and horizontal positions for target objects.
Selection - Vertical and horizontal position of target objects | (X-Y) Position requirement of user-controlled object
Top Left | −1, 1
Top Center | 0, 1
Top Right | 1, 1
Middle Left | −1, 0
Middle Center | 0, 0
Middle Right | 1, 0
Bottom Left | −1, −1
Bottom Center | 0, −1
Bottom Right | 1, −1

TABLE 9 Depth positions for target objects.
Selection - Depth position of target objects | (Z) Position requirement of user-controlled object
Near | 1
Middle | 0
Far | −1

If the medical professional selects three target objects and locates them at the vertical and horizontal position and depth position indicated in Table 10, then three movement events are determined and each movement event corresponds to the position of a respective target object as illustrated in Table 10.

TABLE 10 Movement events as a function of target object positions.
Target Object # | Vertical and horizontal position | Depth position | Movement Event (X-Y-Z position)
1 | Top-right | Near | 1, 1, 1
2 | Bottom-left | Far | −1, −1, −1
3 | Bottom-right | Near | 1, −1, 1
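By way of illustration only, the translation of Tables 8 to 10 may be sketched as follows in Python; the dictionaries and function names are illustrative assumptions.

# Position selections (Tables 8 and 9) mapped to coordinate requirements.
XY = {"top-left": (-1, 1), "top-center": (0, 1), "top-right": (1, 1),
      "middle-left": (-1, 0), "middle-center": (0, 0), "middle-right": (1, 0),
      "bottom-left": (-1, -1), "bottom-center": (0, -1), "bottom-right": (1, -1)}
Z = {"near": 1, "middle": 0, "far": -1}

def movement_events(target_objects):
    """Translate (xy label, depth label) selections into X-Y-Z movement events."""
    return [XY[xy] + (Z[z],) for xy, z in target_objects]

# The three target objects of Table 10.
print(movement_events([("top-right", "near"), ("bottom-left", "far"), ("bottom-right", "near")]))
# [(1, 1, 1), (-1, -1, -1), (1, -1, 1)]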

In another embodiment, movement events are determined via pre-sets. For example, and referring back to the first exemplary scenario, the shape of the sequence of food objects may be customized and each shape selection contains a series of pre-configured movement events. Table 11 presents different examples of movement events determined from the shape of food objects.

TABLE 11 Movement events according to the pattern of food objects.
Selection - Pattern of Food Objects | Pre-configured Movement Event List (X-Y-Z positions)
1. Horizontal Line | Event 1: −1, 0, 1; Event 2: 1, 0, 1; Event 3: −1, 0, 1
2. Vertically-oriented Square | Event 1: −1, 1, 1; Event 2: 1, 1, 1; Event 3: 1, −1, 1; Event 4: −1, −1, 1; Event 5: −1, 1, 1
3. Horizontally-oriented Square | Event 1: −1, 0, 1; Event 2: 1, 0, 1; Event 3: 1, 0, −1; Event 4: −1, 0, −1; Event 5: −1, 0, 1
4. Vertical Line | Event 1: 0, 1, 1; Event 2: 0, −1, 1; Event 3: 0, 1, 1
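By way of illustration only, each pre-set shape selection of Table 11 may carry its pre-configured list of movement events; the dictionary below is an assumed storage format.

# Pre-configured movement event lists of Table 11 (X-Y-Z positions).
PRESET_EVENTS = {
    "horizontal line": [(-1, 0, 1), (1, 0, 1), (-1, 0, 1)],
    "vertically-oriented square": [(-1, 1, 1), (1, 1, 1), (1, -1, 1), (-1, -1, 1), (-1, 1, 1)],
    "horizontally-oriented square": [(-1, 0, 1), (1, 0, 1), (1, 0, -1), (-1, 0, -1), (-1, 0, 1)],
    "vertical line": [(0, 1, 1), (0, -1, 1), (0, 1, 1)],
}

def preset_movement_events(shape):
    """Retrieve the pre-configured movement events for a given shape selection."""
    return PRESET_EVENTS[shape]

print(preset_movement_events("vertically-oriented square"))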

A movement event may be triggered by any change in the interactive environment, for example, the appearance of an object or the change in color of an object. A movement event may also be expressed as a relative change in position from the previous movement event.

Referring back to the second exemplary scenario, the second scenario illustrates how, for the purpose of anticipating the required movements, rules may be defined that specify how multiple types of changes in the interactive environment trigger movement events and segments, as illustrated in Table 12.

TABLE 12 Movement segments as a function of changes in the interactive environment.
Change in Interactive Environment | Movement segment triggered
1. Fruit begins falling in horizontal and depth coordinates | Move left and right hands to specified coordinates
2. Left and right hand objects reach specified position | Hold hands static (until object fully drops in basket)
3. Object is successfully caught | Move left and right hands to specified coordinates
4. Left and right hand objects reach specified coordinates | Move hands away from each other to drop fruit: increase right hand user-controlled object on x axis; decrease left hand user-controlled object on x axis

Based on the above, Tables 13 and 14 illustrate how the properties/parameters for the second scenario are translated into movement events:

Table 13 presents exemplary parameters entered by a medical professional for the second exemplary scenario.

TABLE 13 Exemplary parameters for the second rehabilitation scenario.
Input parameter | Selection
Number of falling fruit | 2
Depth and horizontal positions of the falling fruit | object 1: far-left (−1, 0, −1); object 2: far-right (1, 0, −1)
Depth and horizontal position of the pail | Near-center (0, 0, 0)

Table 14 presents the movement segments and the movement events that are determined for the second exemplary activity.

TABLE 14 Movement segments and movement events for the second exemplary activity.
Change in environment | Movement Segment Description | Movement Event
Object 1 begins falling | Move left and right hands to specified coordinates | Event 1 (object 1 coords): −1, 0, −1
Hands reach specified position | Hold hands static (until object fully drops in basket) | Event 2: −1, 0, −1
Object 1 is caught | Move left and right hands to specified coordinates | Event 3 (pail coords): 0, 0, 0
Hands reach pail | Increase right hand user-controlled object on x axis; decrease left hand user-controlled object on x axis | Event 4 (relative change of right user-controlled object): X + 1; Event 4 (relative change of left user-controlled object): X − 1
Object 2 begins falling | Move left and right hands to specified coordinates | Event 5 (object 2 coords): 1, 0, −1
Hands reach specified position | Hold hands static (until object fully drops in basket) | Event 6: 1, 0, −1
Object 2 is caught | Move left and right hands to specified coordinates | Event 7 (pail coords): 0, 0, 0
Hands reach pail | Move hands away from each other to drop fruit: increase right hand user-controlled object on x axis; decrease left hand user-controlled object on x axis | Event 8 (relative change of right-hand object): X + 1; Event 8 (relative change of left-hand object): X − 1
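By way of illustration only, the expansion of the Table 13 parameters into the ordered events of Table 14 may be sketched as follows; the event labels and function name are illustrative assumptions.

def scenario2_events(fruit_coords, pail_coords):
    """Generate, per falling fruit, the four ordered events of Table 14."""
    events = []
    for fruit in fruit_coords:
        events.append(("move hands to fruit coordinates", fruit))
        events.append(("hold hands static until fruit drops in basket", fruit))
        events.append(("move hands to pail coordinates", pail_coords))
        events.append(("drop fruit", ("right object: X + 1", "left object: X - 1")))
    return events

# The two falling fruits and the pail of Table 13.
for event in scenario2_events([(-1, 0, -1), (1, 0, -1)], (0, 0, 0)):
    print(event)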

At step 18, the sequence of elementary movements and/or task-oriented movements that correspond to the received rehabilitation activity are determined using the movement rules and the movement events.

In one embodiment, a functional task may be further determined from the determined elementary movements. A functional movement or functional task is a sequence of elementary movements required to perform a specific task. For example, functional tasks may comprise brushing teeth, hanging a coat, cooking, self-grooming, playing a sport, and/or the like.

Using the determined movement events, the movement segments are determined. A movement segment is the movement to occur between two successive movement events. The elementary movements and/or task-oriented movements per segment are determined by examining each movement segment, i.e. each two successive movement events in the rehabilitation activity, and assigning a respective movement rule to each movement segment.

As described above, this process takes into consideration the following elements: the property of user-controlled object, the direction of change of property, and the involved body part which is extracted from the rehabilitation scenario.

The movement rule that satisfies the above three parameters is identified, and the corresponding elementary movement or task-oriented movement is then retrieved.
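By way of illustration only, the following Python sketch implements this step for the first exemplary scenario: successive movement events are paired into segments, the changing property and its direction of change are derived, and the matching rule yields the elementary movement. The data layout is an illustrative assumption.

# Right-shoulder/elbow rules of Table 3, keyed by (changing axis, direction).
RULES = {
    ("horizontal_x", "ascending"): "Right Shoulder Horizontal Abduction",
    ("horizontal_x", "descending"): "Right Shoulder Horizontal Adduction",
    ("vertical_y", "ascending"): "Shoulder Forward Flexion",
    ("vertical_y", "descending"): "Shoulder Forward Extension",
    ("depth_z", "ascending"): "Elbow Flexion",
    ("depth_z", "descending"): "Elbow Extension",
}
AXES = ("horizontal_x", "vertical_y", "depth_z")

def movement_sequence(events):
    """Derive the elementary movement(s) for each pair of successive movement events."""
    sequence = []
    for start, end in zip(events, events[1:]):  # one segment per pair of events
        for axis, s, e in zip(AXES, start, end):
            if e != s:  # this property changes over the segment
                direction = "ascending" if e > s else "descending"
                sequence.append(RULES[(axis, direction)])
    return sequence

# The square path of Table 15 below: four segments, four elementary movements.
square = [(-1, 1, 1), (1, 1, 1), (1, -1, 1), (-1, -1, 1), (-1, 1, 1)]
print(movement_sequence(square))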

The following example illustrates how a set of movement events for the first exemplary scenario is translated into elementary movements in this manner. (For reference, refer to the movement rules for scenario no. 1 described above.)

TABLE 15 Determined elementary movements for the first exemplary scenario.
Segment | Movement Event 1 | Movement Event 2 | Property of User-controlled Object | Direction of Change | Involved Body part | Elementary Movement
Segment 1 | −1, 1, 1 | 1, 1, 1 | Horizontal (X) | Ascending | Right Shoulder | Right Shoulder Horizontal Abduction
Segment 2 | 1, 1, 1 | 1, −1, 1 | Vertical (Y) | Descending | Right Shoulder | Shoulder Forward Extension
Segment 3 | 1, −1, 1 | −1, −1, 1 | Horizontal (X) | Descending | Right Shoulder | Right Shoulder Horizontal Adduction
Segment 4 | −1, −1, 1 | −1, 1, 1 | Vertical (Y) | Ascending | Right Shoulder | Shoulder Forward Flexion

As illustrated in Table 15, elementary movements or movement tasks can be determined for any created activity using its corresponding table of movement rules.

At step 20, the determined elementary movements and/or task-oriented movements are outputted. They may be stored in a memory and/or sent to the machine of the medical professional. The medical professional may then consult the elementary movements to be executed by the patient according to the rehabilitation exercise that he created for the patient. For example, the medical professional may realize that the rehabilitation activity that he created requires the execution of a given elementary movement that is not adequate for the patient. In this case, the medical professional may modify the rehabilitation activity or create a new rehabilitation activity.

In an embodiment in which a rehabilitation activity/scenario comprises a general rehabilitation focus, the general rehabilitation focus may be retrieved from the rehabilitation scenario such as from the metadata thereof, and outputted to the medical professional along with the elementary movements and/or task-oriented movements.

In one embodiment, the method 10 further comprises a step of determining movement ranges for elementary movements, if elementary movements are outputted from the previous step. This is achieved by means of a movement range table, which, for each movement rule, takes any two values of the user-controlled object property relevant to the movement rule, and matches those values with their corresponding angle of rotation for the body part involved.

Table 16 is an exemplary movement range table for the first exemplary scenario.

TABLE 16 Movement range table for the first exemplary scenario.
Property of User-controlled object | Body part | Movement type | Value 1 (a1) | Value 2 (a2) | Corresponding Angle for value 1 (b1) | Corresponding Angle for value 2 (b2)
Horizontal (X) Movement Plane | Right Shoulder | Horizontal Abduction/Adduction | −1 | 0 | 130° | 90°
Horizontal (X) Movement Plane | Left Shoulder | Horizontal Abduction/Adduction | 1 | 0 | 130° | 90°
Vertical (Y) Movement Plane | Right or Left Shoulder | Forward Flexion/Extension | 0 | 1 | 90° | 120°
Depth (Z) Movement Plane | Right or Left Elbow | Flexion/Extension | 0 | −1 | 90° | 160°

Using the two ranges, any value of the user-controlled object property can be mapped using a range mapping equation. One exemplary range mapping equation is as follows:

t = b1 + (s − a1)(b2 − b1)/(a2 − a1)

where b1 and b2 are angle values, s is a value comprised in the range [a1, a2], and t is the corresponding value in the range [b1, b2]. The value s in the range [a1, a2] is thus linearly mapped to the value t in the range [b1, b2].

Referring to Table 16, and given that the values −1 and 0 of the user-controlled object's Horizontal (X) property correspond to 130° and 90°, respectively, for the right shoulder, the range mapping function determines that the value −0.5 of the user-controlled object's Horizontal (X) property corresponds to 110° for the right shoulder's horizontal abduction/adduction.
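By way of illustration only, the range mapping equation may be implemented as follows; the variable names follow the equation above.

def map_range(s, a1, a2, b1, b2):
    """Linearly map s from the range [a1, a2] to the corresponding value t in [b1, b2]."""
    return b1 + (s - a1) * (b2 - b1) / (a2 - a1)

# The example above: X values -1..0 map to 130..90 degrees, so -0.5 maps to 110 degrees.
print(map_range(-0.5, -1, 0, 130, 90))  # 110.0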

In one embodiment, the method 10 further comprises a step of determining the clinical objectives that correspond to the created rehabilitation activity.

Clinical objectives comprise a movement description list. They may also comprise a movement range for each movement of the list, a movement characteristic, and a general focus.

In one embodiment, the movement description list comprises a list of body parts of which a movement is required in order to complete a specific activity.

In another embodiment, the movement description list comprises a list of elementary movements which are required to complete the specific activity. As described above, an elementary movement is defined as a body part and a movement type according to a single degree of freedom and direction. An example of an elementary movement is shoulder [body part] forward flexion [movement type].

In a further embodiment, the movement description list comprises a list of task-oriented movements (or movement tasks). A movement task may be composed of several combinations of elementary movements. For example, the movement task of crossing the midline with the right arm may be achieved by performing only external shoulder rotation, by performing only shoulder horizontal abduction, or by performing a combination of both external shoulder rotation and shoulder horizontal abduction.

In an embodiment in which the movement description list is composed of elementary movements, a movement range may optionally be included for each elementary movement, for example, shoulder flexion 15°-100°.

In an embodiment in which the clinical objectives comprise movement characteristics for the body parts or movements, the movement characteristics may be expressed as: a speed, a continuous movement precision such as an average deviation from an ideal path, a reaction time such as the time it takes to react to a certain visual stimulus, and/or the like.

Clinical objectives can be formatted in various ways and in various levels of complexity. Table 17 illustrates various combinations for clinical objectives.

TABLE 17 Exemplary clinical objectives. Combinations 1 through 8 each comprise a different combination of the following elements: body part, movement type, movement tasks, movement ranges, and movement characteristics, ranging from a single element (Combination 1) to four elements (Combination 7).

For further clarity, Table 18 illustrates examples of body parts, movement types, movement characteristics, and general foci that may be used to create clinical objectives. It should be understood that the list of body parts, movement types, movement characteristics, and general foci contained in Table 18 is exemplary and not exhaustive.

TABLE 18 Exemplary body parts, movement descriptions, movement characteristics, and general foci to generate clinical objectives.
Movement Description - Body Part (Upper) | Head, Neck, Shoulders, Elbows, Wrists, Hands, Fingers, Arms
Movement Description - Body Part (Mid) | Thorax, Trunk, Waist, Hip
Movement Description - Body Part (Lower) | Legs, Knees, Ankles
Movement Description - Movement Task | Reaching, Crossing midline, Diagonal movement, Ipsi to contralateral, Contra to ipsidistal, Raising, Lowering
Movement Characteristics (Upper) | Head, Neck, Shoulders, Elbows, Wrists, Hands, Fingers, Arms
Movement Characteristics (Mid) | Thorax, Trunk, Waist, Hip
Movement Characteristics (Lower) | Legs, Knees, Ankles
General Foci | Balance, Bilateral coordination, Posture, Executive functioning, Sequential movement, Stability

In one embodiment, the step of generating the clinical objectives is accomplished by iteratively examining each level configuration of the received rehabilitation activity, and identifying inputs that relate to movement characteristics such as speed of movement or precision of movement.

After identifying the parameters related to the movement characteristics from the received rehabilitation activity, each parameter is compared to a respective challenge threshold. If a given one of the parameters exceeds its respective challenge threshold, then the movement characteristic related to that parameter is listed as a clinical objective. A challenge threshold is a value assigned to a performance characteristic, and is used to determine whether a specific performance characteristic value is considered as a clinical objective.

Challenge thresholds may be expressed as raw values. For example, a challenge threshold for speed may be set at 10 cm/second. In this case, a speed requirement of 12 cm/second would exceed the challenge threshold, and therefore would be considered as a clinical objective.

In another embodiment, a challenge threshold may be expressed as a comparison to a patient's previous performance of a specific performance characteristic. A previous performance can be summarized as one data point in many ways, such as average, weighted average, or a specific previous performance. In this case, a challenge threshold may be expressed as any value that can quantify a comparison. For example, a challenge threshold can be expressed as a “percentage change” from a value that reflects the patient's previous speed performance.

In one embodiment, challenge thresholds can be set globally for all body parts, elementary movements, and movement tasks. In another embodiment, challenge thresholds can also be set for individual or groups of body parts, elementary movements, or movement tasks.

Challenge threshold values are stored in a database and can optionally be customized by a clinician. Challenge threshold values can be set for individual patients, or globally for all patients.

The following describes three exemplary determinations of clinical objectives related to movement characteristics.

For example, if the received rehabilitation activity comprises a speed requirement of 6 cm/second, this requirement may be compared to the patient's previous performances for the related movement type. If, in previous performances, the patient obtained an average speed of 5 cm/second and the challenge threshold is set to 5%, then it is determined that the speed requirement exceeds the challenge threshold, and the speed is listed as a clinical objective for the patient.

In another example, if the received rehabilitation activity comprises a precision requirement of 2 cm concerning the maximal deviation from a given path, this requirement may be compared to the patient's previous performances. If, in previous performances, the patient's average path deviation was 2.2 cm and the challenge threshold is set to 5%, it is then determined that the precision requirement exceeds the challenge threshold (it being assumed that the lower the precision requirement, the more challenging). The precision requirement is then listed as a clinical objective.

In a further example, if the received rehabilitation activity comprises a speed requirement of 5 cm/second, this requirement may be compared to the patient's previous performances. If, in previous performances, the patient has an average speed of 8 cm/second and the challenge threshold is 5%, it is determined that the speed requirement is below the challenge threshold, and the speed requirement is not listed as a clinical objective.
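
A minimal sketch of this determination logic is given below, assuming percentage-change challenge thresholds and a flag for characteristics, such as precision, where a lower requirement is more challenging; the function name and signature are illustrative, not part of the described system.

```python
# Illustrative sketch only: decide whether a movement-characteristic
# requirement qualifies as a clinical objective by comparing it against a
# percentage-change challenge threshold derived from previous performance.

def is_clinical_objective(requirement: float,
                          previous_average: float,
                          challenge_threshold_pct: float,
                          lower_is_harder: bool = False) -> bool:
    """Return True when the requirement exceeds the challenge threshold."""
    if lower_is_harder:
        # e.g. precision: a smaller allowed deviation is more challenging.
        return requirement < previous_average * (1 - challenge_threshold_pct / 100)
    # e.g. speed: a larger required speed is more challenging.
    return requirement > previous_average * (1 + challenge_threshold_pct / 100)

# First example: 6 cm/s required vs. a 5 cm/s average, 5% threshold -> listed.
assert is_clinical_objective(6.0, 5.0, 5.0)
# Second example: 2 cm precision vs. a 2.2 cm average deviation -> listed.
assert is_clinical_objective(2.0, 2.2, 5.0, lower_is_harder=True)
# Third example: 5 cm/s required vs. an 8 cm/s average -> not listed.
assert not is_clinical_objective(5.0, 8.0, 5.0)
```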

In one embodiment, the body parts, movement types, movement tasks, movement ranges, movement characteristics, and clinical objectives can be outputted. They can be provided as a list describing the rehabilitation activity as a whole. For example, the following clinical objectives may be outputted: shoulder horizontal abduction, shoulder horizontal adduction, shoulder forward flexion, and shoulder forward extension. This list is applied to the activity as a whole, and does not indicate which clinical objectives apply to which movement segments.

In another embodiment, the outputs described above can be provided per movement segment. In this manner, the output can be broken down to describe each movement segment in the rehabilitation activity. Table 19 illustrates an example of such an output format.

TABLE 19 Exemplary elementary movements per movement segment.
Segment 1: Shoulder horizontal abduction, Shoulder forward extension
Segment 2: Shoulder horizontal adduction
Segment 3: Shoulder horizontal abduction, Shoulder forward flexion

In a further embodiment, the outputs described above can be provided per group of two or more movement segments. Table 20 illustrates an example of such an output format.

TABLE 20 Exemplary elementary movements per group of movement segments.
Segments 1, 2: Shoulder horizontal abduction, Shoulder forward extension, Shoulder horizontal adduction
Segments 3, 4: Shoulder horizontal abduction, Shoulder forward flexion

In one embodiment, the method 10 further comprises a step of generating and outputting alerts regarding a potential risk/danger to the patient in performing the received rehabilitation activity. The parameter values for the received rehabilitation activity are compared to a respective danger threshold. If a parameter value exceeds its respective danger threshold, then an alert is generated.

A danger or upper-limit threshold is a value related to a performance characteristic, which is used to determine whether a specific performance characteristic value may be excessive or may present a potential danger/risk to the patient.

Upper-limit thresholds may be expressed as raw values. For example, an upper-limit threshold for speed may be set at 35 cm/second. In this case, a speed requirement of 40 cm/second would exceed the upper-limit threshold, and therefore would be considered excessive or a potential danger for the patient.

In another embodiment, an upper-limit threshold may be expressed as a comparison to a patient's previous performance of a specific performance characteristic. In this case, an upper-limit threshold may be expressed as any value that can quantify a comparison. For example, an upper-limit threshold can be expressed as a “percentage change” from a value that reflects the patient's previous speed performance. A previous performance can be summarized as one data point in many ways, such as average, weighted average, or a specific previous performance.

Upper-limit threshold values can be set globally for all body parts, elementary movements, and movement tasks. Alternatively, upper-limit threshold values can be set for individual or groups of body parts, elementary movements, and movement tasks.

Upper-limit threshold values can optionally be customized by clinicians. Upper-limit threshold values can be set for individual patients, or globally for all patients.

The following describes exemplary determinations of alerts.

For example, if a received rehabilitation activity comprises a speed requirement of 12 cm/second, this requirement may be compared to the patient's previous performances for the related movement type. If, in previous performances, the patient has an average speed of 7 cm/second and the danger threshold for this parameter is 30%, then it is determined that the speed requirement exceeds the danger threshold. In this case, an alert is generated and outputted. The alert may be indicative of the parameter and its value.

In another example, if a received rehabilitation activity comprises shoulder forward flexion at 0°-140°, this requirement may be compared to the patient's previous performances. If, in previous performances, the patient has only been able to achieve a maximum of 110° shoulder forward flexion and the danger threshold for this parameter is 20%, then it is determined that the shoulder forward flexion requirement exceeds the danger threshold. In this case, an alert is generated and outputted.
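
The alert determination mirrors the challenge-threshold comparison. The sketch below assumes percentage-change danger thresholds; the function name and message format are assumptions for illustration.

```python
# Illustrative sketch only: raise an alert when a requirement exceeds the
# upper-limit (danger) threshold expressed as a percentage change from the
# patient's previous performance.

def danger_alert(parameter: str, requirement: float,
                 previous_value: float, danger_threshold_pct: float):
    limit = previous_value * (1 + danger_threshold_pct / 100)
    if requirement > limit:
        return f"ALERT: {parameter} requirement of {requirement} exceeds limit of {limit:.1f}"
    return None

# Speed example above: 12 cm/s required, 7 cm/s average, 30% threshold.
print(danger_alert("speed (cm/s)", 12.0, 7.0, 30.0))
# Range example above: 140 deg required, 110 deg previous maximum, 20% threshold.
print(danger_alert("shoulder forward flexion (deg)", 140.0, 110.0, 20.0))
```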

In one embodiment, further information regarding the functional relevance of the newly created activity is outputted to the clinician. Functional tasks are determined from the elementary movements or task-oriented movements determined at step 18. A database comprises a list of functional tasks and the corresponding elementary movements or task-oriented movements for each functional task. The elementary movements or task-oriented movements that are determined at step 18 are compared to the database, and the corresponding functional task, if any, is retrieved.

Table 21 presents one exemplary method for breaking down the functional task “hanging a coat” into its corresponding elementary movements.

TABLE 21 List of elementary movements forming the functional task “hanging a coat”.
(Shoulder -> TYPE + DEGREE OF MOVEMENT) | (Elbow -> TYPE + DEGREE OF MOVEMENT) | (Wrist -> TYPE + DEGREE OF MOVEMENT) | (Fingers/Thumb)
Shoulder Abduction 0°-15° | Elbow Flexion 0°-90° | Wrist abduction 0°-20° | Grip (coat hanger)
Shoulder Flexion 0°-120° | Elbow Extension 90°-0° | Wrist adduction 20°-(−)20° | Static grip
Shoulder Stasis - Active | Forearm Rotation 90°-100° | Wrist abduction (−)20°-0° | Release grip
Shoulder Extension 120°-0° | | |

FIG. 3 illustrates one embodiment of a system 50 for determining elementary movements from a rehabilitation activity. The system 50 comprises a movement rules determining module 52, a movement events determining module 54, and an elementary movement determining module 56.

The movement rules determining module 52 is adapted to determine the movement rules that correspond to the received rehabilitation activity using the above-described method. The movement events determining module 54 is adapted to determine the movement events that correspond to the received rehabilitation activity using the above-described method with respect to step 16 of method 10. The determined movement rules and movement events are received by the elementary movement determining module 56 from the movement rules determining module 52 and the movement events determining module 54, respectively. The elementary movement determining module 56 is adapted to determine the elementary movements or task-oriented movements corresponding to the received rehabilitation activity from the received movement rules and movement events, as described above with respect to step 18 of method 10.

In one embodiment, the system 50 further comprises a clinical objective module (not shown) for generating clinical objectives for the received rehabilitation activity using the above-described method.

In the same or another embodiment, the system 50 further comprises an alert generating unit (not shown) for generating and outputting alerts using the above-described method.

While the method 10 illustrated in FIG. 1 is adapted to determine the list of elementary movements and/or task-oriented movements contained in a rehabilitation scenario, FIG. 4 illustrates one embodiment of a method 100 for analysing the movements executed by a user/patient while performing a virtual rehabilitation exercise, i.e. determining the list of elementary movements and/or task-oriented movements executed by the patient while performing the rehabilitation exercise. At step 102, data indicative of the movements executed by the patient while performing the rehabilitation exercise are received. At step 104, the movement rules corresponding to the rehabilitation exercise performed by the patient are retrieved from a memory. At step 106, the movement events contained in the received executed movements are determined. In one embodiment, the movement events comprise movement event triggers.

A movement event trigger is defined as a function that determines when a movement event is triggered, based on one or more prescribed conditions being satisfied. The prescribed condition can be any defined change in state in the interactive environment, for example, the appearance of an object, the collision of two objects, the position of an object, or the change in color of an object. A prescribed condition may also be defined by a relative change in state from the previous movement event.

One example of an embodiment where movement event triggers are used is a rehabilitation exercise in which the patient is requested to reach with his arm to hit a moving target. A movement event trigger is attached to the target, with the defined condition that when the patient's arm hits the target, a movement event shall occur.

When a movement event is triggered, the movement event's value can be assigned as a parameter defined by the movement event's corresponding trigger. The value defined by the movement event trigger can be any property of a user-controlled object, or be any state or change in the interactive environment.

For example, in the rehabilitation exercise during which the patient reaches with his arm to hit a moving target, the movement event trigger assigns the movement event's value as the target object's 3D position at the time the trigger condition is satisfied.
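
A movement event trigger can be sketched as a pair of functions over the state of the interactive environment: a condition predicate and a value extractor. The class, the state keys, and the polling interface below are assumptions for illustration, not the actual simulation API.

```python
# Illustrative sketch only: a movement event trigger fires when its
# prescribed condition holds, and assigns the event's value from the state.
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

State = dict                        # assumed snapshot of the environment
Vec3 = Tuple[float, float, float]   # 3D position

@dataclass
class MovementEventTrigger:
    condition: Callable[[State], bool]  # prescribed condition, e.g. a collision
    value: Callable[[State], Vec3]      # value assigned when the event fires

    def poll(self, state: State) -> Optional[Vec3]:
        """Return the movement event value once the condition is satisfied."""
        return self.value(state) if self.condition(state) else None

# Reaching exercise: the event fires when the arm hits the moving target, and
# the event value is the target's 3D position at that moment.
hit_target = MovementEventTrigger(
    condition=lambda s: s["arm_collides_with_target"],
    value=lambda s: s["target_position"],
)
print(hit_target.poll({"arm_collides_with_target": True,
                       "target_position": (1.0, -1.0, 1.0)}))  # (1.0, -1.0, 1.0)
```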

The movement events or movement event triggers can be sent to the interactive simulation generator to aid in patient performance analysis, as described below.

In one embodiment, the list of movement event values is predefined and pre-ordered prior to the patient's performance of the rehabilitation activity. In another embodiment, the ordered list of movement events is not defined prior to the patient's performance of the activity. In this case, the step of determining the sequence of movement events comprises defining, assigning values to, and ordering the movement events based on patient actions during the rehabilitation exercise.

One example of an embodiment where movement events are predefined is where the patient must guide his arm along a pre-set and prescribed path, and no movement choice is required from the user.

One example of an embodiment where movement events are not defined prior to patient performance is a scenario where the patient must reach with his arm to hit a number of targets in 3D space. In this case, the movement event triggers are sent to the interactive simulation, where each movement event trigger corresponds to one target and assigns the value of its corresponding movement event to be the target's position in space. In this example, the patient may choose how many targets to hit, in which order to hit them, and when to hit them. In this manner, the movement events can be ordered by the interactive simulation based on the order in which the patient actually hits the targets.
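
Ordering the movement events by patient actions can be sketched simply: the interactive simulation timestamps each satisfied trigger, and the ordered event list is the event values sorted by time. The data shapes below are assumptions.

```python
# Illustrative sketch only: order movement events by the time at which the
# patient actually hit each target.
hits = [  # (timestamp in seconds, position of the target the patient hit)
    (3.1, (1, 1, 1)),
    (1.4, (-1, 1, 1)),
    (5.8, (1, -1, 1)),
]
ordered_events = [position for _, position in sorted(hits)]
print(ordered_events)  # [(-1, 1, 1), (1, 1, 1), (1, -1, 1)]
```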

In one embodiment, movement events can also be triggered by changes in the interactive environment. The following presents an example of an unordered movement event list for such an embodiment: hand reaches object; hand kills bug; and hand successfully avoids obstacle. In this case, the actual properties of the user-controlled object at the time of the movement event are not known until the patient performs the activity. These properties are only recorded during or after the performance of the activity.

In the above embodiment, the list of movement events is sent to the interactive simulation generator that will generate the virtual rehabilitation simulation with which the patient will engage. In this way, while the patient engages in the rehabilitation exercise, the interactive environment generator uses the movement events as markers with which to segment patient performance data.

Table 22 presents an example of an ordered list of movement events. In this case, the first movement event must occur before the second movement event, the second movement event must occur before the third movement event, etc.

TABLE 22 Exemplary list of ordered movement events.
# | Movement Event List (X-Y-Z positions of user-controlled object)
1 | −1, 1, 1
2 | 1, 1, 1
3 | 1, −1, 1
4 | −1, −1, 1
5 | −1, 1, 1

Table 23 presents an example of an unordered list of movement events. In this case, the movement events may occur in any temporal order.

TABLE 23 Exemplary list of unordered movement events.
Movement Event List (X-Y-Z positions of user-controlled object)
−1, 1, 1
1, −1, 1
1, 1, 1
1, −1, 1
−1, 1, 1

In one embodiment, as the patient engages in the exercise (or after the patient engages in the exercise), the system collects performance data, which can be extracted from position data provided by a motion tracking unit, from any biometric sensory data, or from the interactive simulation generator. The system subdivides the patient's performance into discrete movement segments based on the successive movement events achieved by the patient's performance, and determines the patient's performance measurements.

Performance data may include, but is not limited to, speed of movement, fluidity of movement, angular range of motion, trunk forward compensation, trunk lateral compensation, pulse rate, electromyography (EMG) activity, precision of movement, accuracy of movement, reaction time, scapular compensation, successful movement task completion, and/or the like.

In one embodiment, the performance data comprise the speed of movement, which is determined from the time taken by the user to move his body part between two movement events and the distance between the real-world positions of the user's body part at each movement event.

In one embodiment, the performance data comprise the precision of movement. The precision of movement is determined from the position of the user-controlled object within the background scene during the performance, i.e. the recorded position of the user-controlled object within the 3D space during the simulation. For example, the precision of movement may be determined from the deviation of the path followed by the user-controlled object from the shortest distance straight line path between two movement events.
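
Both measures can be sketched as follows, assuming the motion tracking unit supplies the positions of the two movement events and a sampled path between them; the helper functions are illustrative, not the patented computation.

```python
# Illustrative sketch only: per-segment speed and path-deviation measures.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def segment_speed(start, end, elapsed_seconds):
    """Straight-line distance between the two events divided by time taken."""
    return dist(start, end) / elapsed_seconds

def path_deviation(path, start, end):
    """Average deviation of the sampled path from the straight line start-end."""
    ab = [e - s for s, e in zip(start, end)]
    ab_len = math.sqrt(sum(c * c for c in ab))
    def point_to_line(p):
        ap = [x - s for s, x in zip(start, p)]
        cross = (ab[1] * ap[2] - ab[2] * ap[1],
                 ab[2] * ap[0] - ab[0] * ap[2],
                 ab[0] * ap[1] - ab[1] * ap[0])
        return math.sqrt(sum(c * c for c in cross)) / ab_len
    return sum(point_to_line(p) for p in path) / len(path)

# 40 cm travelled in 8 seconds gives 5 cm/s.
print(segment_speed((0, 0, 0), (40, 0, 0), 8.0))                    # 5.0
# A sampled path bulging away from the straight line deviates ~2.33 cm.
print(path_deviation([(10, 2, 0), (20, 3, 0), (30, 2, 0)],
                     (0, 0, 0), (40, 0, 0)))                        # ~2.33
```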

In one embodiment, the performance data comprise the posterior trunk compensation performed during the movement, which is defined as the distance of forward-back movement performed by the trunk to aid in an upper-body movement.

Based on the above, performance measurements can be provided for each individual segment defined by successive movement events, as illustrated in Table 24.

TABLE 24 Performance data for movement segments.
Segment | Movement Event Values | Movement Precision | Speed | Posterior Trunk Compensation | Lateral Trunk Compensation | Reaction Time
Segment 1 | (−1, 1, 1) → (1, 1, 1) | +/−5 cm | 5 cm/sec | 4 cm | 4 cm | 2.2 secs
Segment 2 | (1, 1, 1) → (1, −1, 1) | +/−9 cm | 10 cm/sec | 25 cm | 4 cm | N/A
Segment 3 | (1, −1, 1) → (−1, −1, 1) | +/−6 cm | 9 cm/sec | 10 cm | 25 cm | N/A
Segment 4 | (−1, −1, 1) → (−1, 1, 1) | +/−7 cm | 12 cm/sec | 15 cm | 25 cm | N/A

In an embodiment in which the movement events are not ordered, the interactive simulation generator may order the movement events during or after the patient performs the activity, based on how the patient performs the rehabilitation activity. For example, in a scenario where the patient must clap his hands to kill bugs at various horizontal and vertical coordinates, the patient may choose the order in which to kill the bugs. The interactive environment would order the movement events based on the order in which the patient chooses to kill the bugs. In this embodiment, the movement segments are generated in real time or after the performance (based on the patient's choices), and the performance measurements are gathered for each movement segment accordingly.

At step 108, the sequence of elementary movements and/or task-oriented movements is determined using the above-described method with respect to method 10. From the movement events, movement segments are created, and a respective movement rule is assigned to each movement segment. Then the elementary movement and/or task-oriented movement that corresponds to the respective movement rule is assigned to the movement segment, thereby obtaining the sequence of elementary movements and/or task-oriented movements that were performed by the patient during the rehabilitation exercise.

In one embodiment, the performance measurements are matched with the movement rules for the related scenario to correlate the segment-based performance measurements with specific elementary movements or task-oriented movements. This correlation is illustrated in Table 25 for the first exemplary scenario.

TABLE 25 Elementary movement and patient performances per movement segment for the first exemplary scenario.
Movement Event Values | Involved Body Part | Elementary Movement | Movement Precision | Speed | Posterior Trunk Compensation | Lateral Trunk Compensation | Reaction Time
(−1, 1, 1) → (1, 1, 1) | Right Shoulder | Right Shoulder Horizontal Abduction | +/−5 cm | 5 cm/sec | 4 cm | 4 cm | 2.2 secs
(1, 1, 1) → (1, −1, 1) | Right Shoulder | Shoulder Forward Extension | +/−9 cm | 10 cm/sec | 25 cm | 4 cm | N/A
(1, −1, 1) → (−1, −1, 1) | Right Shoulder | Right Shoulder Horizontal Adduction | +/−6 cm | 9 cm/sec | 10 cm | 25 cm | N/A
(−1, −1, 1) → (−1, 1, 1) | Right Shoulder | Shoulder Forward Flexion | +/−7 cm | 12 cm/sec | 15 cm | 25 cm | N/A
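
The correlation of Table 25 can be sketched as a lookup from the direction of change of the user-controlled object's position to an elementary movement. The rule table below is reconstructed from Table 25 and is only an assumption about the first exemplary scenario's movement rules.

```python
# Illustrative sketch only: pair successive movement events into segments and
# map each segment to an elementary movement through assumed movement rules.

RULES = {  # (axis, sign of change) -> (involved body part, elementary movement)
    ("x", +1): ("Right Shoulder", "Shoulder Horizontal Abduction"),
    ("x", -1): ("Right Shoulder", "Shoulder Horizontal Adduction"),
    ("y", -1): ("Right Shoulder", "Shoulder Forward Extension"),
    ("y", +1): ("Right Shoulder", "Shoulder Forward Flexion"),
}

def movement_sequence(events):
    sequence = []
    for start, end in zip(events, events[1:]):      # successive segments
        for axis_index, axis in enumerate("xyz"):
            delta = end[axis_index] - start[axis_index]
            rule = RULES.get((axis, 1 if delta > 0 else -1)) if delta else None
            if rule:
                sequence.append(rule)
    return sequence

events = [(-1, 1, 1), (1, 1, 1), (1, -1, 1), (-1, -1, 1), (-1, 1, 1)]
for body_part, movement in movement_sequence(events):
    print(body_part, "->", movement)  # reproduces the four rows of Table 25
```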

The output of step 108 comprises a list of elementary movements, task-oriented movements, or a combination thereof, which the user has performed during the rehabilitation activity. In one embodiment, the output of step 108 further comprises at least one patient performance measurement for at least one elementary movement or task-oriented movement performed during the rehabilitation exercise.

In this manner, a medical professional can view a list of movements that have been executed by the patient during the rehabilitation exercise. The medical professional may then evaluate which elementary movements or task-oriented movements were performed satisfactorily by the patient during the exercise, and which movements are in need of further clinical attention.

FIG. 5 illustrates one embodiment of a computer-implemented method 150 for creating a rehabilitation activity. At step 152, clinical objectives created by a medical professional are received. From the received clinical objectives, a rehabilitation activity is automatically generated, and from the rehabilitation activity a virtual interactive simulation may be created. The patient may then interact with the interactive simulation while executing a rehabilitation exercise.

As described above, clinical objectives may comprise a list of body parts. In another embodiment, clinical objectives may comprise a movement description list, which may include a list of elementary movements, task-oriented movements, or a combination thereof, and corresponding body parts. In a further embodiment, clinical objectives may comprise at least one functional task.

Clinical objectives may also further comprise movement ranges for each elementary movement, movement characteristics for the body-parts and movements involved, as well as a general focus for the activity to be generated.

Tables 26-29 illustrate exemplary clinical objectives. In Table 26, the clinical objectives comprise body parts and respective elementary movements. In Table 27, the clinical objectives comprise body parts and respective task-oriented movements and movement characteristics. In Table 28, the clinical objectives comprise a body part, respective elementary movements and movement characteristics, and a general rehabilitation focus. In Table 29, the clinical objectives comprise body parts and respective elementary movements and movement ranges.

TABLE 26 First example of received clinical objectives.
Body part | Movement Description
Right Shoulder | Horizontal Abduction/Adduction
Right Elbow | Flexion/Extension

TABLE 27 Second example of received clinical objectives.
Body Part: Right arm
Movement Description: Crossing the midline: Contralateral to ipsilateral; Crossing the midline: Ipsilateral to contralateral; Reaching
Movement Characteristic: Speed (challenge level: 3/5); Discrete Precision; Reaction

TABLE 28 Third example of received clinical objectives.
Body Part: Shoulders
Movement Description: Internal/External Rotation; Horizontal Adduction/Abduction
Movement Characteristic: Speed
General Focus: Bi-lateral Coordination

TABLE 29 Fourth example of received clinical objectives.
Body Part | Movement Description | Range
Trunk | Forward Flexion | 50°
Trunk | Lateral Flexion | 30°
Elbows | Flexion/Extension |

In an embodiment in which the clinical objectives comprise at least one functional task to be executed by the patient, the method 150 further comprises a step of breaking down the functional task into a sequence of elementary movements and/or task-oriented movements.

In one embodiment, the method 150 further comprises a step of receiving an input indicative of a time of play from the medical professional. For example, the input indicative of the time of play may comprise a number of movement actions in a level, a number of levels in an activity, or the like. In another embodiment, the time of play, such as the number of movement actions in a level or the number of levels in an activity, may be set randomly within a given range.

Referring back to FIG. 5, the method 150 further comprises a step of determining a scenario that corresponds to the received clinical objectives, and movement rules that are adapted to the scenario. In one embodiment, more than one scenario is determined for the received clinical objectives, and a given one of the determined scenarios is selected.

In one embodiment, a plurality of scenarios are stored in a database which further comprises movement rules for each scenario. In this case, the elementary movements or task-oriented movements contained in the received clinical objectives, along with the body parts to which they apply, are compared to the movements listed in the movement rules for each scenario contained in the database. If all of the elementary movements or task-oriented movements correspond to the movements listed in a given set of movement rules, then the scenario corresponding to the given set of movement rules is selected as being adapted for the received clinical objectives.
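
This selection can be sketched as a set-coverage test, assuming each stored scenario carries the set of body-part/movement pairs expressible by its movement rules. The database contents below are placeholders, not the actual movement rules of Tables 3 to 6.

```python
# Illustrative sketch only: select every scenario whose movement rules cover
# all body-part/movement pairs contained in the clinical objectives.

SCENARIO_DB = {
    "first scenario": {("Right Shoulder", "Horizontal Abduction"),
                       ("Right Shoulder", "Horizontal Adduction"),
                       ("Right Elbow", "Flexion"),
                       ("Right Elbow", "Extension")},
    "second scenario": {("Trunk", "Forward Flexion"),
                        ("Trunk", "Lateral Flexion")},
}

def matching_scenarios(objectives):
    return [name for name, movements in SCENARIO_DB.items()
            if objectives <= movements]        # subset test: full coverage

objectives = {("Right Shoulder", "Horizontal Abduction"),
              ("Right Elbow", "Flexion")}
print(matching_scenarios(objectives))          # ['first scenario']
```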

In an embodiment in which the received clinical objectives comprise a general rehabilitation focus and the scenarios stored in the database also comprise a corresponding general rehabilitation focus, the selection of the adequate scenarios may be further based on the general rehabilitation focus. In this case, the general rehabilitation focus of the received clinical objectives is compared to the general rehabilitation focus of the scenarios stored in the database. If a match is found, then the corresponding scenario is selected as being adapted to the received clinical objectives.

In one embodiment, the scenario selection method based on the general rehabilitation focus may be used to discriminate between the scenarios that have been selected based on the elementary movements or task-oriented movements. For example, if a given scenario that has been previously selected as being adequate based on its corresponding elementary movements or task-oriented movements has a general rehabilitation focus that does not match that of the clinical objectives, then the given scenario may be rejected. Otherwise, the given scenario is confirmed as being adapted for the received clinical objectives.

With reference to the first exemplary clinical objectives of Table 26, it is determined that the first exemplary scenario described above is adequate since the body parts and elementary movements comprised in the clinical objectives match the movement rules 1, 2, 7, and 8 for the first exemplary scenario illustrated in Table 3.

With reference to the second exemplary clinical objectives of Table 27, it is determined that the fourth exemplary scenario described above is adequate since the body parts and task-oriented movements comprised in the clinical objectives match the movement rules 1, 2, and 7 for the fourth exemplary scenario illustrated in Table 6.

With reference to the third exemplary clinical objectives of Table 28, it is determined that the second exemplary scenario described above is adequate since the body part and task-oriented movements comprised in the clinical objectives match the movement rules 1, 2, 3, and 4 for the second exemplary scenario illustrated in Table 4. The general rehabilitation focus of the third exemplary clinical objectives further matches that of the second exemplary scenario.

It should be understood that more than one scenario may be selected to match clinical objectives. For example, the fourth exemplary clinical objectives of Table 29 require two scenarios so that all objectives are covered. The second exemplary scenario is selected since its movement rules 5 and 6 illustrated in Table 4 satisfy the body-part to movement description correlation related to the elbows for the fourth clinical objectives. Furthermore, the third exemplary scenario is also selected since its movement rules 1, 4, and 6 satisfy the body-part to movement description correlation related to the trunk for the fourth clinical objectives.

It should also be noted that the first exemplary scenario may be selected instead of the second exemplary scenario for the fourth exemplary clinical objectives, since the first exemplary scenario satisfies the same required body-part to movement description correlation. When multiple scenarios match the same required body-part to movement description correlation, a given one of the multiple scenarios may be randomly selected. In another embodiment, the medical professional may be requested to select a given one of the multiple scenarios. In a further embodiment, several or all of the relevant scenarios may be selected. In this case, the activities generated according to these scenarios may be subsequently presented to a medical professional, who may select only one activity, or at least two activities, for the patient.

Referring back to FIG. 5, the next step 156 comprises determining the customizable properties/parameters of the selected scenario(s) to obtain a rehabilitation activity that may be converted into an interactive simulation. Movement events are first generated according to the movement rules for the selected scenario, and the customizable properties/parameters are then determined using the movement events.

In one embodiment, the relevant movement rules for the selected scenario are first identified. In order to accomplish this task, the specific movement rules that satisfy the input's body-part to movement description correlation are isolated.

With reference to the first exemplary clinical objectives, the first exemplary scenario is selected since its movement rules 1, 2, 7, and 8 satisfy the clinical objectives. In one embodiment, the movement events, each expressed as a 3D position, may be randomly generated, as long as the following conditions are met:

Any relevant scenario-specific rules are satisfied. In the case of the first exemplary scenario:

    • The (x,y,z) coordinates of the movement events do not exceed exterior boundary ranges. These ranges can be defined by the medical professional or be stored in a database;
    • Rules 5 or 6 can never be true in the same sequence of movement events as rules 7 or 8; and
    • The difference between movement events does not exceed a certain minimum. This ‘minimum difference’ can be defined by the medical professional or be stored in a database.

All the movement events together satisfy all the appropriate movement rules (in this case, rules 1, 2, 7, and 8).

In another embodiment, predetermined movement events are stored in a database for each scenario. More than one set of movement events may be stored in the database for a same scenario, and each set of movement events corresponds to a respective set of customizable property/parameter values.

Table 30 illustrates four different sets of movement events for the first exemplary scenario. The first set of movement events corresponds to food objects positioned along a horizontal line. The second set corresponds to food objects positioned according to a vertically-oriented square. The third set corresponds to food objects positioned according to a horizontally-oriented square. The fourth set corresponds to food objects positioned along a vertical line.

TABLE 30 Exemplary sets of predetermined movement events for the first exemplary scenario.
Pre-configured Selection | Shape of Food Object Pattern | Movement Event List (X-Y-Z positions)
1 | Horizontal Line | Event 1: −1, 0, 1; Event 2: 1, 0, 1; Event 3: −1, 0, 1
2 | Vertically-oriented Square | Event 1: −1, 1, 1; Event 2: 1, 1, 1; Event 3: 1, −1, 1; Event 4: −1, −1, 1; Event 5: −1, 1, 1
3 | Horizontally-oriented Square | Event 1: −1, 0, 1; Event 2: 1, 0, 1; Event 3: 1, 0, −1; Event 4: −1, 0, −1; Event 5: −1, 0, 1
4 | Vertical Line | Event 1: 0, 1, 1; Event 2: 0, −1, 1; Event 3: 0, 1, 1

In the present example, the sets 1 and 3 of predetermined movement events satisfy the relevant movement rules 1, 2, 7, and 8, and are therefore selected.

With reference to the second exemplary clinical objectives, the fourth exemplary scenario is selected since its movement rules 1, 2, and 7 satisfy the body-part to movement description correlation. In this case, the movement events may each be expressed as a 3D position and be randomly generated, as long as the following conditions are met:

Any relevant scenario-specific rules are satisfied. In the case of the fourth scenario:

    • The x,y,z coordinates of the movement events do not exceed exterior boundary ranges; and
    • The x,y,z coordinates of the movement events are integers between −1 and 1, inclusive.

All the movement events together satisfy all the appropriate movement rules (in this case, rules 1, 2, and 7).

With reference to the third exemplary clinical objectives, the second exemplary scenario is selected since its movement rules 1, 2, 3, and 4 satisfy the body-part to movement description correlation. In this case, the movement events may each be expressed as a 3D position and be randomly generated, as long as the following conditions are met:

Any relevant scenario-specific rules are satisfied. In the case of the second scenario:

    • The x coordinates of the movement events are integers between −1 and 1, inclusively;
    • The z coordinates of the movement events are integers between −1 and 0, inclusively;
    • The y coordinate of the movement events is always 0;
    • The odd number movement events in sequence correspond to the location of the falling objects;
    • The even number movement events in sequence correspond to the location of the pail; and
    • The even number movement events in sequence shall always be the same.

All the movement events together satisfy all the appropriate movement rules (in this case, rules 1, 2, 3, and 4).

With reference to the fourth exemplary clinical objectives, the third and fourth exemplary scenarios are selected since their movement rules together satisfy the body-part to movement description correlation. In this case, the movement events may each be expressed as a 3D position, and a sequence of movement events is generated for each selected scenario.

In the case of the third scenario, movement events may be automatically generated as long as the following conditions are met:

1—If ranges are not specified:

    • The x,z coordinates of the movement events do not exceed exterior boundary ranges;
    • The y value always remains 0; and
    • The difference between the movement events does not exceed a certain minimum. The ‘minimum difference’ can be defined by the clinician or be stored in a database.

2—If ranges are specified, as in the present example:

A movement range table, such as the one illustrated in Table 31, is consulted in order to convert the angle to the movement event value.

TABLE 31 Movement range table for the third exemplary scenario.
Property of User-controlled object | Body part | Movement type | Value #1 | Value #2 | Corresponding Angle for value #1 | Corresponding Angle for value #2
Depth (Z) Movement Plane | Trunk | Forward Flexion | 0 | 1 | 0° | 90°
Horizontal (X) Movement Plane | Trunk | Lateral Flexion | 0 | (+/−)1 | 0° | 90°

Using these two ranges, the angle values specified in the clinical objectives are mapped to their corresponding movement event values (within the virtual space) of the user-controlled object property using a range mapping formula. One example of such a range mapping function is as follows:

t = b1 + (s - a1)(b2 - b1)/(a2 - a1)

where a1 and a2 are angle values, b1 and b2 are the corresponding values of the user-controlled object property, s is comprised in the range [a1, a2], and t is comprised in the range [b1, b2]. The value s included in the range [a1, a2] is linearly mapped to the value t in the range [b1, b2].

Assuming [t] represents the desired Z value, [b1] represents trunk forward flexion value 1, [b2] represents trunk forward flexion value 2, [a1] represents the corresponding angle for value 1, [a2] represents the corresponding angle for value 2, and S represents the angle input, this would result in the following calculation:


[desired Z value]=0+((50−0)(1−0))/(90−0)


[desired Z value]=0.56

Therefore, the Trunk Forward Flexion angle requirement of 50° would render a Z value of 0.56.

Assuming [t] represents the desired X value, [b1] represents trunk lateral flexion value 1, [b2] represents trunk lateral flexion value 2, [a1] represents the corresponding angle for value 1, [a2] represents the corresponding angle for value 2, and S represents the angle input, this would result in the following calculation:


[desired X value]=0+((30−0)((+/−)1−0))/(90−0)


[desired X value]=+/−0.33

Therefore, the Trunk Lateral Flexion angle requirement of 30° would render an X value of +/−0.33.
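
The range mapping formula and the two worked calculations above condense into a small function; the name is illustrative.

```python
# Illustrative sketch only: linear range mapping of an angle to a movement
# event value of the user-controlled object property.

def range_map(s, a1, a2, b1, b2):
    """Linearly map s from the range [a1, a2] to the range [b1, b2]."""
    return b1 + (s - a1) * (b2 - b1) / (a2 - a1)

print(round(range_map(50, 0, 90, 0, 1), 2))  # 0.56, the desired Z value
print(round(range_map(30, 0, 90, 0, 1), 2))  # 0.33, the desired X value (+/-)
```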

In the case of the fourth exemplary scenario, the relevant scenario-specific rules that must be satisfied are as follows:

    • The x,y,z coordinates of the movement events do not exceed exterior boundary ranges as defined in the scenario;

    • The x,y,z coordinates of the movement events are integers between −1 and 1, inclusively; and

All the movement events together satisfy all the appropriate movement rules.

In one embodiment, the movement events correspond directly to customizable parameters of the scenario, and the value of each determined movement event is simply assigned to its corresponding customizable parameter. For example, in the fourth exemplary scenario, the movement events may correspond directly to the (x,y,z) position of the target objects. In this case, the determined positions for the movement events are each assigned to a respective target object.

In another embodiment, the movement events (outputted from the previous step) are converted into actual rehabilitation activity parameters, which may be read by the simulation generator and edited by a clinician.

For example, the second exemplary scenario requires that all odd number movement events be translated into horizontal and depth coordinates for the falling objects, and even number movement events be translated into horizontal and depth coordinates for the pail.

Table 32 illustrates exemplary movement events for the second exemplary scenario.

TABLE 32 Exemplary movement events for the second exemplary scenario.
# | Movement Event
1 | 1, 0, 0
2 | 0, 0, 0
3 | −1, 0, −1
4 | 0, 0, 0

In the second scenario, the movement events can be translated into vertical, horizontal, and depth positions, based on the following conversion Table 33:

TABLE 33 Conversion table for the movement events of the second exemplary scenario.
(X-Z) Position of object | Selection - Horizontal and Depth position of target objects
−1, 0 | Left Near
0, 0 | Middle Near
1, 0 | Right Near
−1, −1 | Left Far
0, −1 | Middle Far
1, −1 | Right Far

Consequently, based on the requirements of the second scenario, the following positions for the falling fruits and the pail are determined:

TABLE 34 Exemplary positions for the falling fruits and the pail.
Horizontal and Depth positions of falling fruit | Horizontal and Depth positions of Pail
Right Near | Middle Near
Left Far |
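
The conversion just performed can be sketched as follows, using the (X, Z)-to-label mapping of Table 33 and the odd/even rule of the second exemplary scenario; the variable names are illustrative.

```python
# Illustrative sketch only: translate movement events into activity
# parameters. Odd-numbered events place falling objects; even-numbered
# events place the pail.

CONVERSION = {(-1, 0): "Left Near", (0, 0): "Middle Near", (1, 0): "Right Near",
              (-1, -1): "Left Far", (0, -1): "Middle Far", (1, -1): "Right Far"}

events = [(1, 0, 0), (0, 0, 0), (-1, 0, -1), (0, 0, 0)]  # Table 32

fruit = [CONVERSION[(x, z)] for i, (x, y, z) in enumerate(events, 1) if i % 2]
pail = {CONVERSION[(x, z)] for i, (x, y, z) in enumerate(events, 1) if not i % 2}

print(fruit)  # ['Right Near', 'Left Far']   (falling fruit positions)
print(pail)   # {'Middle Near'}              (pail position)
```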

Once a value has been attributed to at least one of the customizable properties/parameters (the other customizable properties/parameters being assigned a predetermined value), the scenario is complete and referred to as a rehabilitation activity. The rehabilitation activity is then outputted. For example, the automatically generated activity may be stored in memory for further download by the patient, sent to the physician for consultation, or sent to the interactive simulation generator.

In one embodiment, the method 150 further comprises a step of making adjustments related to the required movement characteristics and difficulty. This is achieved by consulting the input related to movement characteristics.

For each movement characteristic listed within the input, an associated ‘quality of movement’ parameter is set. This can be achieved as follows:

If a challenge level is not expressed: selecting a “movement characteristic” value that falls between a predetermined ‘challenge threshold’ and a predetermined ‘upper-limit threshold’.

If a challenge level is expressed: selecting a “movement characteristic” value that falls within a sub-range between a predetermined ‘challenge threshold’ and a predetermined ‘upper-limit threshold’, the sub-range being defined by the inputted challenge level. Specifically, a lower ‘challenge level’ would result in a sub-range closer to the challenge threshold, and a higher challenge level would result in a sub-range closer to the ‘upper-limit threshold’.

The sub-ranges between the ‘challenge threshold’ and ‘upper-limit threshold’ (defined by the ‘challenge level’) can be derived through a linear or non-linear calculation.
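
This selection can be sketched as follows, assuming the two thresholds have already been resolved to raw values and that sub-ranges are derived linearly from a challenge level on a 1-to-5 scale, consistent with the examples that follow; the function is illustrative.

```python
# Illustrative sketch only: pick a 'quality of movement' requirement between
# the challenge threshold and the upper-limit threshold, optionally narrowed
# to a sub-range defined by a linear challenge level.
import random

def select_requirement(challenge_threshold, upper_limit,
                       challenge_level=None, levels=5):
    if challenge_level is None:
        # No challenge level expressed: anywhere between the two thresholds.
        return random.uniform(challenge_threshold, upper_limit)
    width = (upper_limit - challenge_threshold) / levels
    low = challenge_threshold + (challenge_level - 1) * width
    return random.uniform(low, low + width)    # the level's sub-range

# Example 3 below: thresholds of 10 and 35 cm/s with challenge level 2/5
# yield a speed requirement within the 15-20 cm/s sub-range.
print(select_requirement(10.0, 35.0, challenge_level=2))
```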

The following three examples describe how this method is used to generate activity configurations based on clinical objectives related to movement characteristics.

Example 1

If, for example, “speed” is listed as a clinical objective, the method may verify the patient's current ability in terms of speed, and adjust the speed requirement of the activity in order for it to be adequately challenging for the patient. It may be determined that, in previous performances, the patient has averaged 10 cm/second, the challenge threshold is 5%, and the upper-limit threshold is 30%. With this information, a speed requirement may be randomly selected between 10.5 and 13 cm/second.

Example 2

If, for example, “precision” is listed as a clinical objective, the method may verify the patient's current ability in terms of precision, and adjust the precision requirement of the activity in order for it to be adequately challenging for the patient. It may be determined that, in previous performances, the patient's average path deviation was 3 cm, the challenge threshold is 5%, and the upper-limit threshold is 30%. With this information, a precision requirement is randomly selected between 2.85 cm and 2.1 cm (it being assumed that the lower the precision requirement, the more challenging).

Example 3

If, for example, “speed (challenge level: 2/5)” is listed as a clinical objective, the method may verify that the ‘challenge threshold’ in terms of speed is 10 cm/second, and the upper-limit threshold in terms of speed is 35 cm/second. As the challenge level is 2/5, a speed requirement may be randomly selected within the sub-range of 15 to 20 cm/second.

It should be understood that the method 150 may be embodied as a system comprising a scenario determining module, a movement events determining module, and a scenario parameter determining module. The scenario determining module is adapted to receive the clinical objectives and determine a corresponding scenario and corresponding movement rules, as described above. The movement events determining module is adapted to receive the movement rules and determine movement events adapted for the selected scenario, as described above. The scenario parameter determining module is adapted to receive the selected scenario and the determined movement events, and determine the values for the scenario customizable parameters in order to obtain a complete rehabilitation activity.

While the present description refers to the steps of receiving a rehabilitation activity and receiving clinical objectives both from a medical professional, it should be understood that other embodiments may be possible. For example, the rehabilitation activity and/or the clinical objectives may be received from a performance analysis system, a processing machine, and/or the like.

Although the above description relates to specific embodiments as presently contemplated by the inventors, it will be understood that the invention in its broad aspect includes hardware and functional equivalents of the elements described herein. Moreover, although the invention has been described in a particular application, it should be understood that the invention may be used in various other applications.

Claims

1. A computer-implemented method for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise, comprising:

receiving one of a rehabilitation activity and executed movements performed by the user during the virtual rehabilitation exercise, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters;
determining movement rules corresponding to the one of the rehabilitation activity and the rehabilitation exercise; each one of the movement rules comprising a correlation between a given group consisting of at least a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement;
determining a sequence of movement events corresponding to the one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled object in the interactive environment, the given state corresponding to one of a beginning and an end of a movement;
determining a movement sequence comprising at least one of elementary movements and a task-oriented movement using the movement rules and the movement events; and
outputting the movement sequence.

2. The computer-implemented method of claim 1, wherein said receiving comprises receiving the rehabilitation activity.

3. The computer-implemented method of claim 2, wherein said determining movement rules comprises:

determining a rehabilitation scenario that corresponds to the received rehabilitation activity; and
retrieving from a database the movement rules that correspond to the determined rehabilitation scenario.

4. The computer-implemented method of claim 2, wherein said determining a sequence of movement events comprises determining the movement events from at least one of the input parameters.

5. The computer-implemented method of claim 2, wherein said determining a sequence of movement events comprises retrieving predefined movement events from a storing unit.

6. The computer-implemented method of claim 2, wherein said determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises:

determining movement segments from the movement events;
assigning a respective one of the movement rules to each one of the movement segments; and
assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

7. The computer-implemented method of claim 2, further comprising determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.

8. The computer-implemented method of claim 7, wherein said determining the at least one clinical objective comprises:

comparing a given one of the input parameters to a challenge threshold; and
when the given one of the input parameters is greater than the challenge threshold, identifying a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.

9-10. (canceled)

11. The computer-implemented method of claim 1, wherein said receiving comprises receiving the executed movements performed by a user during a rehabilitation exercise.

12. The computer-implemented method of claim 11, wherein said determining movement rules comprises retrieving movement rules corresponding to the rehabilitation exercise.

13. The computer-implemented method of claim 11, wherein said determining a sequence of movement events comprises retrieving a sequence of ordered movement events corresponding to the rehabilitation exercise.

14. The computer-implemented method of claim 11, wherein said determining a sequence of movement events comprises receiving unordered movement event triggers corresponding to the rehabilitation exercise, and ordering the unordered movement event triggers, thereby obtaining ordered movement events.

15. The computer-implemented method of claim 11, wherein said determining a movement sequence comprising at least one of elementary movements and a task-oriented movement comprises:

determining movement segments from the movement events;
assigning a respective one of the movement rules to each one of the movement segments; and
assigning at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

16. The computer-implemented method of claim 1, wherein the given group further comprises a direction of change for the property.

17. A system for analysing one of a rehabilitation activity and a performance of a user during a virtual rehabilitation exercise during which the user performs executed movements, the rehabilitation activity defining an interactive environment to be used for generating a simulation that corresponds to the virtual rehabilitation exercise, the rehabilitation activity comprising at least one virtual user-controlled element and input parameters, the system comprising:

a movement rules determining module for determining movement rules corresponding to one of the rehabilitation activity and the virtual rehabilitation exercise; each one of the movement rules comprising a correlation between a given group consisting of a property of the virtual user-controlled element and a body part, and at least one of a respective elementary movement and a respective task-oriented movement;
a movement events determining module for determining a sequence of movement events corresponding to one of the rehabilitation activity and the executed movements, each one of the movement events corresponding to a given state of the property of the virtual user-controlled object in the interactive environment, the given state corresponding to one of a beginning and an end of a movement; and
an elementary movement determining module for determining a movement sequence comprising at least one of elementary movements and a task-oriented movement using the movement rules and the movement events, and outputting the movement sequence.

18. The system of claim 17, wherein the movement rules determining module is adapted to receive the rehabilitation activity.

19. The system of claim 18, wherein the movement rules determining module is adapted to determine a rehabilitation scenario that corresponds to the received rehabilitation activity, and retrieve from a database the movement rules that correspond to the determined rehabilitation scenario.

20. The system of claim 18, wherein the movement rules determining module is adapted to determine the movement events from at least one of the input parameters.

21. The system of claim 18, wherein the movement events determining module is adapted to retrieve predefined movement events from a storing unit.

22. The system of claim 18, wherein the elementary movement determining module is adapted to:

determine movement segments from the movement events;
assign a respective one of the movement rules to each one of the movement segments; and
assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

23. The system of claim 18, further comprising a clinical objective module for determining and outputting at least one clinical objective corresponding to the received rehabilitation activity.

24. The system of claim 23, wherein the clinical objective module is adapted to:

compare a given one of the input parameters to a challenge threshold; and
when the given one of the input parameters is greater than the challenge threshold, identify a movement characteristic related to the given one of the input parameters as being the at least one clinical objective.

25-26. (canceled)

27. The system of claim 17, wherein the movement rules determining module is adapted to receive the executed movements performed by a user during a rehabilitation exercise.

28. The system of claim 27, wherein the movement rules determining module is adapted to retrieve movement rules corresponding to the rehabilitation exercise.

29. The system of claim 27, wherein the movement events determining module is adapted to retrieve a sequence of ordered movement events corresponding to the rehabilitation exercise.

30. The system of claim 27, wherein the movement events determining module is adapted to receive unordered movement event triggers corresponding to the rehabilitation exercise, and order the unordered movement event triggers in order to obtain ordered movement events.

31. The system of claim 27, wherein the elementary movement determining module is adapted to:

determine movement segments from the movement events;
assign a respective one of the movement rules to each one of the movement segments; and
assign at least one of the respective elementary movement and the respective task-oriented movement contained in the assigned movement rule to the each one of the movement segments.

32. The system of claim 17, wherein the given group further comprises a direction of change for the property.

Patent History
Publication number: 20160023046
Type: Application
Filed: Mar 13, 2014
Publication Date: Jan 28, 2016
Inventors: Mark Evin (Montreal), Julie Gueho (Montreal), David Schacter (Montreal), Alexis Youssef (Westmount), Sung Jun Bae (Montreal)
Application Number: 14/774,960
Classifications
International Classification: A63B 24/00 (20060101);