Computerized assessment system and method for assessing opinions or feelings

A computerized assessment system and method may be used to assess opinions or feelings of a subject (e.g., a child patient). The system and method may display a computer-generated face image having a variable facial expression (e.g., changing mouth and eyes) capable of changing to correspond to opinions or feelings of the subject (e.g., smiling or frowning). The system and method may receive a user input signal in accordance with the opinions or feelings of the subject and may display the changes in the variable facial expression in response to the user input signal. The system and method may also prompt the subject to express an opinion or feeling about a matter to be assessed.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of co-pending U.S. Provisional Patent Application Ser. No. 60/634,709, filed on Dec. 9, 2004, which is fully incorporated herein by reference.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

This invention was made with government support under SBIR grant No. 2 R44 NS042387-02 awarded by the National Institutes of Health. The Government has certain rights in the invention.

TECHNICAL FIELD

The present invention relates to methods for assessing opinions or feelings and more particularly, relates to a computerized assessment system and method for assessing opinions or feelings by allowing a subject to dynamically adjust a variable facial expression on a computer-generated face image.

BACKGROUND INFORMATION

Studies have been performed using static facial expressions to assess a child's self-reported pain. Instruments used in the studies include a small set of cartoon faces or photographs to represent different degrees of pain. In a common application, the Faces Pain Rating Scale, six cartoons are employed and the child (e.g., age 3 yrs or older) is told that the faces are pictures of someone who is very happy because he doesn't hurt at all, hurts just a little bit, hurts a little more, hurts even more, hurts a whole lot, and hurts as much as you can imagine. As shown in FIG. 1, each of the faces 110a-110f includes an outline of the head containing a mouth, nose, two eyes, and eyebrows. The faces 110a-110f may be distinguished one from the other by the shape of the mouth (smiling, neutral, or frowning), by the location of the eyebrows (curving down, curving up, or touching the tops of the eyes), and by the character of the eyes (light or dark, no tears, tears). The child is asked to choose the one static face that best describes how he/she is feeling at the moment.

These and related studies are described in various publications, incorporated herein by reference and identified as follows:

Beyer, J. E. (1984). The Oucher: A User's Manual and Technical Report. Evanston, Ill.: Judson Press.

Bieri, D., and others (1990). The Faces Pain Scale for the self-assessment of the severity of pain experienced by children: development, initial validation, and preliminary investigation for ratio scale properties. Pain, 41(2), 139-150.

Buchanan, L., Voigtman, J., & Mills, H. (1997). Implementing the agency for health care policy and research pain management pediatric guideline in a multicultural practice setting. J. Nurs. Care Qual., 11(3), 23-35.

Keck, J. F., Gerkensmeyer, J. E., Joyce, B. A., & Schade, J. G. (1996). Reliability and validity of the faces and word descriptor scales to measure procedural pain. Journal of Pediatric Nursing, 11(6), 368-374.

McGrath, P. A., & Gillespie, J. (2001). Pain assessment in children and adolescents (pp. 97-118). In D. C. Turk & R. Melzack (Eds.), Handbook of pain assessment, 2nd edition. The Guilford Press: New York.

McRae, M. E., Rourke, D. A., Imperial-Perez, F. A., Eisenrigh, C. M., & Ueda, J. N. (1997). Development of a research-based standard for assessment, intervention, and evaluation of pain after neonatal and pediatric cardiac surgery. Pediatric Nursing, 23(3), 263-271.

Sporrer, K. A., Jackson, S. M., Agner, S., Laver, J., & Abboud, M. R. (1994). Pain in children and adolescents with sickle cell anemia: a prospective study utilizing self-reporting. The American Journal of Pediatric Hematology/Oncology, 16(3), 219-224.

Tyler, D., Douthit, A., & Chapman, C. (1993). Toward validation of pain measurement tools for children: a pilot study. Pain, 52, 301-309.

West, N., Oakes, L., and others (1994). Measuring pain in pediatric oncology ICU patients. Journal of Pediatric Oncology Nursing, 11(2), 64-68.

Wong, D. L. (1999). Whaley & Wong's nursing care of infants and children (6th edition). St. Louis, Mo.: Mosby Year-Book.

Wong, D. L., & Baker, C. (1988). Pain in children: comparison of assessment scales. Pediatr. Nurs., 14(1), 9-17.

The static methods described in these publications raise questions about whether they fulfill the assumptions of an interval scale of measurement, employ an optimum mode of implementation, and provide a usable guide for the interpretation of results.

One measurement problem is that there is no continuous variable that is associated with the facial expression and that changes systematically as the emotion being depicted ranges from the extremes of “no hurt” to “hurts worst.” Thus, there is no objective measure associated with the emotional or painful intensity being indicated. Another problem is that the number and type of features depicted in each face are not always the same. In FIG. 1, for example, eyelids are not present in faces 110a-110c but are present in faces 110d-110f; and tears are present in face 110f but not in the other faces 110a-110e. Thus, there is no single change in the face that is uniquely linked to differences in the emotional intensity supposedly depicted. A further problem is that the Face Scale is asymmetric. There are two smiling faces 110a, 110b, one neutral face 110c, and three frowning faces 110d-110f. The investigators (Wong and Baker, 1988) who developed the Face Scale for measuring pain believe that the subjective differences between adjacent faces (in terms of pain) are equal, and hence, that the scale is an interval scale permitting the use of parametric statistics (e.g., means and standard deviations of ratings provided by groups of respondents). However, there appears to be no published (and peer reviewed) evidence to support this assumption.

One implementation problem is that the verbal instructions given to the child may not match the facial expression being expressed. For example, the face 110b labeled by the investigator as “hurts little bit” is depicted as smiling, not frowning. Another implementation problem is that such methods do not allow the child to continuously change or fine-tune their judgments before deciding on a final judgment. A further implementation problem is that static methods do not allow for the recording of dynamic changes over time. The specific nature of these dynamic changes may provide insight into the judgment strategy being followed by the child, as well as provide a sensitive measure of the child's response to treatment. Another implementation problem is that ratings associated with the selected faces are not automatically stored in a computer file for later analysis. The ratings must be entered into the computer by hand, thus increasing the time burden on the investigator or clinician, and increasing the chance of error in the input of data.

The confluence of the measurement and implementation problems suggests that the Faces Pain Rating Scale, as well as the closely related picture scales described in the works cited here, can only be considered a “nominal” scale in that no measure of quantitative difference can be inferred from the child's choice of one face over the others (Baird, J. C., & Noma, E. (1978). Fundamentals of Scaling and Psychophysics, Chap. 1. John Wiley & Sons: New York; Stevens, S. S. (1946). On the theory of scales of measurement. Science, 103, 677-680). In the case of assessing a child's pain, the nominal scale only allows one to make statements of the following sort: “the pain level represented by this number (face) is different from the pain level represented by some other number (face) in the series.” The relative intensity of pain cannot be assessed by comparing one face with another.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features and advantages will be better understood by reading the following detailed description, taken together with the drawings wherein:

FIG. 1 is an illustration of the static faces on a conventional faces pain rating scale.

FIG. 2 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions by allowing dynamic adjustment of a variable facial expression on a computer-generated face image, consistent with one embodiment of the present invention.

FIG. 3 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a desktop computer, consistent with one embodiment of the present invention.

FIG. 4 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a handheld computer, consistent with another embodiment of the present invention.

FIGS. 5 and 6 are schematic block diagrams of a computerized assessment system for assessing opinions or feelings, consistent with different embodiments of the present invention.

FIG. 7 is an illustration of a range of possible facial expressions of the computer-generated face image, consistent with one embodiment of the present invention.

FIG. 8 is a graphical illustration of an algorithm used to change a facial expression on the computer-generated face image, consistent with one embodiment of the present invention.

FIG. 9 is a flow chart illustrating a method of assessing opinions or feelings, consistent with one embodiment of the present invention.

FIGS. 10A-10C are illustrations of computer-generated face images together with different graphical representations of toys being assessed, consistent with one application of the present invention.

FIGS. 11A-11C are illustrations of computer-generated face images together with different graphical representations of activities in a medical facility being assessed, consistent with another application of the present invention.

FIG. 12 is an illustration of a computer-generated face image together with a video being assessed, consistent with a further application of the present invention.

DETAILED DESCRIPTION

Referring to FIG. 2, a computerized assessment system 200 may be used to assess feelings or opinions of a subject by allowing the dynamic adjustment of a variable facial expression on a computer-generated face image 210. The computerized assessment system 200 may be used to assess any feelings or opinions of a subject including, but not limited to, pain, anxiety, fear, happiness/sadness, pleasure/displeasure, and likes/dislikes. The computerized assessment system 200 may be used to assess feelings or opinions of a child or other individual (e.g., an adult with a reading disability). The computerized assessment system 200 may be used by the subject alone or together with one or more individuals conducting the assessment. One exemplary application for the computerized assessment system 200 is to assess the pain felt by a child in a medical facility. Other applications are described in greater detail below.

The computerized assessment system 200 may include a display 202 for displaying the computer-generated face image 210 and a user input device 204 for controlling the dynamic adjustment of the variable facial expression on the face image 210. The user input device 204 may provide user input signals that cause the display 202 to change the facial expression on the face image 210. The face image 210 may include a two-dimensional schema of at least a head 212, mouth 214, eyes 216, and a nose 218, which may be proportionally sized and translated within the x-y plane. Those skilled in the art will recognize that the face image 210 may be represented in other ways (e.g., a more detailed three-dimensional image).

According to one embodiment, the variable facial expression of the face image 210 may be dynamically adjusted by changing the upward and downward curvature of the line representing the mouth 214 (i.e., smiling and frowning) and the opening and closing of the lines representing the eyes 216. These facial features may be changed without changing the circle representing the head 212 and the line representing the nose 218. The user input device 204 may include one or more controls 230, 232 to control the dynamic changes in the expression of the face image 210. For example, one control 230 may provide a positive user input signal that causes the face 210 to change positively (e.g., mouth 214 curves upward and eyes 216 open) and another control 232 may provide a negative user input signal that causes the face 210 to change negatively (e.g., mouth 214 curves downward and eyes 216 close). The controls 230, 232 may include up and down arrows corresponding to the direction of movement. Those skilled in the art will recognize that other features may also be dynamically adjusted (e.g., changing eyebrows or tears from the eyes).

FIGS. 3 and 4 show computerized assessment systems 300, 400 consistent with different embodiments of the present invention. The computerized assessment systems 300, 400 may stand alone or may be coupled to a network, for example, using either a wired or wireless connection.

The computerized assessment system 300, consistent with one embodiment, may be implemented using a personal computer (e.g., a desktop or a laptop computer) including a display 302 and one or more input devices 304, 304a (e.g., a mouse, a keyboard, a joystick, or a separate remote control device) coupled to the personal computer. The user (e.g., a subject being assessed or an individual conducting the assessment) may depress controls (e.g., buttons 330, 332 on a mouse or keys 330a, 332a on a keyboard) on the user input device 304, 304a to change the expression of a computer-generated face image 310 displayed on the computer display 302. Although a mouse user input device 304 and a keyboard user input device 304a are shown, the user input device may also include a remote control or a wireless device (not shown) communicating with the personal computer using a wireless protocol. The user input device may also include a joystick (not shown) that the user moves in different directions to provide the user input signals.

The computerized assessment system 400, consistent with another embodiment, may be implemented using a handheld computer (e.g., a personal digital assistant) including a display 402 and an input device 404 located on the handheld computer. For example, the display 402 may be a touch screen display, and the input device 404 may include control images 430, 432 (e.g., arrows) displayed on the display 402. The user may touch the control images 430, 432 on the display 402 (e.g., using a stylus) to change the expression of a computer-generated face image 410 displayed on the display 402. Alternatively, the input device 404 may be implemented using other controls (e.g., keys, push buttons, rollers) located on the handheld computer.

The computerized assessment system and method may also prompt the subject to provide an assessment in response to a target stimulus, for example, by providing a visual or audible representation of an item, activity, or concept for which the assessment is to be made. The exemplary computerized assessment systems 300, 400, for example, may display an image 320, 420 on the display 302, 402 in proximity to the computer-generated face image 310, 410. The image 320, 420 may be a photograph, drawing or video depicting the item, activity or concept. Alternatively or additionally, the computerized assessment systems 300, 400 may play an audio clip describing or representing an item, activity, or concept. According to a further alternative, a separate image or audio representation may be provided (e.g., using another device) instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300, 400. An individual may also provide a verbal query or description of an item, activity, or concept instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300, 400. Examples of items, activities, or concepts that may be assessed are described in greater detail below.

Referring to FIGS. 5 and 6, the computerized assessment system may be implemented using software executed by a computing device. In one embodiment, the assessment software 520 resides on a stand-alone general purpose computer 510 (FIG. 5), such as a PC or handheld computer, which allows the user to access the software 520. Files including images, videos, or audio clips used to prompt the assessment may also be stored on the general purpose computer 510. A display 502 for displaying the computer-generated face image and a user input device 504 for controlling the variable facial expression may be coupled to the stand-alone general purpose computer 510.

In another embodiment, the assessment software 620 resides on a server computer 612 (FIG. 6) and is accessed using a computer 610 connected to the server computer 612 over a data network 630, such as a local area network, a wide area network, an intranet, or the Internet. Files including images, videos, or audio clips used to prompt the assessment may also be stored on the server computer 612. A display 602 for displaying the computer-generated face image and a user input device 604 for controlling the variable facial expression may be coupled to the general purpose computer 610.

The software 520, 620 can be implemented to perform the functions described herein using programming techniques known to a programmer of ordinary skill in the art. For example, the assessment software 520 on the stand-alone computer 510 can be developed using a programming language such as Basic, and the assessment software 620 residing on the server computer 612 can be developed using a programming language such as Java.

Embodiments of the software may be implemented as a computer program product for use with a computer system. Such implementation includes, without limitation, a series of computer instructions that embody all or part of the functionality described herein with respect to the assessment system and method. The series of computer instructions may be stored in any machine-readable medium, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies. It is expected that such a computer program product may be distributed as a removable machine-readable medium (e.g., a diskette, CD-ROM), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web). Alternative embodiments of the invention may be implemented as pre-programmed hardware elements or as a combination of hardware, software and/or firmware.

FIG. 7 illustrates seven of the numerous possible expressions of a computer-generated face image 710a-710g, consistent with one embodiment of the present invention. By activating the controls (e.g., up and down arrows) of a user input device, the user (e.g., a child subject) can change the mouth in successive steps from a state of smiling (e.g., face image 710a) to a neutral state (e.g., face image 710d) to a state of frowning (e.g., face image 710g). At the same time, the eyes may vary from completely open (associated with the smiling face image 710a) to half open (associated with neutral face image 710d) to completely closed (associated with frowning face image 710g). The user may change the facial expression in either direction (positive or negative) multiple times until satisfied.

In one application, the background color of the face image (e.g., white) and the outline of the face and nose (e.g., blue) may not change with the facial expression. The color of the mouth and eyes may be associated with the expression that is being depicted. For example, all features of the face may be “blue” when the mouth is in the neutral position. For all expressions indicating a smile, the mouth and eyes may be “green”, and for all expressions indicating a frown, the mouth and eyes may be “red.” These colors can be changed so long as they do not interfere with the visibility of either the fixed or dynamic facial features. Those skilled in the art will recognize, however, that other colors are possible.

A computerized assessment system and method may also assign rating values to the different facial expressions and may record rating values associated with a selected facial expression. The ratings are thus associated with the opinion or feeling of the subject. FIG. 7 shows one example of scaled ratings associated with each of the facial expressions 710a-710g, although such rating values may or may not be displayed with the computer-generated face image. In this example, the rating values range from 99 to −99 with positive values associated with smiling faces 710a-710c and negative values associated with frowning faces 710e-710g. As shown, the expressions may be equally spaced (e.g., 33 units) along an objective scale. The ratings may be recorded, for example, by the computer used to make the assessment (e.g., general purpose computer 510 in FIG. 5) or by another computer on a network (e.g., server computer 612 in FIG. 6).
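The equal spacing in FIG. 7 can be checked with simple arithmetic: seven expressions spanning +99 to −99 imply six steps of 33 units each. A minimal sketch in Python (the language and constant names are illustrative, not part of the specification):

```python
# Hypothetical reconstruction of the FIG. 7 rating scale: seven
# expressions equally spaced from +99 (maximum smile) to -99
# (maximum frown), giving steps of (99 - (-99)) / 6 = 33 units.
NUM_EXPRESSIONS = 7
MAX_RATING = 99

step = 2 * MAX_RATING / (NUM_EXPRESSIONS - 1)
ratings = [round(MAX_RATING - i * step) for i in range(NUM_EXPRESSIONS)]
print(ratings)  # [99, 66, 33, 0, -33, -66, -99]
```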

Referring to FIG. 8, one method of generating changes in facial expression and assigning a rating indicating the subject's opinion or feeling is described in greater detail. An x-y coordinate system 810 may be situated with its origin (zero) at the center of a line 812a, 812b representing the mouth. A positive quadratic equation may be used to generate different degrees of smiling and a negative quadratic equation may be used to generate different degrees of frowning. The maximum smile (line 812a) and the maximum frown (line 812b) are depicted in FIG. 8 to illustrate the extremes of facial expressions that can be generated.

The degree of smiling may be indicated by different curvatures according to the following equation:
y = λx²   (1)
where x corresponds to a point along the mouth (i.e., along the x axis) and λ is a scalar multiplier. The value x may range from the left corner (minimum x value) to the center (zero x value) and out to the right corner (maximum x value). The variable y represents the vertical displacement of the mouth line (i.e., along the y axis) at each point from the left corner to the right corner, so the resulting curve may be essentially continuous depending on the step size for x. By changing the value of the scalar multiplier λ, the amount of smiling can be varied from neutral to its maximum. Changing the magnitude of the scalar multiplier λ in small incremental steps produces essentially continuous changes in expression, while larger incremental steps produce larger, discrete successive changes. The value of the scalar multiplier λ may be under user control, and the user input signals generated by a user input device may be used to increase and decrease the value. The step size determines the degree of change in the facial expression accompanying each activation of the input device.
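A minimal sketch of equation (1) follows, assuming a normalized mouth half-width of 1.0 and a y axis that points upward (many graphics toolkits use a downward y axis, in which case the sign is flipped); the function name and default step count are illustrative:

```python
# Generate (x, y) points along the mouth line for a given scalar
# multiplier lam, per equation (1). lam = 0 yields the neutral (flat)
# mouth; increasing lam deepens the smile.
def mouth_points(lam, half_width=1.0, steps=50):
    points = []
    for i in range(steps + 1):
        x = -half_width + i * (2 * half_width / steps)
        y = lam * x * x  # equation (1)
        points.append((x, y))
    return points

neutral = mouth_points(0.0)   # flat line
smile = mouth_points(0.8)     # corners curve upward
```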

In the same manner, a quadratic equation may be used to generate different degrees of frowning, except that the sign of the expression is negative, as shown in the following equation:
y = −λx²   (2)
For this case, as the scalar λ increases, the degree of frowning approaches its maximum.

According to this method, the value of the scalar λ (or some number that is a transformation of λ) may be used as a rating value. In this manner, the range of scalar values from +λ to −λ may represent an entire spectrum of opinions or feelings from extremely positive (e.g., no pain, happiness, pleasure) to extremely negative (e.g., pain, sadness, displeasure). Thus, at each point in time during the adjustment process, the value of the scalar indicates the rating, and the ratings may change dynamically with the changes in the facial expression. When the subject stops changing the facial expression, the final rating may be taken to indicate the user's final opinion or feeling regarding the target stimulus or the concept represented by the target stimulus. In one embodiment, a convenient range of values for the ratings can be obtained by the appropriate scaling of the scalar λ to create a range of possible scores or ratings from −100 to +100.

The opening and closing of the eyes on the face image may be scaled in magnitude with the smiling and frowning of the mouth. The eyes may thus range from ‘completely open’ when the mouth shows maximum smiling, to ‘half open’ when the mouth is neutral, and then to ‘completely closed’ when the mouth shows maximum frowning.
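Both the rating and the coupled eye state may thus be derived from the current value of λ. The sketch below assumes λ runs over a symmetric range [−LAM_MAX, +LAM_MAX], with negative values producing frowns per equation (2); the bounds and function names are assumptions for illustration:

```python
# Map the current lambda to a rating in [-100, +100] and to an eye
# openness in [0.0, 1.0], scaled in magnitude with the mouth.
LAM_MAX = 1.0

def rating_from_lambda(lam):
    return round(100 * lam / LAM_MAX)

def eye_openness(lam):
    # 1.0 = fully open (maximum smile), 0.5 = half open (neutral),
    # 0.0 = completely closed (maximum frown)
    return 0.5 + 0.5 * (lam / LAM_MAX)

assert rating_from_lambda(0.0) == 0 and eye_openness(0.0) == 0.5
```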

Those skilled in the art will recognize that other techniques may be used to dynamically change the facial expression. For example, a series of still images may be displayed, each having an incremental change in the facial feature(s) (e.g., the mouth and eyes). The still images may be displayed sequentially to the user such that the mouth and/or eyes dynamically change. Each of the still images may be associated with a rating value.
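A sketch of this still-image alternative, assuming seven pre-rendered frames paired with the FIG. 7 rating values (the file names and helper function are hypothetical):

```python
# Frames ordered from maximum smile to maximum frown, each paired
# with its associated rating value.
frames = [("face_smile_max.png", 99), ("face_smile_mid.png", 66),
          ("face_smile_min.png", 33), ("face_neutral.png", 0),
          ("face_frown_min.png", -33), ("face_frown_mid.png", -66),
          ("face_frown_max.png", -99)]

def step_frame(index, direction):
    # direction = +1 for a positive input (toward the smile end of
    # the list), -1 for a negative input; the index is clamped.
    return max(0, min(len(frames) - 1, index - direction))

index = 3                        # start at the neutral frame
index = step_frame(index, +1)    # positive input
image, rating = frames[index]    # ("face_smile_min.png", 33)
```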

Referring to FIG. 9, one method of assessing opinions or feelings is described. The assessment system displays 910 the computer-generated face image including the variable facial expression. Initially, the computer-generated face image may be displayed with a neutral expression (e.g., face image 710d in FIG. 7). The subject being assessed may be prompted 912 to express an opinion or feeling on a particular matter (e.g., an item, activity or concept). As described above, the assessment system or an individual performing the assessment may prompt the subject. One example of a prompt is a single verbal query, such as “How does the bump on your head make you feel?” or “How does playing with a dog make you feel?” Another example of a prompt is a nonverbal auditory event (e.g., a barking dog, thunder). A further example of a prompt is a photograph or drawing (e.g., a picture of a person, a toy, or a food type). Yet another example of a prompt is a changing display, such as a video and/or audio depiction of an event that unfolds over time.

The assessment system may then receive 914 a user input signal provided by the subject based on the subject's opinion or feeling. If the subject feels more positive than the computer-generated face displayed initially, for example, the subject may provide a positive input signal (e.g., using a control with an up arrow). If the subject feels more negative than the computer-generated face displayed initially, the subject may provide a negative input signal (e.g., using a control with a down arrow). In response to the user input signal, the variable facial expression on the computer-generated face changes 916. The user input signal may be received repeatedly and the facial expression may change dynamically as the user makes adjustments until the facial expression of the computer-generated face corresponds to the subject's opinion or feeling on a particular matter. The system thus allows the subject to adjust the expression of the face and to continue making adjustments until satisfied that the expression (e.g., one of the expressions shown in FIG. 7) represents their opinions or feelings about a particular matter (e.g., item, activity, concept).
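A minimal console sketch of this display-prompt-adjust loop, built on the λ-based expression of equations (1) and (2); the prompt text, key bindings, and step size are illustrative assumptions, not taken from the specification:

```python
# Adjust lambda up or down in response to user input until the
# subject confirms that the expression matches his or her feeling.
LAM_MAX, STEP = 1.0, 0.1

def run_assessment(prompt):
    lam = 0.0  # begin with a neutral expression
    print(prompt)
    while True:
        print(f"current rating: {round(100 * lam / LAM_MAX):+d}")
        key = input("u = more positive, d = more negative, q = done: ")
        if key == "u":
            lam = min(LAM_MAX, lam + STEP)    # toward maximum smile
        elif key == "d":
            lam = max(-LAM_MAX, lam - STEP)   # toward maximum frown
        elif key == "q":
            return round(100 * lam / LAM_MAX) # final rating

# rating = run_assessment("How does the bump on your head make you feel?")
```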

The assessment system may also store or record 918 a rating value associated with the displayed facial expression and thus the subject's opinion or feeling. The rating value may be stored or written to a data file for later analysis. The data file may be given a code name for each user or subject for whom ratings are recorded. The rating values may be recorded together with information identifying the visual and/or audible representation prompting the assessment. If multiple stimulus conditions are employed in the same test trial (e.g., feelings about each of a series of pictures are expressed), for example, each rating value and its associated picture label may be stored in the file for later analysis.

In addition to storing a final rating value, rating values may be recorded in discrete steps as the subject changes the facial expression to permit the rating process to be reviewed and evaluated by an investigator at a later point in time. In addition, each keystroke or user input can be stored in a file, as well as the time associated with each input. The assessment system and method may thus recreate the dynamic changes leading up to each final rating and the time required to make each change in a rating. Such information may prove valuable when attempting to understand the process used by the subject in making the assessment.
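A sketch of such per-input recording, assuming a simple CSV layout (subject code, stimulus label, timestamp, input, resulting rating); the field order and file name are assumptions:

```python
# Append one row per user input so the dynamics of the judgment can
# be replayed and timed later.
import csv
import time

def log_input(path, subject_code, stimulus_label, key, rating):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [subject_code, stimulus_label, time.time(), key, rating])

# Example (hypothetical values):
# log_input("ratings.csv", "S01", "teddy bear", "u", 33)
```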

According to one application of the system, illustrated in FIGS. 10A-10C, a child may express opinions or feelings (e.g., likes and dislikes) about the desirability of different toys. A series of toys may be presented singly, for example, in an image adjacent to the computer-generated face image. Alternatively, the actual toys may be presented to the child. The child may adjust the facial expression (e.g., the shape of the mouth and simultaneously the eyes) to indicate their level of like or dislike for the toy. A toy manufacturer could use this method to pre-test the desirability of different products from the standpoint of the child before deciding upon which toys should be marketed to the public. Those toys that receive high positive ratings from a group of children might be good candidates for the marketplace. Those toys that receive negative ratings might not be marketed or might be withdrawn from the market in the event they had already been offered for sale. In the present example, the child clearly likes the “teddy bear” (FIG. 10A), likes the “robot” less (FIG. 10B), and does not like the “puppet” (FIG. 10C).

According to another application of the system, illustrated in FIGS. 11A-11C, a child may express opinions or feelings (e.g., pleasure and displeasure) regarding different aspects of his or her hospital experience as a patient. A series of hospital scenes (e.g., a play room, food service, and needle injection) may be presented singly, for example, in cartoon images next to the computer-generated face image. The child may adjust the facial expression (e.g., mouth and eyes) to express a level of pleasure or displeasure with the scene depicted in the cartoon. In the present example, the child expresses a strong positive opinion about the scene where the little girl is playing with the therapy dog (FIG. 11A), a weaker positive opinion about a hospital meal (FIG. 11B), and a very negative opinion about receiving a needle injection (FIG. 11C). Other cartoons might depict the child's room in the hospital, an attending doctor or nurse, the “X-ray room” or the “operating room.”

An assessment may thus be obtained from the child without requiring an intervention by adult relatives, guardians or friends who may not be able to accurately determine the nature (positive or negative valence) of the child's experiences. Information regarding patient opinions could assist a hospital staff in making improvements in their service and in anticipating problems they might encounter in accommodating younger patients. It has become increasingly important for hospitals to assess the opinions of their patients about their treatment and care during hospital stays. Embodiments of the present invention may allow a hospital to acquire information pertaining to different aspects of their service and facilities, as seen through the eyes of the young patient.

According to another application, shown in FIG. 12, a child may express opinions or feelings about a topic or activity depicted in a video displayed (e.g., on a computer or television monitor) beside the face image. Sound (e.g., voices, music, etc.) can also accompany the visual display. As the video is played, the child makes adjustments of the facial expression to indicate pleasure or displeasure about the contents of the video. The child may adjust the facial expression of the face over time in order to represent a continuous sequence of varying opinions regarding scenes that are portrayed in the video. Because every user input (e.g., key stroke) leads to a change in the rating, it is possible to record ratings over the course of whatever events are depicted in the video.

In one application, the video may depict a short story of a child entering the hospital, being admitted, arriving at a patient room, having X-rays taken, lying sick in bed, and then later being discharged. A child patient being tested at the computer may be asked to express how they felt throughout the video as different scenes were played out. The child may change the facial expression according to how various parts of the story made him/her feel. This information could be used by hospital personnel to help identify those aspects of a child's hospital stay that were judged favorably and unfavorably.

In another application, a television producer may wish to pretest an educational video program intended for children by determining which parts of a storyline were well received and which parts were not well received. An iterative procedure may be followed whereby the story is shown and a group of children are tested using the computerized assessment system. After analyzing the data, the producer may wish to modify or omit the scenes that did not elicit the intended emotional response. This may be followed by tests on a new group of children. This procedure could continue until the children expressed the desired sequence of emotions over the course of the story, meeting the goals of the producer or of other interested individuals (e.g., parents or educators). Such information regarding children's opinions could assist television producers or educators in creating favorably-rated programs for children. This particular type of dynamic judgment over time may be cumbersome or impossible to implement if the child had to make repeated selections from a static, photographic series.

Although the exemplary embodiments describe a use for assessing children's opinions or feelings, the computerized assessment system and method may also be used to assess opinions or feelings of adult subjects. The computerized assessment system and method may be particularly useful for adults who are disabled or who cannot read. Those skilled in the art will also recognize numerous other applications for the computerized assessment system and method.

In summary, embodiments of the present invention include a computerized system and method for assessing opinions or feelings of a subject. Consistent with one embodiment of the invention, the computerized method includes displaying at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of the subject. The method also includes receiving at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject and displaying changes in the variable facial expression of the computer-generated face image in response to the user input signal. The variable facial expression changes dynamically until a selected facial expression is displayed.

The computerized assessment system includes a display configured to display at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of a subject. The system also includes a user input device configured to generate at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject. A computer is configured to change the variable facial expression of the computer-generated face image on the display in response to the user input signal.

While the principles of the invention have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the invention. Other embodiments are contemplated within the scope of the present invention in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention, which is not to be limited except by the following claims.

Claims

1. A computerized method for assessing opinions or feelings of a subject, said method comprising:

displaying at least one computer-generated face image, said computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of said subject;
receiving at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
displaying changes in said variable facial expression of said at least one computer-generated face image in response to said user input signal, wherein said variable facial expression changes dynamically until a selected facial expression is displayed.

2. The computerized method of claim 1 wherein a range of rating values are associated with said variable facial expression, and further comprising storing at least one rating value associated with at least one selected facial expression.

3. The computerized method of claim 1 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.

4. The computerized method of claim 3 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.

5. The computerized method of claim 1 further comprising prompting said subject to express an opinion or feeling about a matter.

6. The computerized method of claim 5 wherein prompting said subject includes displaying a visual representation of said matter in proximity to said computer-generated face image.

7. The computerized method of claim 6 wherein said visual representation includes at least one picture.

8. The computerized method of claim 6 wherein said visual representation includes at least one video.

9. The computerized method of claim 5 wherein prompting said subject includes playing an auditory representation of said matter about which said subject has an opinion or feeling.

10. The computerized method of claim 9 wherein said auditory representation includes at least one verbal message.

11. The computerized method of claim 9 wherein said auditory representation includes at least one nonverbal auditory event.

12. The computerized method of claim 5 wherein prompting the subject includes providing an auditory query regarding said matter about which the subject has an opinion or feeling.

13. The computerized method of claim 5 wherein prompting said subject includes asking said subject to express feelings of pain felt by said subject.

14. A machine-readable medium whose contents cause a computer system to perform a method for assessing opinions or feelings of a subject, said method comprising:

displaying at least one computer-generated face image, said computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of said subject;
receiving at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
displaying changes in said variable facial expression of said at least one computer-generated face image in response to said user input signal, wherein said variable facial expression changes dynamically until a selected facial expression is displayed.

15. The machine-readable medium of claim 14 wherein said method further comprises prompting said subject to express an opinion or feeling about a matter.

16. The machine-readable medium of claim 14 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.

17. The machine-readable medium of claim 16 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.

18. A computerized assessment system for assessing opinions or feelings of a subject, said system comprising:

a display configured to display at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of a subject;
a user input device configured to generate at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
a computer configured to change said variable facial expression of said computer-generated face image on said display in response to said user input signal.

19. The computerized assessment system of claim 18 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.

20. The computerized assessment system of claim 18 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.

Patent History
Publication number: 20060128263
Type: Application
Filed: Dec 8, 2005
Publication Date: Jun 15, 2006
Inventor: John Baird (Waterbury, VT)
Application Number: 11/297,498
Classifications
Current U.S. Class: 446/321.000
International Classification: A63H 3/12 (20060101);