Abstract: Systems and methods for assessing human reaction to a stimulus using computer vision are described herein. A computer can compare a first facial image with a second facial image to identify a region of a subject's face where an expressional repositioning is evident. Based on the expressional repositioning, the computer can determine an emotion exhibited by the subject.
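As an illustration of the image-comparison step, the following minimal sketch assumes facial landmark coordinates have already been extracted from each image; the landmark indices, region groupings, displacement threshold, and region-to-emotion mapping are all hypothetical, not taken from the disclosure:

    import numpy as np

    # Hypothetical (x, y) landmark coordinates extracted from two facial
    # images; the indices are grouped into illustrative facial regions.
    REGIONS = {
        "brow":  [0, 1, 2, 3],
        "eyes":  [4, 5, 6, 7],
        "mouth": [8, 9, 10, 11],
    }

    # Illustrative mapping from a repositioned region to a candidate emotion.
    EMOTION_BY_REGION = {"brow": "surprise", "eyes": "fear", "mouth": "happiness"}

    def repositioned_regions(first, second, threshold=2.0):
        """Return regions whose mean landmark displacement exceeds threshold."""
        displacement = np.linalg.norm(second - first, axis=1)
        return [name for name, idx in REGIONS.items()
                if displacement[idx].mean() > threshold]

    first = np.random.rand(12, 2) * 100   # stand-in for first-image landmarks
    second = first.copy()
    second[8:12] += 5.0                   # simulate movement around the mouth
    for region in repositioned_regions(first, second):
        print(region, "->", EMOTION_BY_REGION[region])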
Abstract: Various systems and techniques using facial coding for emotional interaction analysis are described herein. Machine-readable facial observations of a subject, taken while the subject is exposed to a stimulus, can be received. The machine-readable observations can include a stimulus synchronization element. An emotional component of an emotional state of the subject can be determined based on the facial observations. The determination can include assigning a numerical weight to the emotional component. An emotional state can be assigned to the stimulus synchronization element based on the emotional component.
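A minimal sketch of the weighting and assignment steps follows; the observation tuples, the per-component weights, and the scene-style synchronization elements are assumptions for illustration only:

    from collections import defaultdict

    # Hypothetical facial observations, each tagged with the stimulus
    # synchronization element it accompanies: (element, component, score).
    observations = [
        ("scene_1", "joy", 0.8),
        ("scene_1", "surprise", 0.4),
        ("scene_2", "sadness", 0.6),
        ("scene_2", "joy", 0.1),
    ]

    # Illustrative numerical weights per emotional component.
    WEIGHTS = {"joy": 1.0, "surprise": 0.7, "sadness": 0.9}

    def states_per_element(obs):
        """Assign each synchronization element the component with the
        highest weighted score (an assumed selection rule)."""
        totals = defaultdict(lambda: defaultdict(float))
        for element, component, score in obs:
            totals[element][component] += WEIGHTS.get(component, 0.5) * score
        return {element: max(components, key=components.get)
                for element, components in totals.items()}

    print(states_per_element(observations))
    # {'scene_1': 'joy', 'scene_2': 'sadness'}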
Abstract: Systems and techniques using observed emotional data are described herein. A sequence of visual observations of a subject can be received during execution of an application. An emotional state of the subject can be determined based on the sequence of visual observations. Execution of the application can be modified from a baseline execution using the emotional state.
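One way to picture the modification step is a set of baseline execution parameters adjusted per detected state; the parameter names, emotional states, and adjustment values below are hypothetical:

    # Hypothetical baseline execution parameters for an application; the
    # adjustments applied per detected emotional state are illustrative.
    BASELINE = {"difficulty": 5, "hint_frequency": 0.2}

    ADJUSTMENTS = {
        "frustration": {"difficulty": -1, "hint_frequency": +0.3},
        "boredom":     {"difficulty": +1, "hint_frequency": -0.1},
    }

    def modified_execution(emotional_state):
        """Start from the baseline and apply per-state deltas, if any;
        unknown or neutral states leave the baseline unchanged."""
        params = dict(BASELINE)
        for key, delta in ADJUSTMENTS.get(emotional_state, {}).items():
            params[key] += delta
        return params

    print(modified_execution("frustration"))
    # {'difficulty': 4, 'hint_frequency': 0.5}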
Abstract: A method of assessing an individual through facial muscle activity and expressions includes receiving a visual recording, stored on a computer-readable medium, of an individual's non-verbal responses to a stimulus, the non-verbal responses comprising facial expressions of the individual. The recording is accessed to automatically detect and record expressional repositioning of each of a plurality of selected facial features by conducting a computerized comparison of the facial position of each selected facial feature through sequential facial images. The contemporaneously detected and recorded expressional repositionings are automatically coded to an action unit, a combination of action units, and/or at least one emotion. The action unit, combination of action units, and/or at least one emotion are analyzed to assess one or more characteristics of the individual and to develop a profile of the individual's personality in relation to the objective for which the individual is being assessed.
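The coding step can be sketched as a two-stage lookup; the feature names below are invented, while the action unit numbers and the AU6 + AU12 association with happiness follow published FACS conventions:

    # Hypothetical detected repositionings coded to FACS action units (AUs);
    # the feature names are invented, the AU numbers are standard FACS.
    FEATURE_TO_AU = {
        "lip_corner_pull": "AU12",  # lip corner puller
        "cheek_raise":     "AU6",   # cheek raiser
        "brow_lower":      "AU4",   # brow lowerer
    }

    # Emotions coded from AU combinations; AU6 + AU12 is commonly
    # associated with happiness in the FACS literature.
    AU_COMBINATIONS = {
        frozenset({"AU6", "AU12"}): "happiness",
        frozenset({"AU4"}): "anger-related",
    }

    def code_emotion(detected_features):
        """Map repositionings to AUs, then match AU combinations to an emotion."""
        aus = {FEATURE_TO_AU[f] for f in detected_features if f in FEATURE_TO_AU}
        for combo, emotion in AU_COMBINATIONS.items():
            if combo <= aus:
                return sorted(aus), emotion
        return sorted(aus), None

    print(code_emotion(["cheek_raise", "lip_corner_pull"]))
    # (['AU12', 'AU6'], 'happiness')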
Abstract: Systems and techniques for emotional modeling of a subject are described herein. Emotional data of a subject during exposure to a stimulus can be received. The emotional data can be interpreted to produce a result. The result can include an emotional model of the subject. The result can be presented to a user.
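As a minimal sketch, the emotional model below is a simple per-emotion intensity summary; the data tuples and the averaging interpretation are assumptions, one of many ways such a model could be produced:

    from statistics import mean

    # Hypothetical emotional data captured during exposure to a stimulus:
    # (timestamp_seconds, emotion, intensity).
    emotional_data = [
        (0.0, "joy", 0.2), (1.0, "joy", 0.6), (2.0, "surprise", 0.5),
    ]

    def build_emotional_model(data):
        """Interpret raw emotional data as mean intensity per emotion."""
        by_emotion = {}
        for _, emotion, intensity in data:
            by_emotion.setdefault(emotion, []).append(intensity)
        return {emotion: mean(vals) for emotion, vals in by_emotion.items()}

    # Present the resulting model to a user.
    for emotion, level in build_emotional_model(emotional_data).items():
        print(f"{emotion}: {level:.2f}")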
Abstract: The present disclosure relates to a method of assessing consumer reaction to a stimulus, comprising receiving a visual recording stored on a computer-readable medium of facial expressions of at least one human subject as the subject is exposed to a business stimulus so as to generate a chronological sequence of recorded facial images; accessing the computer-readable medium for automatically detecting and recording expressional repositioning of each of a plurality of selected facial features by conducting a computerized comparison of the facial position of each selected facial feature through sequential facial images; automatically coding contemporaneously detected and recorded expressional repositionings to at least a first action unit, wherein the action unit maps to a first set of one or more possible emotions expressed by the human subject; assigning a numerical weight to each of the one or more possible emotions of the first set based upon both the number of emotions in the set and the common emotions in
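A small worked example of the weighting step follows. Because the abstract breaks off mid-sentence, the exact scheme is unknown; the inverse-set-size weights below, accumulated for emotions common to more than one action unit's set, are an assumption consistent with the surviving text, and the AU-to-emotion sets are hypothetical:

    from collections import defaultdict

    # Hypothetical action-unit-to-emotion sets: each coded action unit
    # maps to a set of one or more possible emotions.
    AU_EMOTION_SETS = {
        "AU1": {"surprise", "fear", "sadness"},
        "AU5": {"surprise", "fear"},
    }

    def weight_emotions(detected_aus):
        """Weight each emotion by 1/len(set); emotions common to several
        detected action units accumulate weight (an assumed scheme)."""
        weights = defaultdict(float)
        for au in detected_aus:
            emotions = AU_EMOTION_SETS[au]
            for emotion in emotions:
                weights[emotion] += 1.0 / len(emotions)
        return dict(weights)

    print(weight_emotions(["AU1", "AU5"]))
    # surprise and fear each score 1/3 + 1/2; sadness scores only 1/3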
Abstract: A method of reporting consumer reaction to a stimulus, and a resultant report, generated by (i) recording facial expressions and eye positions of a human subject while the subject is exposed to a stimulus throughout a time period, (ii) coding the recorded facial expressions to emotions, and (iii) reporting the recorded eye positions and coded emotions, along with an identification of the stimulus.
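A minimal sketch of the resulting report follows; the record fields, sample values, and stimulus identifier are hypothetical:

    from dataclasses import dataclass

    # Hypothetical per-timestamp record pairing a gaze position on the
    # stimulus with the emotion coded from the contemporaneous expression.
    @dataclass
    class Sample:
        t: float             # seconds into the exposure period
        gaze: tuple          # (x, y) eye position on the stimulus
        emotion: str         # emotion coded from the facial expression

    def report(stimulus_id, samples):
        """Render eye positions and coded emotions alongside the stimulus ID."""
        lines = [f"Stimulus: {stimulus_id}"]
        lines += [f"  t={s.t:4.1f}s  gaze={s.gaze}  emotion={s.emotion}"
                  for s in samples]
        return "\n".join(lines)

    print(report("print_ad_07",
                 [Sample(0.5, (120, 80), "surprise"),
                  Sample(1.5, (300, 210), "happiness")]))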