Abstract: User interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, tightening muscles, mental activity, and other user actions are disclosed. User interaction concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices are disclosed. Apparatuses, systems, computer implementable methods, and non-transient computer storage media storing instructions, implementing the disclosed concepts, principles and algorithms are disclosed. Gestures for systems using eye gaze and head tracking that can be used with augmented, mixed or virtual reality, mobile or desktop computing are disclosed. Use of periods of limited activity and consecutive user actions in orthogonal axes is disclosed. Generation of command signals based on start and end triggers is disclosed. Methods for coarse as well as fine modification of objects are disclosed.
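The abstract mentions command signals bounded by start and end triggers, and periods of limited activity used as part of a gesture. A minimal sketch of how such a trigger-based gesture pipeline might work is below; all thresholds, signal names, and the dwell-arming behavior are illustrative assumptions, not the patented method itself.

```python
# Hypothetical sketch: a command signal bounded by a start trigger and an end
# trigger, with a period of limited activity (a "dwell") arming the gesture.
# All thresholds and event names are illustrative assumptions.
from dataclasses import dataclass, field

DWELL_FRAMES = 10      # consecutive low-motion frames required to arm
MOTION_EPSILON = 0.5   # motion magnitude below this counts as "limited activity"
EXPRESSION_ON = 0.7    # expression level acting as the start trigger
EXPRESSION_OFF = 0.3   # expression level acting as the end trigger

@dataclass
class TriggerStateMachine:
    armed: bool = False     # set after a period of limited activity
    active: bool = False    # True between the start and end triggers
    quiet_frames: int = 0
    commands: list = field(default_factory=list)

    def step(self, motion_magnitude: float, expression_level: float) -> None:
        # Count consecutive low-motion frames to detect limited activity.
        if motion_magnitude < MOTION_EPSILON:
            self.quiet_frames += 1
        else:
            self.quiet_frames = 0
        if self.quiet_frames >= DWELL_FRAMES:
            self.armed = True

        if self.armed and not self.active and expression_level >= EXPRESSION_ON:
            self.active = True
            self.commands.append("START")        # start trigger fires
        elif self.active and expression_level <= EXPRESSION_OFF:
            self.active = False                  # end trigger fires
            self.armed = False
            self.quiet_frames = 0
            self.commands.append("END")
        elif self.active:
            # Between triggers, body motion drives the command payload.
            self.commands.append(("MOVE", motion_magnitude))
```

A caller would feed one `(motion, expression)` sample per frame; using separate on/off thresholds gives the triggers hysteresis so small fluctuations in the expression level do not toggle the command stream.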
Abstract: A hands-free controller, a facial expression management system, a drowsiness detection system and methods for using them are disclosed. The controller monitors facial expressions of the user, monitors motions of the user's body, generates commands for an electronic device based on the monitored facial expressions and body motions, and communicates the commands to the electronic device. Monitoring facial expressions can include sensing facial muscle motions using facial expression sensors. Monitoring user body motions can include sensing user head motions. Facial expression management can include monitoring user facial expressions, storing monitored expressions, and communicating monitored expressions to an electronic device. Drowsiness detection can include monitoring eye opening of the user, generating an alert when drowsiness is detected, monitoring proper usage of the device, and generating a warning when improper usage is detected.
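The drowsiness detection described above monitors eye opening and raises an alert. One common way to realize such a check is a PERCLOS-style sliding-window measure; the sketch below assumes that approach, and its window size, thresholds, and signal names are illustrative assumptions rather than the disclosed method.

```python
# Hypothetical sketch of drowsiness detection from an eye-opening signal:
# an alert fires when the eyes are "closed" for too large a fraction of a
# sliding window of recent frames (a PERCLOS-style measure). The window
# size and thresholds are illustrative assumptions.
from collections import deque

EYE_CLOSED_BELOW = 0.2   # normalized eye opening treated as "closed"
WINDOW = 30              # frames in the sliding window
ALERT_FRACTION = 0.5     # alert when >= 50% of the window is "closed"

class DrowsinessMonitor:
    def __init__(self):
        # Each entry records whether the eye was "closed" on that frame.
        self.window = deque(maxlen=WINDOW)

    def update(self, eye_opening: float) -> bool:
        """Record one frame; return True when a drowsiness alert should fire."""
        self.window.append(eye_opening < EYE_CLOSED_BELOW)
        if len(self.window) < WINDOW:
            return False    # not enough history yet
        return sum(self.window) / WINDOW >= ALERT_FRACTION
```

Averaging over a window rather than reacting to single frames keeps ordinary blinks from triggering the alert, since a blink occupies only a few frames of the window.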