Abstract: A method is provided for device interaction and identification by detecting similar or synchronous movements of two or more electronic devices. Movement data from the involved devices are compared so that the devices can interact with each other and detect when their movement or motion corresponds to certain multi-device gestures or activities.
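One way to detect the "similar or synchronous movements" described above is to correlate time-aligned motion streams from the two devices. The sketch below is a minimal, hypothetical illustration (not taken from the abstract): it computes the Pearson correlation of two acceleration-magnitude streams and applies an assumed similarity threshold.

```python
import math

def movement_similarity(a, b):
    """Pearson correlation of two equal-length, time-aligned streams of
    acceleration magnitudes from two devices.

    Returns a value in [-1, 1]; values near 1 suggest the devices are
    moving together (synchronously)."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    if var_a == 0.0 or var_b == 0.0:
        return 0.0  # a constant stream carries no movement information
    return cov / math.sqrt(var_a * var_b)

# Hypothetical threshold; a real system would tune this empirically.
THRESHOLD = 0.8

def devices_moved_together(a, b, threshold=THRESHOLD):
    """Decide whether two movement-data streams indicate a shared gesture."""
    return movement_similarity(a, b) >= threshold
```

In practice the streams would first be resampled to a common rate and cross-correlated over a small lag window to absorb clock offset between the devices; the fixed alignment here is a simplification.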
Abstract: The aim of the present invention is to provide a low-power gesture-control method for mobile and wearable devices that interact with target devices. Furthermore, this invention addresses the modality-switching problems known from the prior art, whereby one modality can be used to activate another.
Abstract: The aim of the present method is to provide an instant and easy-to-use solution that compensates for common problems in sensing and analyzing orientation data provided by hand-held electronic devices (for example smartphones, remote controls, tablets, wands, etc.) and, preferably, by wearable miniature devices (for example smart jewelry, smart watches, smart wristbands, smart rings, etc.). With the present method, drift in the orientation of the device, or error in the user's pointing direction, is correctly adjusted regardless of how the orientation data are managed.
Abstract: The aim of the present invention is to provide a method that solves the common drift problems and 3D orientation errors arising from the use of orientation data of a mobile or wearable device and a target system, and that enables the use of a mobile or wearable device as a Human-Machine Interface (HMI). Inertial Measurement Units (IMUs) and potentially other sensors (for example cameras and markers, or radar systems) supply the input data used to convert the user's motion into an interaction, a pointer on the screen, or a gesture. The contribution of this invention is a solution to well-known problems related to the use of IMUs and motion sensors as input devices for user interaction in general, as well as specific embodiments and application scenarios in which wearable and/or mobile devices are used to control specific interfaces.
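The drift problem mentioned above stems from integrating gyroscope rates: any bias accumulates without bound. A standard way to suppress it, shown here only as an illustrative sketch (the abstract does not disclose its specific algorithm), is a complementary filter that blends the integrated gyro angle with an absolute but noisy reference from the accelerometer. The blend weight `alpha` is an assumed tuning parameter.

```python
import math

def complementary_pitch(gyro_rates, accels, dt, alpha=0.98):
    """Fuse gyroscope pitch rate with an accelerometer pitch reference.

    gyro_rates: pitch angular rates in rad/s (may contain bias).
    accels:     (ax, az) gravity components per sample.
    alpha:      assumed blend weight; higher trusts the gyro short-term,
                while the accelerometer term bounds long-term drift.
    Returns the filtered pitch-angle trace in radians."""
    pitch = 0.0
    trace = []
    for rate, (ax, az) in zip(gyro_rates, accels):
        gyro_pitch = pitch + rate * dt      # gyro integration: drifts with bias
        accel_pitch = math.atan2(ax, az)    # absolute reference: noisy, no drift
        pitch = alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
        trace.append(pitch)
    return trace
```

With a constant gyro bias and a level device, pure integration grows linearly with time, while the filtered estimate settles near a small bounded value, which is exactly the drift correction the abstract targets.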
Abstract: Methods for instant, secure, and easy-to-use interaction with a mobile or wearable device are disclosed. The method provides high recognition accuracy and enables the implementation of numerous gestures.