Patents Assigned to TOBII AB
-
Publication number: 20190102905
Abstract: Head pose information may be determined using information describing a fixed gaze and image data corresponding to a user's eyes. The head pose information may be determined in a manner that disregards facial features with the exception of the user's eyes. The head pose information may be useable to interact with a user device.
Type: Application
Filed: September 28, 2018
Publication date: April 4, 2019
Applicant: Tobii AB
Inventor: Mårten Skogö
-
Publication number: 20190094963
Abstract: According to the invention, techniques for refining a ballistic prediction are described. In an example, an eye tracking system may record images over time of content presented on a display. Saccade data may be received and used as a trigger to retrieve particular ones of the recorded images. The eye tracking system may compare the images to identify a change in the content. The location of this change may correspond to a sub-area of the display. The output of the ballistic prediction may include a landing point that represents an anticipated gaze point. This landing point may be adjusted such that a gaze point is now predicted to fall within the sub-area when the change is significant.
Type: Application
Filed: November 26, 2018
Publication date: March 28, 2019
Applicant: Tobii AB
Inventor: Daan Nijs
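The adjustment described in this abstract can be sketched as a small geometric rule: if the detected content change is significant, snap the predicted landing point to the nearest point inside the changed sub-area. This is an illustrative sketch only; the names (`Rect`, `adjust_landing_point`, the `threshold` parameter) are assumptions, not Tobii's API.

```python
# Hypothetical sketch of adjusting a ballistic saccade landing point toward
# a display sub-area in which a significant content change was detected.
from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def clamp(self, px: float, py: float):
        """Return the point inside the rect that is closest to (px, py)."""
        cx = min(max(px, self.x), self.x + self.w)
        cy = min(max(py, self.y), self.y + self.h)
        return cx, cy


def adjust_landing_point(predicted, change_area: Rect,
                         change_magnitude: float, threshold: float = 0.2):
    """Keep the raw prediction unless the change is significant; otherwise
    move the predicted gaze point into the changed sub-area."""
    if change_magnitude < threshold:
        return predicted            # change too small: keep raw prediction
    return change_area.clamp(*predicted)
```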
-
Publication number: 20190086999
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
Type: Application
Filed: September 19, 2018
Publication date: March 21, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
-
Publication number: 20190087000
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding if a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
Type: Application
Filed: September 19, 2018
Publication date: March 21, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
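The update/pause behavior described here reduces to a small decision rule over the two gaze states. A minimal sketch, with assumed names and an assumed "hold" state for the remaining case the abstract does not spell out:

```python
# Hypothetical decision rule for the described shared-content behavior:
# keep updating only while both users look at their respective displays,
# pause as soon as the remote user's gaze leaves their display.
def content_action(local_gaze_on_display: bool,
                   remote_gaze_on_display: bool) -> str:
    if local_gaze_on_display and remote_gaze_on_display:
        return "update"          # both engaged: continuously update content
    if not remote_gaze_on_display:
        return "pause"           # remote user looked away: pause content
    return "hold"                # local user looked away: assumed no-op
```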
-
Publication number: 20190076014
Abstract: A method is disclosed, comprising obtaining a first angular offset between a first eye direction and a first gaze direction of an eye having a first pupil size, obtaining a second angular offset between a second eye direction and a second gaze direction of the eye having a second pupil size, and forming, based on the first angular offset and the second angular offset, a compensation model describing an estimated angular offset as a function of pupil size. A system and a device comprising a circuitry configured to perform such a method are also disclosed.
Type: Application
Filed: September 7, 2018
Publication date: March 14, 2019
Applicant: Tobii AB
Inventors: Mark Ryan, Simon Johansson, Erik Lindén
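With two (pupil size, angular offset) observations, the simplest compensation model the abstract admits is a line through both points. The linear form is an assumption for illustration; the abstract only requires some function of pupil size fitted from the two offsets.

```python
# Minimal sketch, assuming a linear compensation model fitted from the two
# observed (pupil_size, offset) pairs described in the abstract.
def fit_offset_model(p1: float, o1: float, p2: float, o2: float):
    """Fit offset(p) = a*p + b through the two calibration samples and
    return it as a callable."""
    a = (o2 - o1) / (p2 - p1)    # slope: offset change per unit pupil size
    b = o1 - a * p1              # intercept
    return lambda pupil_size: a * pupil_size + b


# e.g. 3 mm pupil -> 0.5 deg offset, 6 mm pupil -> 1.4 deg offset (made-up values)
model = fit_offset_model(3.0, 0.5, 6.0, 1.4)
```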
-
Publication number: 20190076015
Abstract: An eye tracking system having circuitry configured to perform a method is disclosed. An estimated radius (r) from an eyeball center to a pupil center in an eye is obtained, an estimated eyeball center position (e) in the eye in relation to an image sensor for capturing images of the eye is determined, an image of the eye is captured by means of the image sensor, and a position of a representation of the pupil center in the eye in the obtained image is identified. An estimated pupil center position (p′) is then determined based on the estimated eyeball center position (e), the estimated radius (r), and the identified position of the representation of the pupil center in the obtained image.
Type: Application
Filed: September 7, 2018
Publication date: March 14, 2019
Applicant: Tobii AB
Inventors: Simon Johansson, Mark Ryan
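Geometrically, one way to combine (e), (r), and the image observation is to treat the pupil center as lying on a sphere of radius r around the eyeball center, along the camera ray through the detected pupil position. The following is a sketch under that assumption; the function names and the camera-at-origin convention are illustrative, not Tobii's method as claimed.

```python
# Sketch: estimate the pupil center p' as the near intersection of the
# camera ray through the imaged pupil with the sphere |x - e| = r.
import math


def _normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)


def estimate_pupil_center(e, r, ray_dir):
    """Camera at origin (0,0,0); ray_dir points through the pupil's image
    position. Solve |t*d - e|^2 = r^2 and take the nearer root."""
    d = _normalize(ray_dir)
    b = sum(di * ei for di, ei in zip(d, e))   # d · e
    c = sum(ei * ei for ei in e) - r * r       # |e|^2 - r^2
    disc = b * b - c
    if disc < 0:
        return None                            # ray misses the eyeball sphere
    t = b - math.sqrt(disc)                    # nearer intersection (camera side)
    return tuple(t * di for di in d)
```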
-
Patent number: 10228763
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: September 12, 2018
Date of Patent: March 12, 2019
Assignee: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20190073026
Abstract: According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user.
Type: Application
Filed: August 22, 2018
Publication date: March 7, 2019
Applicant: Tobii AB
Inventors: Per Ola Kristensson, Keith Vertanen, Morten Mjelde
-
Publication number: 20190064513
Abstract: According to the invention, a system for tracking a gaze of a user across a multi-display arrangement is disclosed. The system may include a first display, a first eye tracking device, a second display, a second eye tracking device, and a processor. The first eye tracking device may be configured to determine a user's gaze direction while the user is gazing at the first display. The second eye tracking device may be configured to determine the user's gaze direction while the user is gazing at the second display. The processor may be configured to determine that the user's gaze has moved away from the first display in a direction of the second display, and in response to determining that the user's gaze has moved away from the first display in the direction of the second display, deactivate the first eye tracking device, and activate the second eye tracking device.
Type: Application
Filed: August 31, 2017
Publication date: February 28, 2019
Applicant: Tobii AB
Inventors: Farshid Bagherpour, Mårten Skogö
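The handover logic described above is essentially a two-state machine keyed on which way the gaze leaves the first display. A minimal sketch, assuming a horizontal side-by-side layout with display 2 to the right of display 1 (the layout and names are assumptions):

```python
# Hypothetical sketch of the described tracker handover: when gaze crosses
# from display 1 toward display 2, deactivate tracker 1 and activate tracker 2.
class TrackerHandover:
    def __init__(self):
        self.active = 1                     # tracker currently running

    def on_gaze_update(self, gaze_x: float, display1_right_edge: float) -> int:
        """Toggle the active tracker when gaze crosses the shared edge."""
        if self.active == 1 and gaze_x > display1_right_edge:
            self.active = 2                 # deactivate tracker 1, activate 2
        elif self.active == 2 and gaze_x <= display1_right_edge:
            self.active = 1                 # gaze returned: switch back
        return self.active
```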
-
Patent number: 10216994
Abstract: A method for panning content on a display of a wearable device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include determining, via a movement detection system, a head direction of the user. The method may further include, based at least in part on the gaze direction and the head direction both being consistent with a particular direction, causing content displayed on a display of the wearable device to be panned in the particular direction. The method may additionally include determining during panning of the content, via the eye tracking device, that the gaze direction of the user has returned to a neutral position. The method may moreover include, based at least in part on the gaze direction of the user returning to the neutral position, causing content displayed on the display to stop panning.
Type: Grant
Filed: November 20, 2017
Date of Patent: February 26, 2019
Assignee: Tobii AB
Inventors: Simon Gustafsson, Henrik Björk, Fredrik Lindh, Anders Olsson
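The panning conditions in this abstract can be condensed into a single decision function: pan only while gaze and head agree on a direction, and stop when gaze returns to neutral. The coarse direction labels are an assumption made for illustration, not the patent's representation.

```python
# Sketch of the described panning rule, with directions reduced to labels.
def pan_command(gaze_dir: str, head_dir: str, neutral: str = "center"):
    """Return the direction to pan in, or None to stop/not pan."""
    if gaze_dir == neutral:
        return None                 # gaze back at neutral: stop panning
    if gaze_dir == head_dir:
        return gaze_dir             # gaze and head consistent: pan that way
    return None                     # directions disagree: do nothing
```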
-
Patent number: 10216268
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; and using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Grant
Filed: June 2, 2016
Date of Patent: February 26, 2019
Assignee: Tobii AB
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennstrom, Andreas Edling
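The bitmap technique above trades a rasterization pass for O(1) gaze queries: each pixel covered by the projected zone stores the zone's property, so a gaze test is a single array lookup. A minimal 2D sketch with assumed names (the patent's projection of the 3D scene is simplified here to an axis-aligned rectangle):

```python
# Sketch of the zone-of-interest bitmap: rasterize the projected zone's
# property (here a zone id) into a 2D array, then answer gaze queries
# with one lookup.
def rasterize_zone(bitmap, zone_id, x0, y0, x1, y1):
    """Write zone_id into every bitmap pixel covered by the projected zone."""
    for y in range(y0, y1):
        for x in range(x0, x1):
            bitmap[y][x] = zone_id


def gaze_hits_zone(bitmap, gaze_x, gaze_y, zone_id):
    """One O(1) lookup decides whether the gaze lands on the zone."""
    return bitmap[gaze_y][gaze_x] == zone_id


width, height = 8, 8
bitmap = [[0] * width for _ in range(height)]
rasterize_zone(bitmap, zone_id=7, x0=2, y0=2, x1=5, y1=5)
```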
-
Patent number: 10212343
Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode, and to provide eye-tracking data which include eye position but not eye orientation.
Type: Grant
Filed: July 6, 2016
Date of Patent: February 19, 2019
Assignee: Tobii AB
Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
-
Publication number: 20190050644
Abstract: A method for panning content on a display of a wearable device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include determining, via a movement detection system, a head direction of the user. The method may further include, based at least in part on the gaze direction and the head direction both being consistent with a particular direction, causing content displayed on a display of the wearable device to be panned in the particular direction. The method may additionally include determining during panning of the content, via the eye tracking device, that the gaze direction of the user has returned to a neutral position. The method may moreover include, based at least in part on the gaze direction of the user returning to the neutral position, causing content displayed on the display to stop panning.
Type: Application
Filed: October 16, 2018
Publication date: February 14, 2019
Applicant: Tobii AB
Inventors: Simon Gustafsson, Henrik Björk, Fredrik Lindh, Anders Olsson
-
Patent number: 10203758
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Grant
Filed: August 6, 2013
Date of Patent: February 12, 2019
Assignee: Tobii AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Patent number: 10198070
Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
Type: Grant
Filed: September 18, 2017
Date of Patent: February 5, 2019
Assignee: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
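The two-phase estimation described above (memoryless first value from the burst, recursive later values) can be sketched with an average for the first point and exponential smoothing for subsequent points. The smoothing form and the `alpha` weight are assumptions for illustration; the patent only requires that later values take previous values into account.

```python
# Sketch: memoryless first gaze point from the high-frame-rate burst,
# then recursive updates blending each new sample with the previous value.
def first_gaze_point(burst):
    """Memoryless estimate: mean of the burst samples (x, y pairs)."""
    n = len(burst)
    return (sum(x for x, _ in burst) / n,
            sum(y for _, y in burst) / n)


def next_gaze_point(prev, sample, alpha=0.5):
    """Recursive estimate: exponential smoothing against the previous value."""
    return (alpha * sample[0] + (1 - alpha) * prev[0],
            alpha * sample[1] + (1 - alpha) * prev[1])
```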
-
Patent number: 10192109
Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Grant
Filed: April 18, 2016
Date of Patent: January 29, 2019
Assignee: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
-
Patent number: 10191542
Abstract: A visual display includes hidden reference illuminators adapted to emit invisible light for generating corneo-scleral reflections on an eye watching a screen surface of the display. The tracking of such reflections and the pupil center provides input to gaze tracking. A method for equipping an LCD with a reference illuminator is also provided. Also provided are a system and method for determining a gaze point of an eye watching a visual display that includes reference illuminators. The determination of the gaze point may be based on an ellipsoidal cornea model.
Type: Grant
Filed: April 10, 2017
Date of Patent: January 29, 2019
Assignee: Tobii AB
Inventors: Peter Blixt, Anders Kingback, Bengt Rehnstrom, John Elvesjö, Mattias Kuldkepp
-
Publication number: 20190026369
Abstract: The invention generally relates to computer implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies the gaze filter criteria for the respective gaze search parameter. A user can initiate query searches for information on the computer device, or on information hosts connectable to the computer device via networks, by using combinations of gaze search parameters of the gaze search index and non-gaze search parameters of a non-gaze search index.
Type: Application
Filed: August 1, 2018
Publication date: January 24, 2019
Applicant: Tobii AB
Inventors: Anders Olsson, Mårten Skogö
-
Publication number: 20190025932
Abstract: A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to control the first motion sensor to detect first motion information, control the second motion sensor to detect second motion information, and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
Type: Application
Filed: July 30, 2018
Publication date: January 24, 2019
Applicant: Tobii AB
Inventors: Mårten Skogö, John Elvesjö, Jan-Erik Lundkvist, Per Lundberg
-
Patent number: 10185394
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: December 11, 2017
Date of Patent: January 22, 2019
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong