Patents Assigned to TOBII AB
-
Publication number: 20210042520
Abstract: There is disclosed a computer-implemented eye tracking system, and a corresponding method and computer-readable storage medium, for detecting three-dimensional (3D) gaze by obtaining at least one head pose parameter using a head pose prediction algorithm, the head pose parameter(s) comprising one or more of a head position, pitch, yaw, or roll; and inputting the at least one head pose parameter, along with at least one image of a user's eye generated from a 2D image captured using an image sensor associated with the eye tracking system, into a neural network configured to generate 3D gaze information based on the at least one head pose parameter and the at least one eye image.
Type: Application
Filed: June 15, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventors: David Molin, Tommaso Martini, Maria Gordon, Alexander Davies, Oscar Danielsson
-
Publication number: 20210042015
Abstract: Disclosed is a method for interacting with a selectable object displayed by means of a displaying device, the method comprising the steps of: obtaining a gaze convergence distance and a gaze direction of a user, the gaze direction lying in a field of view defined by the displaying device; determining whether the gaze direction coincides with the selectable object; if so, detecting a change in the gaze convergence distance; and if the detected change in the gaze convergence distance exceeds a predetermined threshold value, interacting with the selectable object.
Type: Application
Filed: May 15, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventor: Andrew Ratcliff
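The interaction rule this abstract describes (gaze coincides with the object, then the convergence distance changes by more than a threshold) can be sketched as follows. All names, units, and the default threshold are illustrative assumptions, not Tobii's actual implementation.

```python
def gaze_hits_object(gaze_xy, obj_box):
    """obj_box = (x_min, y_min, x_max, y_max) in the display's field of view."""
    x, y = gaze_xy
    x0, y0, x1, y1 = obj_box
    return x0 <= x <= x1 and y0 <= y <= y1

def should_interact(gaze_xy, obj_box, prev_convergence_m, curr_convergence_m,
                    threshold_m=0.10):
    """Trigger interaction only when the gaze direction coincides with the
    selectable object AND the gaze convergence distance (in metres, assumed)
    has changed by more than the threshold."""
    if not gaze_hits_object(gaze_xy, obj_box):
        return False
    return abs(curr_convergence_m - prev_convergence_m) > threshold_m
```

The two-condition structure mirrors the claim: the direction check gates the convergence check, so accidental convergence shifts while looking elsewhere do not trigger an interaction.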
-
Publication number: 20210034152
Abstract: A method and a corresponding apparatus for mitigating motion sickness in a virtual reality (VR)/augmented reality (AR) system using a head-mounted display (HMD) are disclosed. The method comprises receiving data from a sensor indicating a current orientation of the HMD in real space, and superimposing a visual indication on a display of the HMD. The visual indication provides the user with visual information about the current orientation of the HMD in real space. Furthermore, methods and corresponding apparatuses are disclosed for calculating gaze convergence distance in an eye tracking system, and for gaze-based VR/AR menu expansion.
Type: Application
Filed: October 19, 2020
Publication date: February 4, 2021
Applicant: Tobii AB
Inventors: Andreas Klingström, Per Fogelström, Andrew Ratcliff
-
Patent number: 10895908
Abstract: According to the invention, techniques for refining a ballistic prediction are described. In an example, an eye tracking system may record images over time of content presented on a display. Saccade data may be received and used as a trigger to retrieve particular ones of the recorded images. The eye tracking system may compare the images to identify a change in the content. The location of this change may correspond to a sub-area of the display. The output of the ballistic prediction may include a landing point that represents an anticipated gaze point. This landing point may be adjusted such that the gaze point is predicted to fall within the sub-area when the change is significant.
Type: Grant
Filed: November 26, 2018
Date of Patent: January 19, 2021
Assignee: Tobii AB
Inventor: Daan Nijs
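A minimal sketch of the landing-point adjustment described above, modelling the changed sub-area as a rectangle and the "significance" of the change as a scalar; the function names, the clamping strategy, and the threshold are all assumptions for illustration.

```python
def adjust_landing_point(landing_xy, change_box, change_magnitude,
                         significance_threshold=0.2):
    """If the detected content change is significant, clamp the ballistic
    landing point into the changed sub-area of the display; otherwise keep
    the original prediction."""
    if change_magnitude < significance_threshold:
        return landing_xy
    x0, y0, x1, y1 = change_box
    x = min(max(landing_xy[0], x0), x1)
    y = min(max(landing_xy[1], y0), y1)
    return (x, y)
```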
-
Patent number: 10895909
Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device configured to determine a gaze point of a user on a display device. The system may also include a processing device configured to combine a quality map generated based on the gaze point of the user and another quality map generated based on factors independent of the gaze point of the user to generate a combined quality map. The processing device is further configured to cause a rendered image to be displayed on the display device, and the quality of each area of the rendered image displayed on the display device is determined based, at least in part, upon the combined quality map.
Type: Grant
Filed: December 3, 2018
Date of Patent: January 19, 2021
Assignee: Tobii AB
Inventors: Daan Nijs, Robin Thunström
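The map-combination step could be as simple as a weighted per-cell blend of the gaze-dependent map and the gaze-independent map. This is a hypothetical sketch; the patent does not specify the combination operator, and the blend weight and clamping here are assumptions.

```python
def combine_quality_maps(gaze_map, static_map, weight=0.5):
    """Blend a gaze-dependent quality map with a gaze-independent one,
    cell by cell, clamping each result to at most 1.0 (full quality).
    Both maps are equally sized 2D lists of floats in [0, 1]."""
    return [[min(1.0, weight * g + (1.0 - weight) * s)
             for g, s in zip(g_row, s_row)]
            for g_row, s_row in zip(gaze_map, static_map)]
```

A renderer would then pick shading rate or resolution per screen area from the combined map.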
-
Publication number: 20210011682
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Application
Filed: April 28, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
-
Publication number: 20210012157
Abstract: A method for training an eye tracking model is disclosed, as well as a corresponding system and storage medium. The eye tracking model is adapted to predict eye tracking data based on sensor data from a first eye tracking sensor. The method comprises receiving sensor data obtained by the first eye tracking sensor at a time instance and receiving reference eye tracking data for the time instance generated by an eye tracking system comprising a second eye tracking sensor. The reference eye tracking data is generated by the eye tracking system based on sensor data obtained by the second eye tracking sensor at the time instance. The method comprises training the eye tracking model based on the sensor data obtained by the first eye tracking sensor at the time instance and the generated reference eye tracking data.
Type: Application
Filed: March 30, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Carl Asplund, Patrik Barkman, Anders Dahl, Oscar Danielsson, Tommaso Martini, Mårten Nilsson
-
Publication number: 20210011548
Abstract: A gaze tracking system, upon leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, for the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at a lower rate. The first gaze point value is computed memorylessly based on the initial burst of pictures and no additional imagery, while subsequent values may be computed recursively, taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using a different sensor. From the gaze point values, the system may derive a control signal to a computer device with a visual display.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
-
Publication number: 20210012559
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Fredrik Lindh
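The confidence-accumulation step above reduces to: tally a score per component over all trace hits, then take the maximum. A minimal sketch, assuming each line trace has already been resolved to a `(component_id, weight)` pair (e.g. weighted by how close the trace is to the central gaze direction); that representation is an assumption, not Tobii's data model.

```python
def pick_focus_target(trace_hits):
    """trace_hits: list of (component_id, weight) pairs, one per line trace
    that intersected a component. The confidence value of a component is the
    sum of its weights; the focus target is the component with the highest
    confidence, or None if no trace hit anything."""
    confidence = {}
    for component, weight in trace_hits:
        confidence[component] = confidence.get(component, 0.0) + weight
    if not confidence:
        return None
    return max(confidence, key=confidence.get)
```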
-
Publication number: 20210012105
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for performing three-dimensional (3D) position estimation for the cornea center of an eye of a user, using a remote eye tracking system, wherein the position estimation is reliable and robust even when the cornea center moves over time in relation to an imaging device associated with the eye tracking system. This is accomplished by generating, using, and optionally also updating a cornea movement filter (CMF) in the cornea center position estimation.
Type: Application
Filed: June 29, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: David Masko, Magnus Ivarsson, Niklas Ollesson, Anna Redz
-
Publication number: 20210014442
Abstract: Techniques for reducing the read-out time and power consumption of an image sensor used for eye tracking are described. In an example, the position of an eye element in the active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye element is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
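Defining an ROI around a detected position is essentially a centre-and-clamp computation, sketched below. The ROI dimensions and coordinate conventions are illustrative assumptions; a real sensor would additionally align the ROI to its read-out granularity.

```python
def roi_around(eye_xy, sensor_w, sensor_h, roi_w=128, roi_h=96):
    """Return (x, y, w, h) of a read-out region of interest centred on the
    detected eye element, shifted as needed so it stays inside the active
    sensor area of sensor_w x sensor_h pixels."""
    x = min(max(eye_xy[0] - roi_w // 2, 0), sensor_w - roi_w)
    y = min(max(eye_xy[1] - roi_h // 2, 0), sensor_h - roi_h)
    return (x, y, roi_w, roi_h)
```

Reading out only these `roi_w * roi_h` pixels instead of the full frame is what yields the reduced read-out time and power consumption the abstract refers to.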
-
Publication number: 20210011550
Abstract: The disclosure relates to a method performed by a computer for identifying a space that a user of a gaze tracking system is viewing, the method comprising: obtaining gaze tracking sensor data; generating gaze data comprising a probability distribution by processing the sensor data with a trained model; and identifying a space that the user is viewing using the probability distribution.
Type: Application
Filed: June 15, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Patrik Barkman, Anders Dahl, Oscar Danielsson, Tommaso Martini, Mårten Nilsson
-
Publication number: 20210011549
Abstract: A method of updating a cornea model for a cornea of an eye is disclosed, as well as a corresponding system and storage medium. The method comprises: controlling a display to display a stimulus at a first depth, wherein the display is capable of displaying objects at different depths; receiving first sensor data obtained by an eye tracking sensor while the stimulus is displayed at the first depth by the display; controlling the display to display a stimulus at a second depth, wherein the second depth is different than the first depth; receiving second sensor data obtained by the eye tracking sensor while the stimulus is displayed at the second depth by the display; and updating the cornea model based on the first sensor data and the second sensor data.
Type: Application
Filed: March 30, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Mark Ryan, Jonas Sjöstrand, Erik Lindén, Pravin Rana
-
Publication number: 20210014443
Abstract: A computer-implemented method for controlling read-out from a digital image sensor device comprising a plurality of pixels. The method comprises the steps of: setting a first read-out scheme based on a first level of pixel binning and/or pixel skipping; reading a first image from the digital image sensor device based on the first read-out scheme; determining an exposure value for the first image based on the intensity value of each one of a first plurality of regions of the first image; and comparing the exposure value with a predetermined maximum value. A second read-out scheme based on a second level of pixel binning and/or pixel skipping is then set. The level of pixel binning and/or pixel skipping in the second read-out scheme is increased compared to the first read-out scheme if the exposure value is higher than the predetermined maximum value. Based on the second read-out scheme, a subsequent second image is read. A system configured to perform the method is also described.
Type: Application
Filed: June 19, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Niklas Ollesson, Magnus Ivarsson, Viktor Åberg, Anna Redz
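The control loop above amounts to: measure exposure, compare against a maximum, and raise the binning/skipping level for the next read-out if the limit is exceeded. A hedged sketch, with an assumed upper bound on the level and illustrative values throughout:

```python
def next_binning_level(current_level, exposure_value, max_exposure,
                       max_level=3):
    """Choose the binning/skipping level of the second read-out scheme:
    increase it (up to max_level, an assumed sensor limit) when the exposure
    value of the first image exceeds the predetermined maximum, otherwise
    keep the current level."""
    if exposure_value > max_exposure:
        return min(current_level + 1, max_level)
    return current_level
```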
-
Publication number: 20210011551
Abstract: An eyetracker obtains a digital image representing at least one eye of a subject. The eyetracker then searches for pupil candidates in the digital image according to a search algorithm and, based on the searching, determines a position for the at least one eye in the digital image. The eyetracker also obtains light-intensity information expressing an estimated amount of light energy exposing the at least one eye when the digital image was registered. In response to the light-intensity information, the eyetracker determines a range of pupil sizes. The search algorithm applies the range of pupil sizes in such a manner that a detected pupil candidate must have a size within the range of pupil sizes to be accepted by the search algorithm as a valid pupil of the subject.
Type: Application
Filed: June 26, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Richard Andersson
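The light-driven size gate can be sketched as follows. The pupillary response model (diameter shrinking linearly from 8 mm in darkness to 2 mm in bright light, with a fixed tolerance) is an illustrative assumption only; the publication does not specify the mapping.

```python
def pupil_size_range(light_energy, tol_mm=1.5):
    """Map estimated light energy in [0, 1] to an accepted pupil diameter
    range in mm. Brighter light constricts the pupil, so the expected
    diameter (and hence the accepted range) shifts downward."""
    expected = 8.0 - 6.0 * light_energy   # assumed linear model
    return (max(2.0, expected - tol_mm), min(8.0, expected + tol_mm))

def valid_pupils(candidate_diameters_mm, light_energy):
    """Keep only pupil candidates whose size falls inside the range."""
    lo, hi = pupil_size_range(light_energy)
    return [d for d in candidate_diameters_mm if lo <= d <= hi]
```

The effect is that a bright-light frame rejects large "pupil" detections (often dark artefacts), and vice versa.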
-
Publication number: 20210012161
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images generated by cameras and showing the eyes of users while gazing at stimulus points. Some of the stimulus points are in the planes of the cameras; the remaining stimulus points are not in the planes of the cameras. The training includes inputting a first training image associated with a stimulus point in a camera plane and inputting a second training image associated with a stimulus point outside the camera plane. The training minimizes a loss function of the neural network based on a distance between at least one of the stimulus points and a gaze line.
Type: Application
Filed: June 2, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Erik Lindén
-
Publication number: 20210004623
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling an eye tracking system to optimize eye tracking performance under different lighting conditions, by: obtaining a first image captured using a camera associated with the eye tracking system, the first image comprising at least part of an iris and at least part of a pupil of an eye illuminated by an illuminator associated with the eye tracking system at a current power of illumination selected from a set of predetermined power levels; determining a contrast value between the iris and the pupil in the image; and, if the contrast value deviates less than a preset deviation threshold value from a preset minimum contrast value, setting the current power of illumination of the illuminator to the other predetermined power level in the set of predetermined power levels.
Type: Application
Filed: June 15, 2020
Publication date: January 7, 2021
Applicant: Tobii AB
Inventors: Viktor Åberg, Anna Redz, Niklas Ollesson, Dineshkumar Muthusamy, Magnus Ivarsson
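The control rule above (switch power level when the iris/pupil contrast gets too close to the acceptable minimum) can be sketched as below. The two-level set, the numeric thresholds, and the contrast scale are all assumptions for illustration.

```python
def select_power_level(contrast, current_level, power_levels,
                       min_contrast=0.3, deviation=0.05):
    """If the measured iris/pupil contrast deviates by less than `deviation`
    from the preset minimum acceptable contrast, switch the illuminator to
    the other predetermined power level in the set; otherwise keep the
    current level."""
    if abs(contrast - min_contrast) < deviation:
        others = [p for p in power_levels if p != current_level]
        return others[0] if others else current_level
    return current_level
```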
-
Patent number: 10885882
Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
Type: Grant
Filed: December 6, 2018
Date of Patent: January 5, 2021
Assignee: Tobii AB
Inventors: Daan Pieter Nijs, Fredrik Lindh
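Per-pixel, the scheme above is a conditional pull of the high-resolution pixel toward its low-resolution sample. The halving factor below is an illustrative choice; the patent only requires that the updated difference be smaller than the original.

```python
def limit_pixel(high_px, low_sample, threshold):
    """If the high-res pixel differs from its corresponding low-res sample
    by at least the threshold, move it toward the sample (here: halve the
    difference) so the updated difference is smaller than the original;
    otherwise leave the pixel unchanged."""
    diff = high_px - low_sample
    if abs(diff) >= threshold:
        return low_sample + diff * 0.5
    return high_px
```

Pulling outlier pixels toward the low-resolution signal suppresses the shimmering that otherwise appears at the boundary between foveated quality regions.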
-
Patent number: 10884491
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Grant
Filed: May 1, 2019
Date of Patent: January 5, 2021
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20200394400
Abstract: An eye tracking device for tracking an eye is described. The eye tracking device comprises: a first diffractive optical element (DOE) arranged in front of the eye; and an image module configured to capture an image of the eye via the first DOE. The first DOE is adapted to direct a first portion of incident light reflected from the eye towards the image module. The eye tracking device is characterized in that the first DOE is configured to provide a lens effect.
Type: Application
Filed: March 30, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventor: Daniel Tornéus