Patents Assigned to TOBII AB
-
Publication number: 20210012157
Abstract: A method for training an eye tracking model is disclosed, as well as a corresponding system and storage medium. The eye tracking model is adapted to predict eye tracking data based on sensor data from a first eye tracking sensor. The method comprises receiving sensor data obtained by the first eye tracking sensor at a time instance and receiving reference eye tracking data for the time instance generated by an eye tracking system comprising a second eye tracking sensor. The reference eye tracking data is generated by the eye tracking system based on sensor data obtained by the second eye tracking sensor at the time instance. The method comprises training the eye tracking model based on the sensor data obtained by the first eye tracking sensor at the time instance and the generated reference eye tracking data.
Type: Application
Filed: March 30, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Carl Asplund, Patrik Barkman, Anders Dahl, Oscar Danielsson, Tommaso Martini, Mårten Nilsson
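As a rough illustration of the training idea in this abstract, the sketch below pairs features from a simulated first sensor with reference gaze labels from a simulated second, already-calibrated system at matching time instances and fits a simple model to them. The linear model, the feature extraction, and all names are assumptions for illustration only, not Tobii's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def reference_gaze(t):
    """Stand-in for the reference system built around the second sensor."""
    return np.array([np.sin(t), np.cos(t)])                  # hypothetical gaze point

def first_sensor_features(t):
    """Stand-in for features extracted from the first sensor at time t."""
    g = reference_gaze(t)
    return np.concatenate([g, [1.0]]) + rng.normal(0.0, 0.01, 3)  # noisy features

# Collect (sensor data, reference gaze) pairs at matching time instances.
times = np.linspace(0.0, 10.0, 200)
X = np.stack([first_sensor_features(t) for t in times])      # shape (200, 3)
Y = np.stack([reference_gaze(t) for t in times])             # shape (200, 2)

# "Train" a simple linear eye tracking model W against the reference labels:
# predicted gaze = features @ W, fitted by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("mean abs error vs reference:", np.abs(X @ W - Y).mean())
```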
-
Publication number: 20210011548
Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, for the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at a lower rate. The first gaze point value is computed memorylessly based on the initial burst results and no additional imagery, while subsequent values may be computed recursively to take into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using a different sensor. From the gaze point values, the system may derive a control signal to a computer device with a visual display.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
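The contrast between a memoryless first estimate and recursive later estimates can be illustrated in a few lines of Python; the averaging scheme, the smoothing factor, and the sample values below are assumptions, not the patented method.

```python
def estimate_from_burst(burst):
    """Memoryless first estimate: uses only the burst, no earlier state."""
    return sum(burst) / len(burst)

def recursive_update(prev_estimate, new_measurement, alpha=0.3):
    """Recursive later estimates: blend the previous value with the new picture."""
    return (1 - alpha) * prev_estimate + alpha * new_measurement

burst = [101.0, 99.5, 100.5, 100.0]          # hypothetical gaze x-coordinates (pixels)
gaze = estimate_from_burst(burst)            # first gaze point value
for measurement in [102.0, 103.5, 104.0]:    # lower-rate follow-up pictures
    gaze = recursive_update(gaze, measurement)
print(round(gaze, 2))
```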
-
Publication number: 20210014443
Abstract: A computer implemented method for controlling read-out from a digital image sensor device comprising a plurality of pixels, the method comprising the steps of: setting a first read-out scheme based on a first level of pixel binning and/or pixel skipping; reading, based on the first read-out scheme, a first image from the digital image sensor device; determining an exposure value for the first image based on the intensity value of each one of a first plurality of regions of the first image; and comparing the exposure value with a predetermined maximum value. A second read-out scheme based on a second level of pixel binning and/or pixel skipping is set. The level of pixel binning and/or pixel skipping in the second read-out scheme is increased compared to the first read-out scheme if the exposure value is higher than the predetermined maximum value. Based on the second read-out scheme, a subsequent second image is read. A system configured to perform the method is also described.
Type: Application
Filed: June 19, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Niklas Ollesson, Magnus Ivarsson, Viktor Åberg, Anna Redz
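A minimal sketch of the exposure-driven switch between read-out schemes, assuming an exposure value computed as the mean of per-region mean intensities over a fixed grid; the grid, the ceiling value, and the notion of a numeric "binning level" are illustrative choices rather than anything stated in the application.

```python
import numpy as np

def exposure_value(image, grid=(4, 4)):
    """Mean of per-region mean intensities over a grid of regions (assumed definition)."""
    h, w = image.shape
    regions = [image[i * h // grid[0]:(i + 1) * h // grid[0],
                     j * w // grid[1]:(j + 1) * w // grid[1]]
               for i in range(grid[0]) for j in range(grid[1])]
    return float(np.mean([r.mean() for r in regions]))

def next_binning_level(image, current_level, max_exposure=180.0):
    """Increase pixel binning/skipping for the next read-out if over the ceiling."""
    return current_level + 1 if exposure_value(image) > max_exposure else current_level

frame = np.full((480, 640), 200, dtype=np.uint8)   # hypothetical over-exposed first image
print(next_binning_level(frame, current_level=1))  # -> 2: read the second image binned harder
```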
-
Publication number: 20210011549
Abstract: A method of updating a cornea model for a cornea of an eye is disclosed, as well as a corresponding system and storage medium. The method comprises controlling a display to display a stimulus at a first depth, wherein the display is capable of displaying objects at different depths, receiving first sensor data obtained by an eye tracking sensor while the stimulus is displayed at the first depth by the display, controlling the display to display a stimulus at a second depth, wherein the second depth is different than the first depth, receiving second sensor data obtained by the eye tracking sensor while the stimulus is displayed at the second depth by the display, and updating the cornea model based on the first sensor data and the second sensor data.
Type: Application
Filed: March 30, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Mark Ryan, Jonas Sjöstrand, Erik Lindén, Pravin Rana
-
Publication number: 20210012105
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for performing three-dimensional, 3D, position estimation for the cornea center of an eye of a user, using a remote eye tracking system, wherein the position estimation is reliable and robust also when the cornea center moves over time in relation to an imaging device associated with the eye tracking system. This is accomplished by generating, using, and optionally also updating, a cornea movement filter, CMF, in the cornea center position estimation.
Type: Application
Filed: June 29, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: David Masko, Magnus Ivarsson, Niklas Ollesson, Anna Redz
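The application does not spell out the exact form of the cornea movement filter here, so the sketch below uses a per-axis random-walk Kalman filter only as one plausible way to filter the cornea center position over time; the state, noise values, and measurements are invented for illustration.

```python
import numpy as np

def cmf_step(state, cov, measurement, process_var=0.05, meas_var=0.5):
    """One predict/update step of a per-axis random-walk Kalman filter."""
    cov = cov + process_var                        # predict: the cornea may have moved
    gain = cov / (cov + meas_var)                  # how much to trust the new image
    state = state + gain * (measurement - state)   # update the cornea center estimate
    cov = (1.0 - gain) * cov
    return state, cov

state = np.array([0.0, 0.0, 600.0])   # hypothetical cornea center (mm, camera frame)
cov = 1.0
for z in ([0.2, -0.1, 600.5], [0.4, 0.0, 601.0], [0.3, 0.1, 600.8]):
    state, cov = cmf_step(state, cov, np.array(z))
print(state.round(2), round(cov, 3))
```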
-
Publication number: 20210004623
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling an eye tracking system to optimize eye tracking performance under different lighting conditions, by obtaining a first image captured using a camera associated with the eye tracking system, the first image comprising at least part of an iris and at least part of a pupil of an eye illuminated by an illuminator associated with the eye tracking system at a current power of illumination selected from a set of predetermined power levels; determining a contrast value between the iris and the pupil in the image; and, if the contrast value deviates less than a preset deviation threshold value from a preset minimum contrast value, setting the current power of illumination of the illuminator to the other predetermined power level in the set of predetermined power levels.
Type: Application
Filed: June 15, 2020
Publication date: January 7, 2021
Applicant: Tobii AB
Inventors: Viktor Åberg, Anna Redz, Niklas Ollesson, Dineshkumar Muthusamy, Magnus Ivarsson
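A compact sketch of the decision rule described above, assuming the iris and pupil regions have already been segmented; the contrast definition, the thresholds, and the two power levels are made-up example values.

```python
import numpy as np

POWER_LEVELS = (0.5, 1.0)   # hypothetical set of predetermined power levels

def iris_pupil_contrast(iris_pixels, pupil_pixels):
    """Contrast as the difference of mean intensities (one simple definition)."""
    return float(np.mean(iris_pixels) - np.mean(pupil_pixels))

def select_power(current_power, contrast, min_contrast=40.0, deviation=5.0):
    """Switch to the other power level if the contrast is too close to the minimum."""
    if abs(contrast - min_contrast) < deviation:
        return POWER_LEVELS[1] if current_power == POWER_LEVELS[0] else POWER_LEVELS[0]
    return current_power

iris = np.array([120, 118, 125], dtype=float)    # hypothetical iris intensities
pupil = np.array([82, 80, 79], dtype=float)      # hypothetical pupil intensities
print(select_power(0.5, iris_pupil_contrast(iris, pupil)))   # contrast ≈ 40.7 -> switch to 1.0
```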
-
Patent number: 10884491
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Grant
Filed: May 1, 2019
Date of Patent: January 5, 2021
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10885882
Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
Type: Grant
Filed: December 6, 2018
Date of Patent: January 5, 2021
Assignee: Tobii AB
Inventors: Daan Pieter Nijs, Fredrik Lindh
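The clamping step can be illustrated directly: if a full-resolution pixel differs from its corresponding low-resolution sample by at least a threshold, it is pulled toward the sample so that the updated difference is smaller than the original. The threshold and blend factor below are assumptions.

```python
def limit_pixel(high_pixel, low_sample, threshold=0.2, blend=0.5):
    """Clamp a high-resolution pixel toward its low-resolution sample when they diverge."""
    diff = high_pixel - low_sample
    if abs(diff) >= threshold:
        return low_sample + diff * blend   # updated difference is half the original
    return high_pixel

print(round(limit_pixel(0.95, 0.60), 3))   # 0.775: difference reduced from 0.35 to 0.175
print(round(limit_pixel(0.65, 0.60), 3))   # 0.65: below threshold, left unchanged
```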
-
Publication number: 20200394766
Abstract: There are provided systems, methods, and computer program products for generating motion blur on image frames, comprising: obtaining gaze data related to an eye movement between consecutive images; determining movement of at least one object in relation to said gaze data by calculating the difference in position of said at least one object and said gaze data between the image frames; forming a motion blur vector; and applying motion blur to an image frame based on said motion blur vector.
Type: Application
Filed: March 30, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventor: Denny Rönngren
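One way to read the abstract is that the blur vector is the motion of the object relative to the motion of the gaze between consecutive frames; the sketch below computes exactly that under the assumption of 2D screen coordinates, and leaves the actual blurring step out.

```python
import numpy as np

def motion_blur_vector(obj_prev, obj_curr, gaze_prev, gaze_curr):
    """Object movement minus gaze movement between two consecutive frames."""
    return (np.asarray(obj_curr) - np.asarray(obj_prev)) - \
           (np.asarray(gaze_curr) - np.asarray(gaze_prev))

obj_prev, obj_curr = (100.0, 200.0), (130.0, 200.0)     # object moved 30 px to the right
gaze_prev, gaze_curr = (100.0, 200.0), (125.0, 200.0)   # gaze followed 25 px of that motion
print(motion_blur_vector(obj_prev, obj_curr, gaze_prev, gaze_curr))   # [5. 0.]: residual blur
```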
-
Publication number: 20200393686
Abstract: The present invention relates to a lens for eye-tracking applications. The lens comprises a first protective layer, arranged to face towards the eye to be tracked when the lens is used for eye-tracking. It also comprises at least one light source, at least partly arranged in the first protective layer, arranged to emit a first light from the first protective layer in a direction towards the eye. Moreover, it comprises at least one image capturing device, at least partly arranged in the first protective layer, arranged to receive the first light within the first protective layer. The lens further comprises an absorptive layer, arranged on the far side of the first protective layer seen from the eye to be tracked when the lens is used for eye-tracking, adapted to be absorptive for wavelengths of the majority of the first light.
Type: Application
Filed: January 31, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventors: Axel Tollin, Daniel Ljunggren
-
Publication number: 20200393897
Abstract: A lens for eye tracking applications is described. The lens comprises a first protective layer with a first surface, arranged to face towards the eye to be tracked when the lens is used for eye tracking. The lens is characterized in that the lens further comprises a supporting layer and a second protective layer with a second surface, arranged to face away from the eye to be tracked when the lens is used for eye tracking. The supporting layer is arranged between the first protective layer and the second protective layer, and the supporting layer comprises at least a first opening between the first protective layer and the second protective layer. At least one electrical component is arranged extending through the first opening.
Type: Application
Filed: January 31, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventors: Daniel Ljunggren, Anders Kingbäck, Axel Tollin, Jan Skagerlund
-
Publication number: 20200393679
Abstract: The present disclosure relates to a method for displaying an image with a specific depth of field. The method comprises the steps of obtaining information data related to a focal distance adapted to a user gazing at a display, determining a pupil size of said user, estimating a depth of field of said user's eyes based on said focal distance and said pupil size, and rendering an image based on said depth of field to be displayed on said display. Further, the present disclosure relates to a system, a head-mounted display and a non-transitory computer readable medium.
Type: Application
Filed: March 30, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventor: Denny Rönngren
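As a stand-in for the estimation step, the sketch below uses the standard thin-lens depth-of-field approximation with the pupil acting as the aperture; the eye's focal length, the circle of confusion, and this choice of formula are assumptions rather than anything stated in the disclosure.

```python
def depth_of_field(focal_distance_m, pupil_diameter_mm,
                   eye_focal_length_mm=17.0, confusion_mm=0.01):
    """Near/far limits of acceptable sharpness under a thin-lens approximation."""
    f = eye_focal_length_mm / 1000.0
    c = confusion_mm / 1000.0
    n = eye_focal_length_mm / pupil_diameter_mm            # pupil as the aperture (f-number)
    s = focal_distance_m
    hyperfocal = f * f / (n * c) + f
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = s * (hyperfocal - f) / (hyperfocal - s) if s < hyperfocal else float("inf")
    return near, far

print(depth_of_field(focal_distance_m=0.5, pupil_diameter_mm=3.0))   # roughly (0.46, 0.55) m
print(depth_of_field(focal_distance_m=0.5, pupil_diameter_mm=6.0))   # narrower with a larger pupil
```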
-
Publication number: 20200394400
Abstract: An eye tracking device for tracking an eye is described. The eye tracking device comprises a first diffractive optical element, DOE, arranged in front of the eye, and an image module, wherein the image module is configured to capture an image of the eye via the first DOE. The first DOE is adapted to direct a first portion of incident light reflected from the eye towards the image module. The eye tracking device is characterized in that the first DOE is configured to provide a lens effect.
Type: Application
Filed: March 30, 2020
Publication date: December 17, 2020
Applicant: Tobii AB
Inventor: Daniel Tornéus
-
Patent number: 10867252
Abstract: A method for forming an offset model is described. The offset model represents an estimated offset between a limbus center of a user eye and a pupil center of the user eye as a function of pupil size. The approach includes sampling a set of limbus center values, sampling a set of pupil center values, and sampling a set of radius values. The offset model is formed by comparing a difference between the set of limbus center values and the set of pupil center values at each of the radius values. A system and a computer-readable storage device configured to perform such a method are also disclosed.
Type: Grant
Filed: December 21, 2018
Date of Patent: December 15, 2020
Assignee: Tobii AB
Inventor: Erik Lindén
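One simple way to realize such an offset model is to fit the limbus-center-to-pupil-center offset as a linear function of pupil radius from the sampled values; the linear form and the data below are assumptions, since the abstract only says the offset is modelled as a function of pupil size.

```python
import numpy as np

radii = np.array([1.5, 2.0, 2.5, 3.0, 3.5])                   # sampled pupil radii (mm)
limbus_centers = np.array([[0.00, 0.00]] * 5)                 # sampled limbus center values
pupil_centers = np.array([[0.10, 0.02], [0.14, 0.03], [0.18, 0.03],
                          [0.22, 0.05], [0.26, 0.05]])        # sampled pupil center values

offsets = limbus_centers - pupil_centers                      # observed offset at each radius
A = np.column_stack([np.ones_like(radii), radii])             # design matrix for offset(r) = a + b*r
coeffs, *_ = np.linalg.lstsq(A, offsets, rcond=None)

def offset_model(radius):
    """Estimated limbus-to-pupil-center offset for a given pupil radius."""
    return coeffs[0] + coeffs[1] * radius

print(offset_model(2.75))   # estimated offset at an unseen pupil radius
```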
-
Publication number: 20200386990
Abstract: The present invention relates to a lens for eye-tracking applications. The lens comprises a first protective layer, arranged to face towards the eye to be tracked when the lens is used for eye-tracking, and the lens further comprises at least one light source. The light source is arranged to emit a first light, and a refractive element is arranged in the light path of the at least one light source.
Type: Application
Filed: January 29, 2020
Publication date: December 10, 2020
Applicant: Tobii AB
Inventors: Daniel Ljunggren, Anders Kingbäck
-
Publication number: 20200387220
Abstract: The present disclosure generally relates to interaction between a user and an apparatus, sometimes referred to as user-apparatus interaction or human-computer interaction. More specifically, the present disclosure generally relates to combined gaze-based and scanning-based control of an apparatus, such as a computer, a tablet computer, or a desktop computer. In more detail, the present disclosure presents methods, apparatuses, computer programs and carriers, which combine gaze-based control with scanning-based control for controlling the apparatus.
Type: Application
Filed: February 18, 2020
Publication date: December 10, 2020
Applicant: Tobii AB
Inventor: Jaén Cantor
-
Publication number: 20200387218
Abstract: The embodiments herein relate to a method and a Head-Mounted Device (HMD) for adaptively adjusting a Head-Up Display (HUD), wherein the HUD includes a User Interface (UI) or HUD graphics, the HMD comprising at least one eye tracker, a processor and a memory containing instructions executable by the processor, wherein the HMD is operative to: determine a fixation distance, being a distance to a fixation point a user of said HUD is fixating on; and dynamically adjust said HUD by adjusting the position of the HUD UI, in front of each eye of the user, such that the HUD UI appears to be positioned at the fixation distance.
Type: Application
Filed: January 30, 2020
Publication date: December 10, 2020
Applicant: Tobii AB
Inventor: Geoffrey Cooper
-
Publication number: 20200387757
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images. During the training, calibration parameters are initialized and input to the neural network, and are updated through the training. Accordingly, the network parameters of the neural network are updated based in part on the calibration parameters. Upon completion of the training, the neural network is calibrated for a user. This calibration includes initializing and inputting the calibration parameters along with calibration images showing an eye of the user to the neural network. The calibration includes updating the calibration parameters without changing the network parameters by minimizing the loss function of the neural network based on the calibration images. Upon completion of the calibration, the neural network is used to generate 3D gaze information for the user.
Type: Application
Filed: January 14, 2020
Publication date: December 10, 2020
Applicant: Tobii AB
Inventor: Erik Linden
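The calibration step can be illustrated with a toy model: the already-trained network parameters stay frozen while per-user calibration parameters are fitted by minimizing the loss on a handful of calibration samples. The tiny linear "network" and the data below are placeholders for the deep network described in the publication.

```python
import numpy as np

W = np.array([[1.0, 0.0], [0.0, 1.0]])       # frozen "network" parameters (toy stand-in)

def predict(features, calib):
    """Gaze prediction = fixed network output + per-user calibration parameters."""
    return features @ W + calib

features = np.array([[0.1, 0.2], [0.4, 0.1], [0.3, 0.3]])        # calibration images (as features)
targets = np.array([[0.15, 0.27], [0.45, 0.17], [0.35, 0.37]])   # known gaze points shown on screen

calib = np.zeros(2)                          # initialized calibration parameters
for _ in range(200):                         # minimize the loss w.r.t. the calibration only
    grad = 2 * (predict(features, calib) - targets).mean(axis=0)
    calib -= 0.1 * grad                      # note: W is never updated here

print(calib.round(3))                        # -> roughly [0.05, 0.07]
```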
-
Publication number: 20200387221
Abstract: A head mountable arrangement assists a subject to acquire spatial information about a surrounding environment by receiving gaze data from an eyetracker and spatial data from a distance sensor respectively. The gaze data describe an estimated point of regard of the subject, and the spatial data describe a distance between a reference point and an object in the surrounding environment. A feedback signal is provided, which indicates a distance from the subject to said object in the surrounding environment. The feedback signal is generated based on the estimated point of regard and the spatial data, and may reflect the distance to a surface element of an object that intersects a straight line between an eye-base line of the subject and the estimated point of regard, which surface element is located closer to the eye-base line than any other surface element of the objects in the surrounding environment along said straight line.
Type: Application
Filed: February 19, 2020
Publication date: December 10, 2020
Applicant: Tobii AB
Inventors: Andrew Ratcliff, Daniel Tornéus, Eli Lundberg, Henrik Andersson
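A rough sketch of the geometry, assuming the environment is a small set of 3D surface points from the distance sensor, a "hit" is any point within a small lateral tolerance of the gaze ray, and the feedback signal is simply the distance to the closest hit; all of these representation choices are illustrative, not taken from the application.

```python
import numpy as np

def distance_to_gazed_object(eye_base, point_of_regard, surface_points, tolerance=0.05):
    """Distance from the eye-base line to the closest surface point near the gaze ray."""
    direction = point_of_regard - eye_base
    direction = direction / np.linalg.norm(direction)
    rel = surface_points - eye_base
    along = rel @ direction                                   # distance along the gaze ray
    lateral = np.linalg.norm(rel - np.outer(along, direction), axis=1)
    hits = along[(lateral < tolerance) & (along > 0)]
    return float(hits.min()) if hits.size else None           # closest intersected element

eye = np.array([0.0, 0.0, 0.0])                               # metres, head-fixed frame
regard = np.array([0.0, 0.0, 2.0])                            # estimated point of regard
points = np.array([[0.0, 0.01, 1.2], [0.0, 0.0, 1.8], [0.5, 0.5, 1.0]])
print(distance_to_gazed_object(eye, regard, points))          # 1.2: the nearer object wins
```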
-
Patent number: 10852531
Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining, upon the third sum exceeding a fourth sum equal to the first sum plus a threshold amount, that the eye of the user is closed, where the threshold amount is equal to a product of a threshold fraction and a difference between the first sum and the second sum.
Type: Grant
Filed: June 24, 2019
Date of Patent: December 1, 2020
Assignee: Tobii AB
Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
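The decision rule lends itself to a direct transcription; the region-of-interest pixels, reference sums, and threshold fraction below are made-up example values.

```python
import numpy as np

def eye_is_closed(pixels, open_sum, closed_sum, fraction=0.5):
    """Closed if the current sum exceeds the open-eye sum plus the scaled open/closed gap."""
    third_sum = float(np.sum(pixels))
    threshold = fraction * (open_sum - closed_sum)
    return third_sum > open_sum + threshold

open_sum, closed_sum = 4000.0, 6000.0          # hypothetical reference sums (closed frames brighter)
frame = np.full((8, 8), 90, dtype=np.uint16)   # current frame's region of interest, sum = 5760
print(eye_is_closed(frame, open_sum, closed_sum))   # 5760 > 4000 + 0.5*(4000-6000) = 3000 -> True
```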