Patents Assigned to TOBII AB
  • Publication number: 20200393897
    Abstract: A lens for eye tracking applications is described. The lens comprises a first protective layer with a first surface, arranged to face towards the eye to be tracked when the lens is used for eye tracking. The lens is characterized in that the lens further comprises a supporting layer and a second protective layer with a second surface, arranged to face away from the eye to be tracked when the lens is used for eye tracking. The supporting layer is arranged between the first protective layer and the second protective layer, and the supporting layer comprises at least a first opening between the first protective layer and the second protective layer. At least one electrical component is arranged extending through the first opening.
    Type: Application
    Filed: January 31, 2020
    Publication date: December 17, 2020
    Applicant: Tobii AB
    Inventors: Daniel Ljunggren, Anders Kingbäck, Axel Tollin, Jan Skagerlund
  • Publication number: 20200393679
    Abstract: The present disclosure relates to a method for displaying an image with a specific depth of field. The method comprises the steps of obtaining information data related to a focal distance adapted to a user gazing at a display, determining a pupil size of said user, estimating a depth of field of said user's eyes based on said focal distance and said pupil size, and rendering an image based on said depth of field to be displayed on said display. Further, the present disclosure relates to a system, a head-mounted display and a non-transitory computer readable medium.
    Type: Application
    Filed: March 30, 2020
    Publication date: December 17, 2020
    Applicant: Tobii AB
    Inventor: Denny Rönngren
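For context, a minimal numerical sketch of the kind of depth-of-field estimate described in publication 20200393679: it combines the gaze-derived focal distance with the measured pupil size using a thin-lens approximation, then maps an object's distance to a blur radius. The eye focal length, circle-of-confusion size, blur mapping and function names are illustrative assumptions, not details from the application.

```python
import numpy as np

def depth_of_field(focal_distance_m, pupil_diameter_m,
                   eye_focal_length_m=0.017, coc_m=5e-6):
    """Estimate near/far limits of the depth of field with a thin-lens
    approximation: a larger pupil gives a shallower depth of field.
    focal_distance_m  : distance the user is accommodating to (from gaze data)
    pupil_diameter_m  : measured pupil size
    eye_focal_length_m, coc_m : assumed constants of a reduced eye model"""
    f = eye_focal_length_m
    n = f / pupil_diameter_m                          # effective f-number of the eye
    hyperfocal = f * f / (n * coc_m) + f
    s = focal_distance_m
    near = (hyperfocal * s) / (hyperfocal + (s - f))
    far = (hyperfocal * s) / (hyperfocal - (s - f)) if hyperfocal > (s - f) else np.inf
    return near, far

def blur_radius_px(object_distance_m, near_m, far_m, max_blur_px=8.0):
    """Map an object's distance to a blur radius: zero inside the depth of
    field, growing (and clamped) outside it."""
    if near_m <= object_distance_m <= far_m:
        return 0.0
    gap = near_m - object_distance_m if object_distance_m < near_m else object_distance_m - far_m
    return float(min(max_blur_px, max_blur_px * gap / max(object_distance_m, 1e-6)))

# Example: user fixating at 2 m with a 3 mm pupil.
near, far = depth_of_field(2.0, 0.003)
print(f"DoF: {near:.2f} m .. {far:.2f} m, blur at 10 m: {blur_radius_px(10.0, near, far):.1f} px")
```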
  • Publication number: 20200393686
    Abstract: The present invention relates to a lens for eye-tracking applications. The lens comprises a first protective layer, arranged to face towards the eye to be tracked when the lens is used for eye-tracking. It also comprises at least one light source, at least partly arranged in the first protective layer, arranged to emit a first light from the first protective layer in a direction towards the eye. Moreover, it comprises at least one image capturing device, at least partly arranged in the first protective layer, arranged to receive the first light within the first protective layer. The lens further comprises an absorptive layer, arranged on the far side of the first protective layer seen from the eye to be tracked when the lens is used for eye-tracking, adapted to be absorptive for wavelengths of the majority of the first light.
    Type: Application
    Filed: January 31, 2020
    Publication date: December 17, 2020
    Applicant: Tobii AB
    Inventors: Axel Tollin, Daniel Ljunggren
  • Publication number: 20200394766
    Abstract: There are provided systems, methods and computer program products for generating motion blur on image frames, comprising: obtaining gaze data related to an eye movement between consecutive images, determining movement of at least one object in relation to said gaze data by calculating the difference in position of said at least one object and said gaze data between the image frames, forming a motion blur vector and applying a motion blur on an image frame based on said motion blur vector.
    Type: Application
    Filed: March 30, 2020
    Publication date: December 17, 2020
    Applicant: Tobii AB
    Inventor: Denny Rönngren
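A small sketch of the motion-blur idea in publication 20200394766: the blur vector is the object's displacement between frames minus the gaze displacement, and a simple line blur is applied along it. The wrap-around blur kernel and the pixel coordinates are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def motion_blur_vector(obj_prev, obj_curr, gaze_prev, gaze_curr):
    """Motion of the object relative to the gaze point between two image
    frames (all positions in pixels): objects that move with the gaze get
    little blur, objects moving against it get more."""
    obj_motion = np.asarray(obj_curr, float) - np.asarray(obj_prev, float)
    gaze_motion = np.asarray(gaze_curr, float) - np.asarray(gaze_prev, float)
    return obj_motion - gaze_motion

def apply_line_blur(image, vector, taps=8):
    """Average the image along the blur vector (a simple wrap-around line
    blur; a renderer would typically blur per object, not the whole frame)."""
    out = np.zeros_like(image, dtype=float)
    for i in range(taps):
        t = i / max(taps - 1, 1) - 0.5                 # sample along [-0.5, 0.5] of the vector
        dx, dy = int(round(vector[0] * t)), int(round(vector[1] * t))
        out += np.roll(np.roll(image, dy, axis=0), dx, axis=1).astype(float)
    return (out / taps).astype(image.dtype)

# Example: the object moved 12 px right while the gaze moved only 2 px right.
frame = np.random.randint(0, 255, (120, 160), dtype=np.uint8)
v = motion_blur_vector((40, 60), (52, 60), (80, 60), (82, 60))
print("blur vector:", v, "blurred frame:", apply_line_blur(frame, v).shape)
```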
  • Patent number: 10867252
    Abstract: A method for forming an offset model is described. The offset model represents an estimated offset between a limbus center of a user eye and a pupil center of the user eye as a function of pupil size. The approach includes sampling a set of limbus center values, sampling a set of pupil center values, and sampling a set of radius values. The offset model is formed by comparing a difference between the set of limbus center values and the set of pupil center values at each of the radius values. A system and a computer-readable storage device configured to perform such a method are also disclosed.
    Type: Grant
    Filed: December 21, 2018
    Date of Patent: December 15, 2020
    Assignee: Tobii AB
    Inventor: Erik Lindén
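A minimal sketch of the offset-model idea in patent 10867252, assuming a per-axis linear fit of the limbus-to-pupil offset against pupil radius; the linear form and the synthetic sample values are assumptions, not the patented method.

```python
import numpy as np

def fit_offset_model(limbus_centers, pupil_centers, pupil_radii):
    """Fit a per-axis linear model offset(r) = a*r + b to the difference
    between limbus center and pupil center, sampled at several pupil radii.
    Inputs are arrays of shape (N, 2), (N, 2) and (N,)."""
    offsets = np.asarray(limbus_centers, float) - np.asarray(pupil_centers, float)
    r = np.asarray(pupil_radii, float)
    # One least-squares fit per image axis (x and y).
    return [np.polyfit(r, offsets[:, axis], deg=1) for axis in range(2)]

def predict_offset(model, pupil_radius):
    """Estimated limbus-to-pupil offset for a given pupil radius."""
    return np.array([np.polyval(c, pupil_radius) for c in model])

# Example with synthetic samples (millimetres in the image plane).
radii = np.array([1.5, 2.0, 2.5, 3.0, 3.5])
pupil = np.zeros((5, 2))
limbus = pupil + np.outer(0.05 * radii, [1.0, 0.2])    # offset grows with pupil size
model = fit_offset_model(limbus, pupil, radii)
print("offset at r=2.8:", predict_offset(model, 2.8))
```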
  • Publication number: 20200387220
    Abstract: The present disclosure generally relates to interaction between a user and an apparatus, sometimes referred to as user-apparatus interaction or human-computer interaction. More specifically, the present disclosure generally relates to combined gaze-based and scanning-based control of an apparatus, such as a computer, a tablet computer, or a desktop computer. In more detail, the present disclosure presents methods, apparatuses, computer programs and carriers, which combine gaze-based control with scanning-based control for controlling the apparatus.
    Type: Application
    Filed: February 18, 2020
    Publication date: December 10, 2020
    Applicant: Tobii AB
    Inventor: Jaén Cantor
  • Publication number: 20200387218
    Abstract: The embodiments herein relate to a method and a Head-Mounted-Device (HMD) for adaptively adjusting a Head-Up-Display (HUD), wherein the HUD includes a User Interface (UI) or HUD graphics, the HMD comprising at least one eye tracker, a processor and a memory containing instructions executable by the processor, wherein the HMD is operative to: determine a fixation distance, being a distance to a fixation point a user of said HUD is fixating on; and dynamically adjust said HUD by adjusting the position of the HUD UI, in front of each eye of the user, such that the HUD UI appears to be positioned at the fixation distance.
    Type: Application
    Filed: January 30, 2020
    Publication date: December 10, 2020
    Applicant: Tobii AB
    Inventor: Geoffrey Cooper
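An illustrative sketch of the fixation-distance adjustment in publication 20200387218: the fixation distance is triangulated from the two gaze rays and the HUD UI plane is then placed at that distance in front of each eye. The ray-triangulation approach, coordinate conventions and numbers are assumptions.

```python
import numpy as np

def fixation_distance(left_origin, left_dir, right_origin, right_dir):
    """Approximate fixation distance as the depth of the closest point
    between the two gaze rays (simple midpoint triangulation)."""
    p1, d1 = np.asarray(left_origin, float), np.asarray(left_dir, float)
    p2, d2 = np.asarray(right_origin, float), np.asarray(right_dir, float)
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom if abs(denom) > 1e-9 else 0.0
    t2 = (a * e - b * d) / denom if abs(denom) > 1e-9 else 0.0
    midpoint = 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
    eye_center = 0.5 * (p1 + p2)
    return float(np.linalg.norm(midpoint - eye_center))

def hud_plane_position(eye_position, forward, distance):
    """Place the HUD UI plane for one eye at the fixation distance."""
    f = np.asarray(forward, float)
    return np.asarray(eye_position, float) + distance * f / np.linalg.norm(f)

# Example: 64 mm interpupillary distance, eyes converging on a point ~1 m away.
L, R = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
dist = fixation_distance(L, target - L, R, target - R)
print("fixation distance:", round(dist, 3), "m, HUD plane:", hud_plane_position(L, [0, 0, 1], dist))
```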
  • Publication number: 20200387757
    Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images. During the training, calibration parameters are initialized and input to the neural network, and are updated through the training. Accordingly, the network parameters of the neural network are updated based in part on the calibration parameters. Upon completion of the training, the neural network is calibrated for a user. This calibration includes initializing and inputting the calibration parameters along with calibration images showing an eye of the user to the neural network. The calibration includes updating the calibration parameters without changing the network parameters by minimizing the loss function of the neural network based on the calibration images. Upon completion of the calibration, the neural network is used to generate 3D gaze information for the user.
    Type: Application
    Filed: January 14, 2020
    Publication date: December 10, 2020
    Applicant: Tobii AB
    Inventor: Erik Lindén
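A minimal PyTorch sketch of the calibration step described in publication 20200387757: the trained network weights are frozen and only a small per-user calibration vector is optimized against calibration images. The network architecture, calibration-vector size and loss are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self, feature_dim=128, calib_dim=8):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(32 * 32, feature_dim), nn.ReLU())
        self.head = nn.Linear(feature_dim + calib_dim, 3)    # 3D gaze direction

    def forward(self, eye_image, calib):
        feats = self.backbone(eye_image.flatten(1))
        calib = calib.expand(feats.shape[0], -1)             # same user vector for the whole batch
        return self.head(torch.cat([feats, calib], dim=1))

net = GazeNet()                                              # assume weights are already trained
for p in net.parameters():
    p.requires_grad_(False)                                  # network parameters stay fixed

calib = torch.zeros(1, 8, requires_grad=True)                # per-user calibration parameters
optimizer = torch.optim.Adam([calib], lr=1e-2)

# Synthetic calibration set: images of the user's eye with known gaze targets.
images = torch.rand(16, 32, 32)
targets = torch.randn(16, 3)

for _ in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(net(images, calib), targets)
    loss.backward()                                          # gradients reach calib only
    optimizer.step()

print("calibrated parameters:", calib.detach().numpy().round(3))
```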
  • Publication number: 20200386990
    Abstract: The present invention relates to a lens for eye-tracking applications. The lens comprises a first protective layer, arranged to face towards the eye to be tracked when the lens is used for eye-tracking, and the lens further comprises at least one light source. The light source is arranged to emit a first light, and a refractive element is arranged in the light path of the at least one light source.
    Type: Application
    Filed: January 29, 2020
    Publication date: December 10, 2020
    Applicant: Tobii AB
    Inventors: Daniel Ljunggren, Anders Kingbäck
  • Publication number: 20200387221
    Abstract: A head mountable arrangement assists a subject to acquire spatial information about a surrounding environment by receiving gaze data from an eyetracker and spatial data from a distance sensor respectively. The gaze data describe an estimated point of regard of the subject, and the spatial data describe a distance between a reference point and an object in the surrounding environment. A feedback signal is provided, which indicates a distance from the subject to said object in the surrounding environment. The feedback signal is generated based on the estimated point of regard and the spatial data, and may reflect the distance to a surface element of an object that intersects a straight line between an eye-base line of the subject and the estimated point of regard, which surface element is located closer to the eye-base line than any other surface element of the objects in the surrounding environment along said straight line.
    Type: Application
    Filed: February 19, 2020
    Publication date: December 10, 2020
    Applicant: Tobii AB
    Inventors: Andrew Ratcliff, Daniel Tornéus, Eli Lundberg, Henrik Andersson
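A rough sketch of the feedback generation in publication 20200387221: among the distance sensor's 3D points, the nearest one lying approximately along the gaze direction is selected, and its distance is mapped to a feedback rate. The angular tolerance, feedback mapping and synthetic point cloud are assumptions, not details from the application.

```python
import numpy as np

def nearest_distance_along_gaze(points, eye_origin, gaze_dir, max_angle_deg=3.0):
    """Among the sensor's 3D points, find the one closest to the user that
    lies (approximately) along the gaze direction, and return its distance."""
    p = np.asarray(points, float) - np.asarray(eye_origin, float)
    d = np.asarray(gaze_dir, float)
    d = d / np.linalg.norm(d)
    dist = np.linalg.norm(p, axis=1)
    cos_angle = (p @ d) / np.maximum(dist, 1e-9)
    on_ray = cos_angle > np.cos(np.radians(max_angle_deg))
    return float(dist[on_ray].min()) if on_ray.any() else None

def feedback_rate_hz(distance_m, min_d=0.3, max_d=5.0):
    """Map distance to a pulse rate: closer objects pulse faster."""
    if distance_m is None:
        return 0.0
    clipped = np.clip(distance_m, min_d, max_d)
    return float(np.interp(clipped, [min_d, max_d], [10.0, 1.0]))

# Example: a small synthetic point cloud with an obstacle at 2 m straight ahead.
cloud = np.concatenate([np.random.uniform(-3, 3, (500, 3)) + [0, 0, 6],
                        [[0.0, 0.0, 2.0]]])
d = nearest_distance_along_gaze(cloud, eye_origin=[0, 0, 0], gaze_dir=[0, 0, 1])
print("distance:", d, "m -> feedback:", feedback_rate_hz(d), "Hz")
```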
  • Patent number: 10852531
    Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining, upon the third sum exceeding a fourth sum equal to the first sum plus a threshold amount, that the eye of the user is closed, where the threshold amount is equal to the product of a threshold fraction and the difference between the first sum and the second sum.
    Type: Grant
    Filed: June 24, 2019
    Date of Patent: December 1, 2020
    Assignee: Tobii AB
    Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
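A small sketch of the eye-openness test in patent 10852531, using pixel-intensity sums as described. It assumes the closed-eye reference frame is the brighter one (the illuminated eyelid), which is an interpretation of the abstract; the threshold fraction and synthetic frame values are illustrative.

```python
import numpy as np

def calibrate(open_frame, closed_frame, threshold_fraction=0.3):
    """Reference pixel-intensity sums from a frame with the eye open and one
    with it closed (under IR illumination the closed eyelid is assumed to
    give the brighter frame)."""
    open_sum = float(np.sum(open_frame))
    closed_sum = float(np.sum(closed_frame))
    # Threshold amount = threshold fraction x difference between the reference sums.
    threshold = threshold_fraction * abs(closed_sum - open_sum)
    return open_sum, threshold

def eye_is_closed(frame, open_sum, threshold):
    """Report the eye as closed when the current sum exceeds the open-eye
    reference sum by more than the threshold amount."""
    return float(np.sum(frame)) > open_sum + threshold

# Example with synthetic 8-bit frames.
open_eye = np.full((40, 60), 60, dtype=np.uint8)       # dark pupil/iris keeps the sum low
closed_eye = np.full((40, 60), 140, dtype=np.uint8)    # bright eyelid raises the sum
ref_sum, thr = calibrate(open_eye, closed_eye)
print(eye_is_closed(np.full((40, 60), 130, dtype=np.uint8), ref_sum, thr))   # True
print(eye_is_closed(np.full((40, 60), 70, dtype=np.uint8), ref_sum, thr))    # False
```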
  • Publication number: 20200348753
    Abstract: The invention relates to an eye tracking device comprising one or more illuminators, each illuminator comprising a light emitting side and being connected to a first circuitry carrier, and an imaging module connected to a second circuitry carrier, wherein the imaging module comprises optical arrangements. The plurality of illuminators, the imaging module and the circuitry carriers are embedded without gaps in a first material. The invention further relates to methods for manufacturing an eye tracking device with over-molded components.
    Type: Application
    Filed: June 13, 2019
    Publication date: November 5, 2020
    Applicant: Tobii AB
    Inventors: Eli Lundberg, Richard Hainzl, Daniel Tornéus
  • Patent number: 10820796
    Abstract: A method is disclosed, comprising obtaining a first angular offset between a first eye direction and a first gaze direction of an eye having a first pupil size, obtaining a second angular offset between a second eye direction and a second gaze direction of the eye having a second pupil size, and forming, based on the first angular offset and the second angular offset, a compensation model describing an estimated angular offset as a function of pupil size. A system and a device comprising a circuitry configured to perform such a method are also disclosed.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: November 3, 2020
    Assignee: Tobii AB
    Inventors: Mark Ryan, Simon Johansson, Erik Lindén
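A minimal sketch of the compensation model in patent 10820796: two (pupil size, angular offset) samples define a linear model, which is then used to correct a measured eye direction. The one-dimensional angles, linear form and numbers are illustrative assumptions.

```python
def form_compensation_model(pupil_size_1, offset_1, pupil_size_2, offset_2):
    """Linear model offset(p) = slope * p + intercept through the two samples
    (offsets in degrees, pupil sizes in mm). Real offsets are 2D; this sketch
    keeps one angular component for clarity."""
    slope = (offset_2 - offset_1) / (pupil_size_2 - pupil_size_1)
    intercept = offset_1 - slope * pupil_size_1
    return slope, intercept

def compensate_gaze(eye_direction_deg, pupil_size, model):
    """Estimated gaze direction = measured eye direction + predicted offset."""
    slope, intercept = model
    return eye_direction_deg + (slope * pupil_size + intercept)

# Example: +1.2 deg offset at a 2 mm pupil, +0.4 deg at a 6 mm pupil.
model = form_compensation_model(2.0, 1.2, 6.0, 0.4)
print("compensated direction:", compensate_gaze(10.0, 4.5, model), "deg")
```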
  • Patent number: 10809800
    Abstract: A method and a corresponding eye tracking system for providing an approximate gaze convergence distance of a user in an eye tracking system are disclosed. The method comprises determining calibration data in relation to interpupillary distance between a pupil of a left eye and a pupil of a right eye of a user, determining, based on the determined calibration data, a gaze convergence function providing an approximate gaze convergence distance of the user based on a determined interpupillary distance of the user. The method further comprises receiving, from one or more imaging devices, one or more images of the left eye and the right eye of the user, determining a current interpupillary distance of the user based on the one or more images and determining a current approximate gaze convergence distance based on the current interpupillary distance and the gaze convergence function.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: October 20, 2020
    Assignee: Tobii AB
    Inventors: Andreas Klingström, Per Fogelström
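An illustrative sketch of the convergence-distance estimate in patent 10809800: two calibration fixations at known distances define a mapping from measured interpupillary distance to an approximate convergence distance. The linear map to inverse distance and the pixel values are assumptions, not the patented calibration.

```python
def calibrate_convergence(ipd_near, dist_near, ipd_far, dist_far):
    """Fit a linear map from measured interpupillary distance (e.g. in image
    pixels) to inverse convergence distance, using two calibration fixations
    at known distances."""
    x1, y1 = ipd_near, 1.0 / dist_near
    x2, y2 = ipd_far, 1.0 / dist_far
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope, intercept

def convergence_distance(ipd_measured, model, max_distance=10.0):
    """Approximate gaze convergence distance for a current IPD measurement."""
    slope, intercept = model
    inv_d = slope * ipd_measured + intercept
    return max_distance if inv_d <= 1.0 / max_distance else 1.0 / inv_d

# Example calibration: 58 px measured at 0.4 m, 63 px measured at 3.0 m.
model = calibrate_convergence(58.0, 0.4, 63.0, 3.0)
print(round(convergence_distance(60.0, model), 2), "m")
```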
  • Patent number: 10789464
    Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames therefrom, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution being different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
    Type: Grant
    Filed: April 9, 2019
    Date of Patent: September 29, 2020
    Assignee: Tobii AB
    Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
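A toy sketch of the capture scheduling in patent 10789464: the control loop feeds the processing step a repeating period containing one full-resolution frame and several reduced-resolution frames. The period layout, resolutions and processing stubs are assumptions, not details from the patent.

```python
from itertools import cycle

# Repeating capture schedule: each period contains one full-resolution frame
# and three reduced-resolution frames (the layout here is an assumption).
PERIOD = [("full", (1280, 800)), ("low", (320, 200)),
          ("low", (320, 200)), ("low", (320, 200))]

def run_tracking_loop(frames, schedule=PERIOD):
    """Feed a repeating sequence of mixed-resolution frames to the data
    processing step; here the 'processing' just reports what it received."""
    for frame_id, (kind, resolution) in zip(frames, cycle(schedule)):
        if kind == "full":
            # Full-resolution frame: e.g. (re)detect the eyes precisely.
            print(f"frame {frame_id}: {resolution} -> eye detection")
        else:
            # Reduced-resolution frame: fast gaze update around the known eye position.
            print(f"frame {frame_id}: {resolution} -> gaze update")

run_tracking_loop(range(8))
```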
  • Publication number: 20200285379
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Application
    Filed: July 25, 2019
    Publication date: September 10, 2020
    Applicant: Tobii AB
    Inventor: Erland George-Svahn
  • Publication number: 20200285311
    Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
    Type: Application
    Filed: January 27, 2020
    Publication date: September 10, 2020
    Applicant: Tobii AB
    Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennstrom, Andreas Edling
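A minimal sketch of the bitmap lookup in publication 20200285311: each zone of interest is rasterized into a bitmap that stores the zone's property, and the gaze point is tested with a single pixel read. Rectangular zones and integer ids stand in for the projected 3D zones and are assumptions.

```python
import numpy as np

def build_zone_bitmap(width, height, zones):
    """Rasterize each zone of interest into a bitmap that stores the zone's
    property (here an integer id) at every pixel the zone projects to.
    zones: list of (zone_id, (x0, y0, x1, y1)) rectangles in screen pixels."""
    bitmap = np.zeros((height, width), dtype=np.int32)    # 0 = no zone
    for zone_id, (x0, y0, x1, y1) in zones:
        bitmap[y0:y1, x0:x1] = zone_id
    return bitmap

def gazed_zone(bitmap, gaze_x, gaze_y):
    """Return the property stored at the gaze point, or None if the gaze is
    outside every zone (or off screen)."""
    h, w = bitmap.shape
    if not (0 <= gaze_x < w and 0 <= gaze_y < h):
        return None
    value = int(bitmap[int(gaze_y), int(gaze_x)])
    return value if value != 0 else None

# Example: two rectangular zones in a 1920x1080 projected view.
bm = build_zone_bitmap(1920, 1080, [(1, (100, 100, 400, 300)), (2, (900, 500, 1200, 800))])
print(gazed_zone(bm, 250, 200))   # 1
print(gazed_zone(bm, 50, 50))     # None
```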
  • Publication number: 20200278746
    Abstract: Method for determining a current gaze direction of a user in relation to a three-dimensional (“3D”) scene, which 3D scene is sampled by a rendering function to produce a two-dimensional (“2D”) projection image of the 3D scene, which sampling is performed based on a virtual camera in turn being associated with a camera position and camera direction in the 3D scene, wherein the method comprises the steps: determining, by a gaze direction detection means, a first gaze direction of the user at a first gaze time point, which first gaze direction is related to said 3D scene; determining a virtual camera 3D transformation, which 3D transformation represents a change of a virtual camera position and/or virtual camera direction between the first gaze time point and a second sampling time point, where the second sampling time point is later than the first gaze time point; and determining the said current gaze direction as a modified gaze direction, in turn calculated based on the first gaze direction, and further calculated based on said virtual camera 3D transformation.
    Type: Application
    Filed: February 4, 2020
    Publication date: September 3, 2020
    Applicant: Tobii AB
    Inventor: Fredrik Lindh
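One plausible reading of publication 20200278746, sketched with rotation matrices: the first gaze direction is carried from the camera frame at the gaze time point into the camera frame at the later sampling time point using the determined camera transformation. The frame conventions and the yaw-only example are assumptions, not the claimed method.

```python
import numpy as np

def rotation_y(angle_deg):
    """Rotation matrix about the vertical (y) axis."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(a), 0.0, np.cos(a)]])

def modified_gaze_direction(gaze_dir_cam1, cam_rot_1, cam_rot_2):
    """Re-express a gaze direction captured in the camera frame at the first
    (gaze) time point in the camera frame at the later (sampling) time point,
    assuming the user keeps fixating the same scene-space direction."""
    gaze_world = cam_rot_1 @ np.asarray(gaze_dir_cam1, float)    # into scene space
    return cam_rot_2.T @ gaze_world                              # into the new camera frame

# Example: the virtual camera yawed 15 degrees between the gaze and sampling time points.
gaze_t1 = np.array([0.0, 0.0, 1.0])           # straight ahead at the first time point
R1, R2 = rotation_y(0.0), rotation_y(15.0)
print(np.round(modified_gaze_direction(gaze_t1, R1, R2), 3))
```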
  • Patent number: 10761603
    Abstract: A method is disclosed for providing increased accessibility for users of a computing device. The method may include analyzing content on a display device to identify a plurality of interactive elements most likely to be interacted with. The method may also include causing each of the plurality of interactive elements to be highlighted in a different manner. The method may additionally include causing a plurality of graphical elements to be displayed, where each of the plurality of graphical elements may be associated with a different interactive element and may visually correspond with highlighting of its associated interactive element. The method may moreover include determining a location of the user's gaze on the display device and causing a particular interactive element to be activated, based at least in part on the user gazing at its associated graphical element, where activation of the particular interactive element causes display of new content.
    Type: Grant
    Filed: October 16, 2018
    Date of Patent: September 1, 2020
    Assignee: Tobii AB
    Inventors: Anders Borge, Anna Belanova, Chiel van de Ruit, Chris Edson, Christopher Badman, Dmitriy Sukhorukov, Joel Ahlgren, Ole Alexander Mæhle, Ragnar Mjelde, Sveinung Thunes, Xiaohu Chen
  • Patent number: 10754663
    Abstract: According to the invention, a method for determining what hardware components are installed on a computing device is disclosed. The method may include identifying the computing device, and determining, based on the computing device, a hardware component of the computing device. The method may also include retrieving information about the hardware component, and setting, based at least in part on the information about the hardware component, a parameter for an algorithm of software on the computing device.
    Type: Grant
    Filed: April 23, 2018
    Date of Patent: August 25, 2020
    Assignee: Tobii AB
    Inventor: Henrik Eskilsson
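A small sketch of the flow in patent 10754663: identify the computing device, look up a hardware component, and set a software/algorithm parameter from its properties. The lookup table, component fields and the sampling-rate parameter are invented for illustration only.

```python
import platform

# Hypothetical database mapping device identifiers to known hardware components.
HARDWARE_DB = {
    "example-laptop-a1": {"camera": {"model": "cam-x", "fps": 60}},
    "example-laptop-b2": {"camera": {"model": "cam-y", "fps": 120}},
}

def configure_tracking(device_id, settings):
    """Set an algorithm parameter (here a sampling rate) based on the camera
    the identified device is known to contain, with a default fallback."""
    camera = HARDWARE_DB.get(device_id, {}).get("camera")
    settings["sampling_rate_hz"] = camera["fps"] if camera else 30
    return settings

# Identify the computing device (here simply by host name) and configure the software.
device_id = platform.node()
print(configure_tracking(device_id, {}))              # unknown device: 30 Hz fallback
print(configure_tracking("example-laptop-b2", {}))    # known device: 120 Hz
```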