Patents Assigned to TOBII AB
  • Publication number: 20200348753
    Abstract: The invention relates to an eye tracking device comprising one or more illuminators, each illuminator comprising a light emitting side and each being connected to a first circuitry carrier, and an imaging module connected to a second circuitry carrier, wherein the imaging module comprises optical arrangements. The illuminators, the imaging module and the circuitry carriers are embedded without gaps in a first material. The invention further relates to methods for manufacturing an eye tracking device with over-molded components.
    Type: Application
    Filed: June 13, 2019
    Publication date: November 5, 2020
    Applicant: Tobii AB
    Inventors: Eli Lundberg, Richard Hainzl, Daniel Tornéus
  • Patent number: 10820796
    Abstract: A method is disclosed, comprising obtaining a first angular offset between a first eye direction and a first gaze direction of an eye having a first pupil size, obtaining a second angular offset between a second eye direction and a second gaze direction of the eye having a second pupil size, and forming, based on the first angular offset and the second angular offset, a compensation model describing an estimated angular offset as a function of pupil size. A system and a device comprising a circuitry configured to perform such a method are also disclosed.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: November 3, 2020
    Assignee: Tobii AB
    Inventors: Mark Ryan, Simon Johansson, Erik Lindén
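The two-offset calibration described above could be sketched as fitting a compensation model from two samples. A linear form, and the pupil sizes and offsets below, are illustrative assumptions, not details from the patent:

```python
def fit_compensation_model(p1, o1, p2, o2):
    """Fit a linear model offset(p) = k * p + m from two calibration
    samples of (pupil size in mm, angular offset in degrees)."""
    k = (o2 - o1) / (p2 - p1)
    m = o1 - k * p1
    return lambda p: k * p + m

# Two hypothetical calibration samples:
# 3 mm pupil -> 1.0 deg offset, 6 mm pupil -> 2.5 deg offset.
model = fit_compensation_model(3.0, 1.0, 6.0, 2.5)
estimated_offset = model(4.0)  # estimated angular offset at a 4 mm pupil
```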
  • Patent number: 10809800
    Abstract: A method and a corresponding eye tracking system for providing an approximate gaze convergence distance of a user in an eye tracking system are disclosed. The method comprises determining calibration data in relation to interpupillary distance between a pupil of a left eye and a pupil of a right eye of a user, determining, based on the determined calibration data, a gaze convergence function providing an approximate gaze convergence distance of the user based on a determined interpupillary distance of the user. The method further comprises receiving, from one or more imaging devices, one or more images of the left eye and the right eye of the user, determining a current interpupillary distance of the user based on the one or more images and determining a current approximate gaze convergence distance based on the current interpupillary distance and the gaze convergence function.
    Type: Grant
    Filed: December 14, 2018
    Date of Patent: October 20, 2020
    Assignee: Tobii AB
    Inventors: Andreas Klingström, Per Fogelström
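One way a gaze convergence function could map a measured interpupillary distance to an approximate convergence distance is via simple two-eye geometry: as the eyes converge on a nearer point, the pupils rotate inwards and the measured IPD shrinks. The eyeball radius and yaw-only geometry below are assumptions for illustration:

```python
import math

EYEBALL_RADIUS = 0.012  # metres, hypothetical


def calibrate(ipd_far):
    """ipd_far: interpupillary distance (metres) measured while the user
    fixates a distant point, i.e. with the pupils looking straight ahead.
    Returns a gaze convergence function of the current measured IPD."""
    def convergence_distance(ipd_now):
        shrink = ipd_far - ipd_now
        if shrink <= 0:
            return float("inf")  # pupils not converged: looking far away
        # Each pupil rotates inwards by theta; the measured IPD shrinks
        # by roughly 2 * r * sin(theta).
        sin_theta = min(shrink / (2 * EYEBALL_RADIUS), 1.0)
        theta = math.asin(sin_theta)
        # tan(theta) = (ipd_far / 2) / d  =>  d = ipd_far / (2 tan theta)
        return ipd_far / (2 * math.tan(theta))
    return convergence_distance


gaze_convergence = calibrate(0.064)
```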
  • Patent number: 10789464
    Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames therefrom, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution being different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
    Type: Grant
    Filed: April 9, 2019
    Date of Patent: September 29, 2020
    Assignee: Tobii AB
    Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
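The repeating frame sequence with mixed resolutions could be sketched as a cyclic schedule; the period length and the two resolutions below are illustrative assumptions:

```python
from itertools import cycle, islice


def frame_schedule(period, high_res=(1280, 960), low_res=(320, 240)):
    """Repeating capture schedule: each period contains one frame at a
    first resolution followed by (period - 1) frames at a second,
    different resolution."""
    pattern = [high_res] + [low_res] * (period - 1)
    return cycle(pattern)


first_six = list(islice(frame_schedule(3), 6))
```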
  • Publication number: 20200285379
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze based input and gesture based user commands. A solution for touchscreen-like interaction uses gaze input and gesture based input as a complement or an alternative to touchscreen interactions with a computer device having a touchscreen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen-like environment in computer systems without a traditional touchscreen, or in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Application
    Filed: July 25, 2019
    Publication date: September 10, 2020
    Applicant: Tobii AB
    Inventor: Erland George-Svahn
  • Publication number: 20200285311
    Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
    Type: Application
    Filed: January 27, 2020
    Publication date: September 10, 2020
    Applicant: Tobii AB
    Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennstrom, Andreas Edling
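The bitmap technique above amounts to rasterizing each zone's property (here simply its id) into a pixel grid, so the gaze test becomes a constant-time lookup. A minimal sketch with rectangular zones as a stand-in for the projected 3D zones:

```python
def rasterize_zones(width, height, zones):
    """zones: mapping zone_id -> (x0, y0, x1, y1) rectangle in the
    projected view. Returns a bitmap where each pixel stores the id of
    the zone projected onto it (0 = no zone of interest)."""
    bitmap = [[0] * width for _ in range(height)]
    for zone_id, (x0, y0, x1, y1) in zones.items():
        for y in range(y0, y1):
            for x in range(x0, x1):
                bitmap[y][x] = zone_id
    return bitmap


def zone_at_gaze(bitmap, gaze_x, gaze_y):
    """O(1) lookup: which zone, if any, the detected gaze pixel hits."""
    return bitmap[gaze_y][gaze_x]


bmp = rasterize_zones(8, 8, {1: (1, 1, 4, 4), 2: (5, 5, 8, 8)})
```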
  • Publication number: 20200278746
    Abstract: Method for determining a current gaze direction of a user in relation to a three-dimensional (“3D”) scene, which 3D scene is sampled by a rendering function to produce a two-dimensional (“2D”) projection image of the 3D scene, which sampling is performed based on a virtual camera in turn being associated with a camera position and camera direction in the 3D scene, wherein the method comprises the steps: determining, by a gaze direction detection means, a first gaze direction of the user at a first gaze time point, which first gaze direction is related to said 3D scene; determining a virtual camera 3D transformation, which 3D transformation represents a change of a virtual camera position and/or virtual camera direction between the first gaze time point and a second sampling time point, where the second sampling time point is later than the first gaze time point; and determining the said current gaze direction as a modified gaze direction, in turn calculated based on the first gaze direction, and further calc
    Type: Application
    Filed: February 4, 2020
    Publication date: September 3, 2020
    Applicant: Tobii AB
    Inventor: Fredrik Lindh
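The modified gaze direction could be computed by undoing the virtual camera's pose change between the gaze time point and the later sampling time point. The sketch below restricts the camera 3D transformation to a yaw rotation, purely as a simplifying assumption:

```python
import math


def rotate_z(v, angle):
    """Rotate a 3-vector about the z-axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)


def modified_gaze(gaze_dir_t1, cam_yaw_t1, cam_yaw_t2):
    """Re-express the gaze direction detected at the first gaze time
    point in the camera frame at the later sampling time point, by
    applying the inverse of the camera's yaw change."""
    return rotate_z(gaze_dir_t1, -(cam_yaw_t2 - cam_yaw_t1))
```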
  • Patent number: 10761603
    Abstract: A method is disclosed for providing increased accessibility for users of a computing device. The method may include analyzing content on a display device to identify a plurality of interactive elements most likely to be interacted with. The method may also include causing each of the plurality of interactive elements to be highlighted in a different manner. The method may additionally include causing a plurality of graphical elements to be displayed, where each of the plurality of graphical elements may be associated with a different interactive element and may visually correspond with highlighting of its associated interactive element. The method may moreover include determining a location of the user's gaze on the display device and causing a particular interactive element to be activated, based at least in part on the user gazing at its associated graphical element, where activation of the particular interactive element causes display of new content.
    Type: Grant
    Filed: October 16, 2018
    Date of Patent: September 1, 2020
    Assignee: Tobii AB
    Inventors: Anders Borge, Anna Belanova, Chiel van de Ruit, Chris Edson, Christopher Badman, Dmitriy Sukhorukov, Joel Ahlgren, Ole Alexander Mæhle, Ragnar Mjelde, Sveinung Thunes, Xiaohu Chen
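The gaze-activation step could be sketched as matching the gaze location against the displayed graphical elements and triggering the interactive element linked to the one being gazed at. The element positions, actions, and hit radius below are hypothetical:

```python
import math


def activate_on_gaze(gaze_pos, graphical_elements, radius=20.0):
    """graphical_elements: mapping element_id -> (x, y, action), where
    (x, y) is the graphical element's on-screen position and `action`
    stands in for activating its associated interactive element.
    Returns the triggered action, or None if the gaze hits nothing."""
    for _, (x, y, action) in graphical_elements.items():
        if math.dist(gaze_pos, (x, y)) <= radius:
            return action
    return None


elements = {"A": (100, 100, "open_menu"), "B": (300, 100, "scroll_down")}
```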
  • Patent number: 10754663
    Abstract: According to the invention, a method for determining what hardware components are installed on a computing device is disclosed. The method may include identifying the computing device, and determining, based on the computing device, a hardware component of the computing device. The method may also include retrieving information about the hardware component, and setting, based at least in part on the information about the hardware component, a parameter for an algorithm of software on the computing device.
    Type: Grant
    Filed: April 23, 2018
    Date of Patent: August 25, 2020
    Assignee: Tobii AB
    Inventor: Henrik Eskilsson
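The identify-then-configure flow could be sketched as two lookups: device to hardware component, then component to algorithm parameters. All table contents below are invented placeholders:

```python
# Hypothetical lookup table: device model -> installed hardware components
HARDWARE_DB = {"laptop-x1": {"camera": "ir-cam-a", "fps": 60}}

# Hypothetical per-component parameters for the tracking algorithm
ALGORITHM_PARAMS = {"ir-cam-a": {"exposure_ms": 8, "glint_threshold": 180}}


def configure(device_id):
    """Identify the computing device, resolve its camera component,
    retrieve information about it, and return the algorithm parameters
    to set for the software on that device."""
    component = HARDWARE_DB[device_id]["camera"]
    return ALGORITHM_PARAMS[component]
```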
  • Publication number: 20200257356
    Abstract: The present disclosure relates to a method for calibrating a camera of a head-mounted display, HMD. The method comprises providing a calibration target in front of a lens of the HMD. The calibration target and the lens each extend in a corresponding two-dimensional plane. The method further comprises determining a lateral position of the calibration target. The lateral position relates to a position of the calibration target in the two-dimensional plane. The method even further comprises determining a lateral position of the lens. The lateral position relates to a position of the lens in the two-dimensional plane. The method yet even further comprises determining a calibration target misalignment based on the determined lateral position of the calibration target and based on the determined lateral position of the lens. The method also comprises performing a hardware calibration of the HMD. The hardware calibration is adapted for the calibration target misalignment.
    Type: Application
    Filed: May 29, 2019
    Publication date: August 13, 2020
    Applicant: Tobii AB
    Inventor: Mikael Rosell
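The misalignment step reduces to a 2D difference between the two lateral positions; how the hardware calibration consumes it is not specified, so the principal-point shift and pixels-per-millimetre factor below are purely illustrative:

```python
def lateral_misalignment(target_pos, lens_pos):
    """Misalignment (dx, dy) of the calibration target relative to the
    lens, both expressed in the shared two-dimensional plane."""
    return (target_pos[0] - lens_pos[0], target_pos[1] - lens_pos[1])


def adapt_calibration(camera_params, misalignment):
    """Hypothetical adaptation: shift the camera's principal point to
    compensate for the measured target misalignment."""
    px_per_mm = camera_params["px_per_mm"]
    cx = camera_params["cx"] - misalignment[0] * px_per_mm
    cy = camera_params["cy"] - misalignment[1] * px_per_mm
    return {**camera_params, "cx": cx, "cy": cy}
```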
  • Publication number: 20200257359
    Abstract: The present disclosure relates to an eye tracking method as well as a corresponding system and computer-readable storage medium. Calibration is performed. The calibration comprises estimating values for a plurality of parameters defining respective optical properties of an eye. The plurality of parameters includes a radius of a cornea of the eye. Eye tracking for the eye is performed using the values estimated at the calibration. An updated value of the cornea radius is estimated. Eye tracking for the eye is performed using the updated value of the cornea radius and a value estimated at the calibration for a parameter other than the cornea radius of the eye. The cornea radius of an eye may change over time. This may cause degradation of eye tracking performance. This problem may be reduced or prevented by estimating an updated value of the cornea radius.
    Type: Application
    Filed: December 23, 2019
    Publication date: August 13, 2020
    Applicant: Tobii AB
    Inventor: Andreas Klingström
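The key point above is that only the cornea radius is re-estimated while the other calibrated parameters are retained. A minimal sketch, with invented parameter names and values:

```python
def calibrate_eye():
    """Hypothetical calibration result: estimated values for parameters
    defining the eye's optical properties (illustrative numbers)."""
    return {"cornea_radius": 7.8e-3,
            "pupil_plane_depth": 3.5e-3,
            "fovea_offset_deg": 5.0}


def update_cornea_radius(params, new_radius):
    """Replace only the cornea radius estimate; every other parameter
    estimated at calibration is kept for continued eye tracking."""
    updated = dict(params)
    updated["cornea_radius"] = new_radius
    return updated


calibrated = calibrate_eye()
tracking_params = update_cornea_radius(calibrated, 7.9e-3)
```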
  • Publication number: 20200257358
    Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for classifying glints using an eye tracking system of a head-mounted device, by obtaining the position of any glint present in a current image of the first eye; obtaining a dataset indicative of the respective position of at least one glint detected in one or more previously captured images of the first eye; and, for each glint present in the current image, determining if the position of the glint corresponds to the position of a glint in the dataset and, if the positions correspond, classifying the glint as a static glint. The static glints may be excluded from further processing in the eye tracking system. If there has been a movement of an eye of the user of the head-mounted device, embodiments may further comprise updating the dataset based on the current image.
    Type: Application
    Filed: December 23, 2019
    Publication date: August 13, 2020
    Applicant: Tobii AB
    Inventors: Mikael Rosell, Johansson Johansson, Johannes Kron, Macarena Garcia Romero
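The classification rule could be sketched as a position match against the history dataset: a glint that reappears at (nearly) the same position in every previous image is static. The pixel tolerance below is an assumption:

```python
def classify_glints(current, history, tol=2.0):
    """current: list of (x, y) glint positions in the current image.
    history: list of previous frames, each a list of (x, y) positions.
    A glint is 'static' if a glint within `tol` pixels of it appears
    in every previously captured frame; otherwise it is 'dynamic'."""
    labels = {}
    for gx, gy in current:
        static = all(
            any((gx - hx) ** 2 + (gy - hy) ** 2 <= tol ** 2
                for hx, hy in frame)
            for frame in history
        )
        labels[(gx, gy)] = "static" if static else "dynamic"
    return labels
```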
  • Publication number: 20200257360
    Abstract: The disclosure relates to a method performed by a computer, the method comprising visualizing a plurality of objects, each at a known 3D position, using a 3D display, determining an object of the visualized objects at which a user is watching based on a gaze point, obtaining a gaze convergence distance indicative of a depth the user is watching at, obtaining a reference distance based on the 3D position of the determined object, and calculating an updated convergence distance using the obtained gaze convergence distance and the reference distance.
    Type: Application
    Filed: December 23, 2019
    Publication date: August 13, 2020
    Applicant: Tobii AB
    Inventors: Andreas Klingström, Mattias Brand
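The reference distance follows from the watched object's known 3D position; how the update combines the two distances is not specified in the abstract, so the linear blend and its weight below are assumptions:

```python
import math


def reference_distance(eye_mid, obj_pos):
    """Reference distance from the midpoint between the eyes to the
    determined object's known 3D position."""
    return math.dist(eye_mid, obj_pos)


def update_convergence(measured, reference, weight=0.3):
    """Pull the noisy measured gaze convergence distance towards the
    reference distance; `weight` is the trust in the measurement."""
    return weight * measured + (1 - weight) * reference


ref = reference_distance((0.0, 0.0, 0.0), (0.0, 0.0, 2.0))  # object 2 m away
updated = update_convergence(2.4, ref)
```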
  • Patent number: 10739851
    Abstract: A device (1300) adapted to be worn by a user is disclosed, comprising an optical element, a light source and a sensor. The optical element is adapted to be arranged in front of an eye (1312) of the user and formed of a light-transmitting material allowing the user to see through the optical element, wherein the light source is arranged on the optical element and adapted to illuminate at least a part of the eye of the user. Further, the sensor is adapted to capture light which has been emitted from the light source and reflected on the eye. A system is also disclosed, comprising such a device and a processor adapted to determine a gaze direction of the user based on the light captured by the sensor.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: August 11, 2020
    Assignee: Tobii AB
    Inventors: Richard Hainzl, Jonas Andersson
  • Publication number: 20200250488
    Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. A scaled image is generated from a 2D image showing the user's face, based on a rough distance between the user's eyes and the camera that generated the 2D image. Image crops at different resolutions are generated from the scaled image and include a crop around each of the user's eyes and a crop around the user's face. These crops are input to the neural network. In response, the neural network outputs a distance correction and a 2D gaze vector per user eye. A corrected eye-to-camera distance is generated by correcting the rough distance based on the distance correction. A 3D gaze vector for each of the user's eyes is generated based on the corresponding 2D gaze vector and the corrected distance.
    Type: Application
    Filed: February 11, 2020
    Publication date: August 6, 2020
    Applicant: Tobii AB
    Inventor: Erik Lindén
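Two post-processing steps of the pipeline can be sketched without the network itself: applying the predicted distance correction, and lifting a 2D gaze angle pair into a 3D direction. The multiplicative correction and yaw/pitch convention are assumptions:

```python
import math


def correct_distance(rough_distance, correction_factor):
    """Apply the network's predicted distance correction to the rough
    eye-to-camera distance (assumed multiplicative here)."""
    return rough_distance * correction_factor


def gaze_2d_to_3d(yaw, pitch):
    """Turn a predicted 2D gaze vector (yaw, pitch in radians) into a
    unit 3D gaze vector in the camera frame, z pointing away."""
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z)
```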
  • Publication number: 20200225494
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user, and the second input modality is an input modality other than an eye tracker, configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Application
    Filed: December 11, 2019
    Publication date: July 16, 2020
    Applicant: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
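The switching condition could be sketched as a ray/region intersection test. The sketch below assumes the display lies in the plane z = 0 and the interactable region is an axis-aligned rectangle in that plane:

```python
def ray_display_intersection(origin, direction, region):
    """Intersect a pointing ray with the display plane z = 0 and report
    the hit point if it lies inside the interactable region
    (x0, y0, x1, y1); otherwise None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None
    t = -oz / dz
    if t < 0:
        return None  # ray points away from the display
    hx, hy = ox + t * dx, oy + t * dy
    x0, y0, x1, y1 = region
    return (hx, hy) if x0 <= hx <= x1 and y0 <= hy <= y1 else None


def select_modality(pointing_ray, region, current="eye_tracker"):
    """Switch to the second input modality when its pointing ray
    intersects the interactable region; otherwise keep the current one."""
    origin, direction = pointing_ray
    if ray_display_intersection(origin, direction, region) is not None:
        return "pointer"
    return current
```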
  • Publication number: 20200225744
    Abstract: A preliminary path for light travelling towards a camera via corneal reflection is estimated based on a preliminary position and orientation of an eye. A position where the reflection would appear in images captured by the camera is estimated. A distance is formed between a detected position of a corneal reflection of an illuminator and the estimated position. A second preliminary path for light travelling through the cornea or from the sclera towards a camera is estimated based on the preliminary position and orientation, and a position where the second preliminary path would appear to originate in images captured by this camera is estimated. A distance is formed between a detected edge of a pupil or iris and the estimated position where the second preliminary path would appear to originate. An updated position and/or orientation of the eye is determined using an objective function formed based on the formed distances.
    Type: Application
    Filed: December 16, 2019
    Publication date: July 16, 2020
    Applicant: Tobii AB
    Inventor: Erik Lindén
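The objective function combining the two kinds of formed distances (glint-position residuals and pupil/iris-edge residuals) could be assembled as a weighted sum of squares; the weights and the sum-of-squares form are assumptions:

```python
def objective(glint_residuals, edge_residuals,
              glint_weight=1.0, edge_weight=1.0):
    """Scalar objective over the formed distances: squared residuals
    between detected and estimated glint positions, plus squared
    residuals between detected pupil/iris edges and the estimated
    refracted/scleral path origins. An optimizer would minimize this
    over the eye's position and orientation."""
    return (glint_weight * sum(d * d for d in glint_residuals)
            + edge_weight * sum(d * d for d in edge_residuals))
```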
  • Publication number: 20200225745
    Abstract: A gaze tracking model is adapted to predict a gaze ray using an image of the eye. The model is trained using training data which comprises a first image of an eye, reference gaze data indicating a gaze point towards which the eye was gazing when the first image was captured, and images of an eye captured by first and second cameras at a point in time. The training comprises forming a distance between the gaze point and a gaze ray predicted by the model using the first image, forming a consistency measure based on a gaze ray predicted by the model using the image captured by the first camera and a gaze ray predicted by the model using the image captured by the second camera, forming an objective function based on at least the formed distance and the consistency measure, and training the model using the objective function.
    Type: Application
    Filed: December 16, 2019
    Publication date: July 16, 2020
    Applicant: Tobii AB
    Inventors: David Molin, Erik Lindén
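The training objective described above can be sketched for one sample: a distance term against the reference gaze point plus a consistency term between the rays predicted from the two cameras. Using cosine similarity for the consistency measure, and the weight value, are assumptions:

```python
import math


def training_loss(gaze_point, predicted_point, ray_cam1, ray_cam2,
                  consistency_weight=0.1):
    """Objective for one training sample: the distance between the
    reference gaze point and the model's prediction from the first
    image, plus a consistency penalty for disagreement between the
    gaze rays predicted from the two simultaneous camera images
    (ray directions assumed to be unit vectors)."""
    distance = math.dist(gaze_point, predicted_point)
    cos_sim = sum(a * b for a, b in zip(ray_cam1, ray_cam2))
    return distance + consistency_weight * (1.0 - cos_sim)
```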
  • Publication number: 20200225743
    Abstract: The present invention relates to a method for establishing the position of an object in relation to a camera in order to enable gaze tracking with a user watching the object, where the user is in view of the camera. The method comprises the steps of showing a known pattern, consisting of a set of stimulus points (s1, s2, . . . , sN), on the object, detecting gaze rays (g1, g2, . . . , gN) from an eye of the user as the user looks at the stimulus points (s1, s2, . . . , sN), and finding, by means of an optimizer, a position and orientation of the object in relation to the camera such that the gaze rays (g1, g2, . . . , gN) approach the stimulus points (s1, s2, . . . , sN).
    Type: Application
    Filed: December 11, 2019
    Publication date: July 16, 2020
    Applicant: Tobii AB
    Inventor: Erik Lindén
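The quantity an optimizer would minimize here is how far each gaze ray passes from its stimulus point for a candidate object pose. The sketch below restricts the pose to a translation with aligned axes, purely to keep the cost function short:

```python
import math


def point_ray_distance(point, ray_origin, ray_dir):
    """Shortest distance from a stimulus point to a gaze ray
    (ray_dir assumed to be a unit vector)."""
    v = [p - o for p, o in zip(point, ray_origin)]
    t = sum(a * b for a, b in zip(v, ray_dir))
    closest = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    return math.dist(point, closest)


def object_pose_cost(stimulus_points_local, gaze_rays, object_origin):
    """Cost for one candidate object position: sum of squared distances
    between the gaze rays and the stimulus points placed at that
    position (object axes assumed aligned with the camera's)."""
    cost = 0.0
    for s, (origin, direction) in zip(stimulus_points_local, gaze_rays):
        world_point = [a + b for a, b in zip(s, object_origin)]
        cost += point_ray_distance(world_point, origin, direction) ** 2
    return cost
```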
  • Patent number: 10712817
    Abstract: Technologies for improving foveated rendering of an image by improving the position of the image to be displayed through image re-projection are disclosed. For example, a method may include receiving a first estimation of a predicted gaze point of a user on a display device that is determined before starting rendering a high-quality portion of the image. The method may further include causing the image to be rendered based on the first estimation of the predicted gaze point. The method may also include receiving a second estimation of the predicted gaze point. The second estimation of the predicted gaze point is determined after rendering of the high-quality portion of the image has started. Responsive to determining that the second estimation of the predicted gaze point is different from the first estimation, the method may include adjusting the rendered image based on the second estimation of the predicted gaze point and transmitting the adjusted image to the display device for display.
    Type: Grant
    Filed: June 27, 2019
    Date of Patent: July 14, 2020
    Assignee: Tobii AB
    Inventor: Denny Alexander Rönngren
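The adjustment step could be sketched as translating the already-rendered high-quality region by the gap between the two gaze-point estimates. The pixels-per-degree factor and bounding-box representation are illustrative assumptions:

```python
def reproject_offset(first_gaze, second_gaze, pixels_per_degree=30.0):
    """Pixel offset to apply to the rendered image when the second
    estimation of the predicted gaze point differs from the first one
    used when rendering started. Gaze points are (x, y) visual angles
    in degrees."""
    dx = (second_gaze[0] - first_gaze[0]) * pixels_per_degree
    dy = (second_gaze[1] - first_gaze[1]) * pixels_per_degree
    return (dx, dy)


def adjust_foveal_region(region, offset):
    """Translate the high-quality region's bounding box (x0, y0, x1, y1)
    by the re-projection offset before display."""
    x0, y0, x1, y1 = region
    dx, dy = offset
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)
```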