Patents Assigned to TOBII AB
  • Patent number: 10705600
    Abstract: An eye-tracking system comprises an illuminator that directs infrared light towards a detection region, a camera configured to generate an image of the detection region and a controller configured to detect a calibration target in the image and to detect or determine changes in the position and/or orientation of the camera relative to the calibration target. The eye-tracking system can be installed in a vehicle.
    Type: Grant
    Filed: July 21, 2015
    Date of Patent: July 7, 2020
    Assignee: Tobii AB
    Inventor: Mårten Skogö
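The relocation detection described above can be sketched as a comparison of the calibration target's detected image position against a stored reference; the pixel tolerance and the assumption that the target is detected as a single 2D point are illustrative, not from the patent:

```python
def camera_moved(ref_target_px, current_target_px, tol_px=2.0):
    """Compare the calibration target's detected image position against a
    stored reference; a shift beyond the tolerance implies the camera has
    moved relative to the target (and recalibration may be needed)."""
    dx = current_target_px[0] - ref_target_px[0]
    dy = current_target_px[1] - ref_target_px[1]
    return (dx * dx + dy * dy) ** 0.5 > tol_px

# A 10 px shift of the target in the image flags a camera relocation,
# while sub-pixel jitter does not.
moved = camera_moved((100.0, 100.0), (110.0, 100.0))
stable = camera_moved((100.0, 100.0), (100.5, 100.0))
```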
  • Patent number: 10708477
    Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
    Type: Grant
    Filed: October 23, 2018
    Date of Patent: July 7, 2020
    Assignee: Tobii AB
    Inventors: David Figgins Henderek, Mårten Skogö
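One way to read this claim is a microphone whose most-sensitive direction simply tracks the gaze angle, with gain falling off away from that direction. The cardioid-style gain model and the `sharpness` parameter below are illustrative assumptions; the patent does not specify the beam pattern:

```python
import math

def beam_gain(mic_direction_deg, source_deg, sharpness=2.0):
    """Cardioid-style gain: highest when the source lies along the
    microphone's selected direction, falling off with angular distance."""
    diff = math.radians(source_deg - mic_direction_deg)
    return ((1 + math.cos(diff)) / 2) ** sharpness

def steer_microphone(gaze_deg):
    """Alter the selected direction based on the determined gaze direction."""
    return gaze_deg  # the sensitive direction simply tracks the gaze

# A source straight ahead of the steered beam is picked up at full gain,
# while one 90 degrees off-axis is strongly attenuated.
mic = steer_microphone(gaze_deg=30.0)
on_axis = beam_gain(mic, 30.0)
off_axis = beam_gain(mic, 120.0)
```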
  • Patent number: 10699663
    Abstract: A system for providing an image on a display is disclosed. The system may include a scaler chip. The scaler chip may be configured to receive video data. The scaler chip may be configured to receive eye tracking data. The scaler chip may be configured to cause a display to present an image, where the image is based on the video data and the eye tracking data.
    Type: Grant
    Filed: February 27, 2018
    Date of Patent: June 30, 2020
    Assignee: Tobii AB
    Inventors: Carl Korobkin, Ajinkya Waghulde
  • Publication number: 20200192474
    Abstract: Disclosed is a method for calibrating an eye-tracking device to suit a user of the eye-tracking device, wherein a calibration setting of the eye-tracking device associated with a user is calculated based on acquired eye data of the user when looking at a set of reference points. The method comprises displaying a reference point of the set to the user; acquiring, by means of at least one camera of the eye-tracking device, eye data for at least one of the eyes of the user when looking at the reference point; comparing the acquired eye data to stored eye data sets related to the reference point, wherein each of the stored eye data sets is associated with a calibration setting of the eye-tracking device; and, if the acquired eye data matches one of the stored eye data sets, abandoning the calibration process and loading the calibration setting associated with the matching stored eye data set.
    Type: Application
    Filed: November 12, 2019
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Robin Thunström, Tobias Höglund
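The match-and-shortcut step can be sketched as a nearest-neighbor lookup over stored eye-data sets. The feature names (`pupil_dist`, `corneal_r`), the stored values, and the Euclidean-distance tolerance are all hypothetical; the patent does not specify what the eye data consists of or how matching is performed:

```python
import math

# Hypothetical stored profiles: eye-data feature vectors mapped to the
# calibration setting previously computed for that user.
STORED_PROFILES = [
    ({"pupil_dist": 62.0, "corneal_r": 7.8}, {"offset_x": 0.4, "offset_y": -0.2}),
    ({"pupil_dist": 58.5, "corneal_r": 7.6}, {"offset_x": -0.1, "offset_y": 0.3}),
]

def match_calibration(acquired, tolerance=0.5):
    """Return a stored calibration setting if the acquired eye data is
    close enough to a stored eye-data set; otherwise None, in which case
    the full calibration procedure would continue."""
    for stored, calibration in STORED_PROFILES:
        distance = math.sqrt(sum((acquired[k] - stored[k]) ** 2 for k in stored))
        if distance <= tolerance:
            return calibration
    return None

# Eye data close to the first stored set short-circuits calibration.
setting = match_calibration({"pupil_dist": 61.8, "corneal_r": 7.9})
unknown = match_calibration({"pupil_dist": 70.0, "corneal_r": 8.5})
```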
  • Publication number: 20200187774
    Abstract: A method for controlling illuminators in an eye tracking system and a corresponding system are disclosed. The system includes a plurality of illuminators arranged such that each illuminator of the illuminators is located at a respective fixed position in relation to an eye of a user when using the system. The method comprises illuminating the eye of the user by means of the plurality of illuminators, and receiving an image of the eye of the user from an image sensor, the image resulting from the image sensor detecting light from the plurality of illuminators reflected from the eye of the user and reflected from an optic arrangement located between the plurality of illuminators and the eye of the user.
    Type: Application
    Filed: July 9, 2019
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Pravin Rana, Daniel Tornéus, Jonas Andersson
  • Publication number: 20200192472
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: August 31, 2017
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20200192625
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: October 8, 2019
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Publication number: 20200195915
    Abstract: Images of an eye are captured at respective time instances by a camera of a head-mounted device. For each time instance, a position of a center of corneal curvature is estimated using an image captured at that time instance, a position of a pupil center is estimated using an image captured at that time instance, and a line is determined through the estimated corneal curvature center position and the estimated pupil center position. A first estimated position of a center of the eye is computed based on the lines determined for time instances in a first time period. A second estimated position of the center of the eye is computed based on the lines determined for time instances in a second time period. Relocation of the head-mounted device relative to a user's head is detected based on the first and second estimated positions of the eye center.
    Type: Application
    Filed: October 31, 2019
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Tiesheng Wang, Pravin Kumar Rana, Yimu Wang, Mark Ryan
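Combining the per-frame lines into an eye-center estimate can be sketched as a least-squares closest point to a set of 3D lines, which is one standard way to intersect near-concurrent lines; the 3 mm slippage threshold is an illustrative assumption:

```python
import numpy as np

def estimate_eye_center(points, directions):
    """Least-squares point closest to a set of 3D lines, each given by a
    point (e.g. the corneal-curvature center) and a direction (towards
    the pupil center)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector onto the line's normal plane
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

def relocated(center_a, center_b, threshold_mm=3.0):
    """Flag headset slippage when the eye-center estimates from two time
    periods disagree by more than a threshold."""
    return np.linalg.norm(center_a - center_b) > threshold_mm
```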
  • Publication number: 20200192475
    Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the first input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
    Type: Application
    Filed: February 20, 2020
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
  • Publication number: 20200192473
    Abstract: Images of an eye are captured by a camera. For each of the images, gaze data is obtained and a position of a pupil center is estimated in the image. The gaze data indicates a gaze point and/or gaze direction of the eye when the image was captured. A mapping is calibrated using the obtained gaze data and the estimated positions of the pupil center. The mapping maps positions of the pupil center in images captured by the camera to gaze points at a surface, or to gaze directions. A further image of the eye is captured by the camera. A position of the pupil center is estimated in the further image. Gaze tracking is performed using the calibrated mapping and the estimated position of the pupil center in the further image. These steps may for example be performed at a HMD.
    Type: Application
    Filed: October 31, 2019
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: Tiesheng Wang, Gilfredo Remon Salazar Salazar, Yimu Wang, Pravin Kumar Rana, Johannes Kron, Mark Ryan, Torbjörn Sundberg
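The calibrated mapping can be sketched with a 2D affine fit from pupil-center positions to gaze points, a common minimal choice; the patent does not specify the mapping's form, so the affine model and least-squares fit below are illustrative assumptions:

```python
import numpy as np

def calibrate_mapping(pupil_positions, gaze_points):
    """Fit gaze = [px, py, 1] @ coeffs by least squares, using pupil-center
    positions estimated in the images and the gaze data obtained for them."""
    P = np.column_stack([pupil_positions, np.ones(len(pupil_positions))])
    coeffs, *_ = np.linalg.lstsq(P, gaze_points, rcond=None)
    return coeffs

def track_gaze(coeffs, pupil_position):
    """Map the pupil center estimated in a further image to a gaze point
    using the calibrated mapping."""
    px, py = pupil_position
    return np.array([px, py, 1.0]) @ coeffs
```

For example, calibrating on four pupil positions whose gaze points follow an exact affine relation recovers that relation, so a new pupil position maps to the expected gaze point.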
  • Patent number: 10686972
    Abstract: A method for rotating a field of view represented by a displayed image is disclosed. The method may include displaying a first image representing a first field of view. The method may also include determining a gaze direction of a user toward the first image. The method may further include identifying a subject in the first image at which the gaze direction is directed, wherein the subject is in a first direction from a center of the first image. The method may further include receiving a directional input in a second direction. The method may additionally include, based at least in part on the second direction being substantially the same as the first direction, displaying a second image representing a second field of view, wherein the subject is centered in the second image.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: June 16, 2020
    Assignee: Tobii AB
    Inventor: Denny Rönngren
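The conditional recentering above can be sketched as an angular comparison between the gazed subject's direction from the image center and the directional input; the 30-degree tolerance for "substantially the same" and the string return values are illustrative assumptions:

```python
def directions_match(subject_dir_deg, input_dir_deg, tolerance_deg=30.0):
    """Test whether two directions are substantially the same, with
    wrap-around handled so that e.g. 350 and 10 degrees are 20 apart."""
    diff = (subject_dir_deg - input_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg

def maybe_recenter(subject_dir_deg, input_dir_deg):
    """Display a second field of view centered on the gazed subject only
    when the input direction agrees with the subject's direction from
    the center of the first image; otherwise fall back to ordinary panning."""
    if directions_match(subject_dir_deg, input_dir_deg):
        return "recenter_on_subject"
    return "default_pan"
```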
  • Publication number: 20200183490
    Abstract: Circuitry of a gaze/eye tracking system obtains one or more images of a left eye and one or more images of a right eye, determines a gaze direction of the left eye based on at least one obtained image of the left eye, determines a gaze direction of the right eye based on at least one obtained image of the right eye, determines a first confidence value based on the one or more obtained images of the left eye, determines a second confidence value based on the one or more obtained images of the right eye, and determines a final gaze direction based at least in part on the first confidence value and the second confidence value. The first and second confidence values represent indications of the reliability of the determined gaze directions of the left eye and the right eye, respectively. Corresponding methods and computer-readable media are also provided.
    Type: Application
    Filed: September 11, 2017
    Publication date: June 11, 2020
    Applicant: Tobii AB
    Inventors: Andreas Klingström, Mattias Kuldkepp, Mårten Skogö
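A confidence-weighted average is one plausible way to combine the two per-eye directions into a final gaze direction; the weighting rule and the `min_conf` cutoff below are illustrative assumptions, since the abstract only says the final direction is based on both confidence values:

```python
import numpy as np

def fuse_gaze(left_dir, right_dir, left_conf, right_conf, min_conf=0.1):
    """Confidence-weighted average of per-eye gaze directions; an eye
    whose images look unreliable contributes less, or nothing at all."""
    w_l = left_conf if left_conf >= min_conf else 0.0
    w_r = right_conf if right_conf >= min_conf else 0.0
    if w_l + w_r == 0:
        return None  # no reliable gaze this frame
    fused = (w_l * np.asarray(left_dir) + w_r * np.asarray(right_dir)) / (w_l + w_r)
    return fused / np.linalg.norm(fused)

# Equally confident, symmetric per-eye directions fuse to their bisector.
final = fuse_gaze([0.1, 0.0, 1.0], [-0.1, 0.0, 1.0], 0.8, 0.8)
```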
  • Publication number: 20200184933
    Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
    Type: Application
    Filed: December 6, 2018
    Publication date: June 11, 2020
    Applicant: Tobii AB
    Inventors: Daan Pieter Nijs, Fredrik Lindh
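The per-pixel modification can be sketched as a clamp towards the low-resolution sample when the difference meets the threshold. Halving the difference is one simple choice; the patent only requires that the updated difference be smaller than the original:

```python
def clamp_pixel(high_res_pixel, low_res_sample, threshold):
    """If the high-res pixel differs from its corresponding low-res sample
    by at least the threshold, pull it towards the sample so that the
    updated difference is smaller than the original one."""
    diff = high_res_pixel - low_res_sample
    if abs(diff) >= threshold:
        return low_res_sample + diff / 2  # halve the difference
    return high_res_pixel

# A pixel 100 levels brighter than its sample (threshold 50) is pulled
# halfway back; a pixel only 10 levels off is left unchanged.
modified = clamp_pixel(200, 100, 50)
unchanged = clamp_pixel(110, 100, 50)
```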
  • Patent number: 10678897
    Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Grant
    Filed: December 30, 2016
    Date of Patent: June 9, 2020
    Assignee: Tobii AB
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö, Mattias Gustavsson
  • Publication number: 20200174561
    Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touch-screen can thereby interact with graphical user interfaces in a touch-screen-like manner. Gaze and gesture input can also complement or replace touch-screen interaction on a device that does have a touch-screen, for example when the touch-screen is arranged ergonomically unfavorably for the user, or when it is more comfortable for the user to use gesture and gaze for the interaction than the touch-screen.
    Type: Application
    Filed: June 18, 2019
    Publication date: June 4, 2020
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Patent number: 10671890
    Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images generated by cameras and showing eyes of users while gazing at stimulus points. Some of the stimulus points are in the planes of the cameras; the remaining stimulus points are not in the planes of the cameras. The training includes inputting a first training image associated with a stimulus point in a camera plane and inputting a second training image associated with a stimulus point outside the camera plane. The training minimizes a loss function of the neural network based on a distance between at least one of the stimulus points and a gaze line.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: June 2, 2020
    Assignee: Tobii AB
    Inventor: Erik Linden
  • Publication number: 20200166994
    Abstract: Techniques for controlling light sources used in eye tracking are described. In an example, an eye tracking system generates a first image and a second image showing at least a portion of the user eye illuminated by a predetermined set of illuminators of the eye tracking system. The eye tracking system determines a first position of a glint in the first image and a second position of the glint in the second image. Each of the first position and the second position is relative to a pupil edge. The eye tracking system predicts a third position of the glint relative to the pupil edge based on the first position and the second position. Further, the eye tracking system determines, from the predetermined set, an illuminator that corresponds to the glint and determines, based on the third position, whether to power off the illuminator to generate a third image of at least the portion of the user eye.
    Type: Application
    Filed: November 26, 2018
    Publication date: May 28, 2020
    Applicant: Tobii AB
    Inventors: Daniel Johansson Tornéus, Andreas Klingström, Martin Skärbäck
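Predicting the third glint position from the first two can be sketched as linear extrapolation of the glint's position relative to the pupil edge; the extrapolation model and the "danger radius" used to decide on powering the illuminator off are illustrative assumptions:

```python
def predict_glint(pos1, pos2):
    """Linearly extrapolate the glint position (relative to the pupil
    edge) from its positions in two earlier images."""
    return tuple(2 * b - a for a, b in zip(pos1, pos2))

def should_power_off(predicted, danger_radius=1.5):
    """Power off the corresponding illuminator when the predicted glint
    would land too close to the pupil edge (the origin here), where it
    could interfere with pupil detection in the third image."""
    dist = (predicted[0] ** 2 + predicted[1] ** 2) ** 0.5
    return dist < danger_radius

# A glint moving towards the pupil edge at 2 units/frame is predicted to
# reach it, so its illuminator is switched off for the next image.
predicted = predict_glint((4.0, 0.0), (2.0, 0.0))
```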
  • Patent number: 10664051
    Abstract: A graphics presentation apparatus including a display unit, an eye-tracking module, and a data output module. The eye-tracking module registers image data representing at least one eye of a user of the apparatus. Furthermore, the eye-tracking module determines, based on the registered image data, an orientation of the at least one eye relative to the display unit. Finally, in response thereto, the eye-tracking module generates a control signal controlling the data output module to produce visual content with such orientation on the display unit that a misalignment between the orientation of at least one part of the visual content and the orientation of the at least one eye of the user is minimized.
    Type: Grant
    Filed: October 26, 2018
    Date of Patent: May 26, 2020
    Assignee: Tobii AB
    Inventors: Aron Yu, Mårten Skogö, Robert Gavelin, Per Nystedt
  • Publication number: 20200159012
    Abstract: The disclosure relates to an eye tracking device for tracking movements of an eye, comprising a viewing plane for displaying a projection of an image to the eye of a user, an image module placed on the same side of the viewing plane as the eye, at least one illuminator for illuminating the eye, a control unit adapted to receive an image captured by the image module and calculate a viewing angle of the eye, and a holographic optical element (HOE) placed between the eye and the viewing plane, wherein the image module is adapted to capture an image of the HOE, and wherein the HOE is adapted to direct at least a first portion of incident light reflected from the eye in a first angle towards the image module, the first angle being different from the angle of incidence of the incident light.
    Type: Application
    Filed: June 28, 2019
    Publication date: May 21, 2020
    Applicant: Tobii AB
    Inventors: Daniel Tornéus, Peter Schef, Magnus Arvidsson, Peter Blixt
  • Publication number: 20200159316
    Abstract: The present invention relates to systems and methods for assisting a user interacting with a graphical user interface by combining eye-based input with other input for, e.g., selection and activation of objects and object parts and execution of contextual actions related to the objects and object parts. The present invention also relates to such systems and methods in which the user can configure and customize the specific combinations of eye data input and other input that should result in a specific contextual action.
    Type: Application
    Filed: June 19, 2019
    Publication date: May 21, 2020
    Applicant: Tobii AB
    Inventors: Aron Yu, John Mikael Holtz Elvesjö