Patents Assigned to TOBII AB
-
Publication number: 20200225494
Abstract: Disclosed is a method for switching the user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user, and the second input modality is an input modality other than an eye tracker, configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality intersects the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality intersects the interactable region.
Type: Application
Filed: December 11, 2019
Publication date: July 16, 2020
Applicant: Tobii AB
Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
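The core of the method is a ray-region intersection test followed by a modality switch. Below is a minimal sketch assuming the interactable region is modeled as an oriented rectangle in 3D; the function and modality names are illustrative, not from the patent:

```python
import numpy as np

def ray_intersects_region(origin, direction, center, normal, half_extents, axes):
    """Return True if the pointing ray hits a rectangular interactable region."""
    denom = np.dot(direction, normal)
    if abs(denom) < 1e-9:            # ray parallel to the region's plane
        return False
    t = np.dot(center - origin, normal) / denom
    if t < 0:                        # region lies behind the ray origin
        return False
    hit = origin + t * direction     # intersection with the region's plane
    local = hit - center
    u = abs(np.dot(local, axes[0]))  # offset along the rectangle's two axes
    v = abs(np.dot(local, axes[1]))
    return u <= half_extents[0] and v <= half_extents[1]

def select_input_modality(current, pointer_origin, pointer_dir, region):
    """Switch from the eye tracker to the pointing modality on intersection."""
    if current == "eye_tracker" and ray_intersects_region(
            pointer_origin, pointer_dir, *region):
        return "pointer"
    return current
```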
-
Publication number: 20200225743
Abstract: The present invention relates to a method for establishing the position of an object in relation to a camera in order to enable gaze tracking with a user watching the object, where the user is in view of the camera. The method comprises the steps of showing a known pattern, consisting of a set of stimulus points (s1, s2, ..., sN), on the object; detecting gaze rays (g1, g2, ..., gN) from an eye of the user as the user looks at the stimulus points (s1, s2, ..., sN); and finding, by means of an optimizer, a position and orientation of the object in relation to the camera such that the gaze rays (g1, g2, ..., gN) approach the stimulus points (s1, s2, ..., sN).
Type: Application
Filed: December 11, 2019
Publication date: July 16, 2020
Applicant: Tobii AB
Inventor: Erik Lindén
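The optimization step can be sketched as a rigid-pose fit that minimizes the perpendicular distance from each transformed stimulus point to its gaze ray. This sketch assumes SciPy and unit-length gaze directions; it illustrates the stated problem, not the patent's particular optimizer:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def point_to_ray_distance(p, origin, direction):
    """Perpendicular distance from point p to a ray with unit direction."""
    v = p - origin
    return np.linalg.norm(v - np.dot(v, direction) * direction)

def estimate_object_pose(stimulus_points, gaze_origins, gaze_dirs):
    """Fit object rotation/translation so gaze rays pass near stimulus points.

    stimulus_points: (N, 3) points in the object's own coordinate frame.
    gaze_origins, gaze_dirs: (N, 3) gaze rays in camera coordinates.
    """
    def residuals(params):
        rot = Rotation.from_rotvec(params[:3])
        t = params[3:]
        world = rot.apply(stimulus_points) + t   # object -> camera frame
        return [point_to_ray_distance(p, o, d)
                for p, o, d in zip(world, gaze_origins, gaze_dirs)]

    result = least_squares(residuals, x0=np.zeros(6))
    return Rotation.from_rotvec(result.x[:3]), result.x[3:]
```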
-
Publication number: 20200225745
Abstract: A gaze tracking model is adapted to predict a gaze ray using an image of the eye. The model is trained using training data which comprises a first image of an eye, reference gaze data indicating a gaze point towards which the eye was gazing when the first image was captured, and images of an eye captured by first and second cameras at a point in time. The training comprises forming a distance between the gaze point and a gaze ray predicted by the model using the first image, forming a consistency measure based on a gaze ray predicted by the model using the image captured by the first camera and a gaze ray predicted by the model using the image captured by the second camera, forming an objective function based on at least the formed distance and the consistency measure, and training the model using the objective function.
Type: Application
Filed: December 16, 2019
Publication date: July 16, 2020
Applicant: Tobii AB
Inventors: David Molin, Erik Lindén
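A hedged sketch of what such an objective could look like, assuming a model object with a hypothetical predict() method returning an (origin, direction) gaze ray and a simple angular consistency term; the patent does not specify these forms:

```python
import numpy as np

def ray_point_distance(ray_origin, ray_dir, point):
    """Distance between the reference gaze point and a predicted gaze ray."""
    v = point - ray_origin
    return np.linalg.norm(v - np.dot(v, ray_dir) * ray_dir)

def consistency_measure(ray_a, ray_b):
    """Disagreement between the two per-camera predictions (0 when aligned)."""
    return 1.0 - np.dot(ray_a[1], ray_b[1])

def objective(model, sample, weight=1.0):
    """Objective combining gaze-point distance and cross-camera consistency.

    model.predict is a hypothetical API returning (origin, direction).
    """
    ray = model.predict(sample["first_image"])
    ray_cam1 = model.predict(sample["camera1_image"])
    ray_cam2 = model.predict(sample["camera2_image"])
    dist = ray_point_distance(*ray, sample["gaze_point"])
    cons = consistency_measure(ray_cam1, ray_cam2)
    return dist + weight * cons
```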
-
Publication number: 20200225744
Abstract: A preliminary path for light travelling towards a camera via corneal reflection is estimated based on a preliminary position and orientation of an eye. A position where the reflection would appear in images captured by the camera is estimated. A distance is formed between a detected position of a corneal reflection of an illuminator and the estimated position. A second preliminary path for light travelling through the cornea or from the sclera towards a camera is estimated based on the preliminary position and orientation, and a position where the second preliminary path would appear to originate in images captured by this camera is estimated. A distance is formed between a detected edge of a pupil or iris and the estimated position where the second preliminary path would appear to originate. An updated position and/or orientation of the eye is determined using an objective function formed based on the formed distances.
Type: Application
Filed: December 16, 2019
Publication date: July 16, 2020
Applicant: Tobii AB
Inventor: Erik Lindén
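The objective function could combine the two image-space residuals roughly as follows. The projection routines passed in are hypothetical stand-ins for the ray-traced glint and pupil-edge predictions described above:

```python
import numpy as np
from scipy.optimize import minimize

def eye_pose_objective(params, detected_glints, predict_glints,
                       detected_pupil_edge, predict_pupil_edge):
    """Sum of squared image-space distances for glints and pupil-edge points.

    params: preliminary eye position (3) plus orientation (2 gaze angles).
    predict_glints / predict_pupil_edge: hypothetical functions projecting
    the corneal reflections and the refracted pupil edge into the image
    for a given eye pose.
    """
    glint_err = detected_glints - predict_glints(params)
    edge_err = detected_pupil_edge - predict_pupil_edge(params)
    return np.sum(glint_err ** 2) + np.sum(edge_err ** 2)

# Minimal update step, assuming the prediction functions exist:
# result = minimize(eye_pose_objective, x0=preliminary_pose,
#                   args=(glints, predict_glints, edge, predict_pupil_edge))
```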
-
Patent number: 10712817
Abstract: Technologies for improving foveated rendering of an image by improving the position of the image to be displayed through image re-projection are disclosed. For example, a method may include receiving a first estimation of a predicted gaze point of a user on a display device, determined before rendering of a high-quality portion of the image starts. The method may further include causing the image to be rendered based on the first estimation of the predicted gaze point. The method may also include receiving a second estimation of the predicted gaze point, determined after rendering of the high-quality portion of the image has started. Responsive to determining that the second estimation of the predicted gaze point is different from the first estimation, the method may include adjusting the rendered image based on the second estimation of the predicted gaze point and transmitting the adjusted image to the display device for display.
Type: Grant
Filed: June 27, 2019
Date of Patent: July 14, 2020
Assignee: Tobii AB
Inventor: Denny Alexander Rönngren
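One crude reading of the adjustment step is a translation of the high-quality region by the delta between the two gaze predictions; the actual re-projection may be more sophisticated, and all names here are illustrative:

```python
import numpy as np

def reproject_foveated_image(frame, hq_rect, first_gaze, second_gaze):
    """Shift the high-quality region toward an updated gaze prediction.

    frame: H x W x 3 image whose high-quality portion was rendered around
    first_gaze; hq_rect: (x, y, w, h) of that portion in pixels.
    """
    dx = int(round(second_gaze[0] - first_gaze[0]))
    dy = int(round(second_gaze[1] - first_gaze[1]))
    x, y, w, h = hq_rect
    out = frame.copy()
    patch = frame[y:y + h, x:x + w]
    nx = np.clip(x + dx, 0, frame.shape[1] - w)  # keep the patch on-screen
    ny = np.clip(y + dy, 0, frame.shape[0] - h)
    out[ny:ny + h, nx:nx + w] = patch            # re-position the sharp region
    return out
```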
-
Patent number: 10705600
Abstract: An eye-tracking system comprises an illuminator that directs infrared light towards a detection region, a camera configured to generate an image of the detection region and a controller configured to detect a calibration target in the image and to detect or determine changes in the position and/or orientation of the camera relative to the calibration target. The eye-tracking system can be installed in a vehicle.
Type: Grant
Filed: July 21, 2015
Date of Patent: July 7, 2020
Assignee: Tobii AB
Inventor: Mårten Skogö
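A minimal sketch of the change-detection idea, assuming the calibration target's image position has already been detected; the tolerance value is an arbitrary placeholder:

```python
import numpy as np

def detect_camera_drift(reference_target_pos, current_target_pos,
                        tolerance_px=2.0):
    """Flag a change in camera pose from movement of a fixed calibration target.

    Positions are (x, y) image coordinates of the detected target; a shift
    beyond tolerance_px suggests the camera moved relative to the target.
    """
    shift = np.linalg.norm(np.subtract(current_target_pos,
                                       reference_target_pos))
    return shift > tolerance_px, shift
```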
-
Patent number: 10708477
Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
Type: Grant
Filed: October 23, 2018
Date of Patent: July 7, 2020
Assignee: Tobii AB
Inventors: David Figgins Henderek, Mårten Skogö
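One common way to make a microphone array more sensitive in a selected direction is delay-and-sum beamforming. A sketch of how gaze could steer such a beam, assuming plane-wave arrival; the patent does not commit to this technique:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def steering_delays(mic_positions, gaze_direction):
    """Per-microphone delays aiming a delay-and-sum beamformer along gaze.

    mic_positions: (M, 3) microphone positions in metres.
    gaze_direction: direction vector from the gaze tracking device.
    """
    d = np.asarray(gaze_direction, dtype=float)
    d /= np.linalg.norm(d)
    # A plane wave from the gaze direction reaches each mic at a different
    # time; compensating these arrival-time offsets steers the beam.
    return mic_positions @ d / SPEED_OF_SOUND
```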
-
Patent number: 10699663
Abstract: A system for providing an image on a display is disclosed. The system may include a scaler chip. The scaler chip may be configured to receive video data. The scaler chip may be configured to receive eye tracking data. The scaler chip may be configured to cause a display to present an image, where the image is based on the video data and the eye tracking data.
Type: Grant
Filed: February 27, 2018
Date of Patent: June 30, 2020
Assignee: Tobii AB
Inventors: Carl Korobkin, Ajinkya Waghulde
-
Publication number: 20200192472
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: August 31, 2017
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
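A sketch of the two determinations, assuming OpenCV template matching stands in for the predefined-image detection and a pinhole camera model for the gaze projection; both are assumptions rather than the patent's method:

```python
import cv2
import numpy as np

def find_predefined_image(scene_image, predefined_image, threshold=0.8):
    """Locate a predefined image inside a scene image from the outward camera."""
    result = cv2.matchTemplate(scene_image, predefined_image,
                               cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)   # best-match location
    return (top_left, score) if score >= threshold else (None, score)

def gaze_point_on_scene(gaze_direction, camera_matrix):
    """Project a 3D gaze direction (camera coordinates) onto the scene image."""
    x, y, z = gaze_direction
    u = camera_matrix[0, 0] * x / z + camera_matrix[0, 2]
    v = camera_matrix[1, 1] * y / z + camera_matrix[1, 2]
    return u, v
```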
-
Publication number: 20200192625
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Application
Filed: October 8, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
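A minimal illustration of gaze-dependent audio selection, assuming on-screen content is described by rectangular regions; the names and data layout are hypothetical:

```python
def audio_for_gaze(gaze_point, audio_regions):
    """Pick the audio content tied to the on-screen region being gazed at.

    audio_regions: list of ((x, y, w, h), audio_clip) pairs, where
    audio_clip is whatever handle the audio device consumes.
    """
    gx, gy = gaze_point
    for (x, y, w, h), clip in audio_regions:
        if x <= gx < x + w and y <= gy < y + h:
            return clip
    return None   # no gaze-dependent audio for this fixation
```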
-
Publication number: 20200192474
Abstract: Disclosed is a method for calibrating an eye-tracking device to suit a user of the eye-tracking device, wherein a calibration setting of the eye-tracking device associated with a user is calculated based on acquired eye data of the user when looking at a set of reference points. The method comprises displaying a reference point of the set to the user; acquiring, by means of at least one camera of the eye-tracking device, eye data for at least one of the eyes of the user when looking at the reference point; comparing the acquired eye data to stored eye data sets related to the reference point, wherein each of the stored eye data sets is associated with a calibration setting of the eye-tracking device; and if the acquired eye data matches one of the stored eye data sets, abandoning the calibration process and loading the calibration setting associated with the matching stored eye data set.
Type: Application
Filed: November 12, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Robin Thunström, Tobias Höglund
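The matching step could be a simple nearest-neighbor search over the stored eye data sets; the feature-vector representation and distance threshold below are assumptions:

```python
import numpy as np

def match_stored_calibration(acquired, stored_sets, max_distance=0.05):
    """Return a stored calibration if the acquired eye data matches a set.

    acquired: feature vector (numpy array) of eye data for the displayed
    reference point. stored_sets: list of (features, calibration) pairs.
    """
    best = None
    best_dist = max_distance
    for features, calibration in stored_sets:
        dist = np.linalg.norm(acquired - features)
        if dist < best_dist:          # closest match within the threshold
            best, best_dist = calibration, dist
    return best   # None -> proceed with the full calibration procedure
```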
-
Publication number: 20200195915
Abstract: Images of an eye are captured at respective time instances by a camera of a head-mounted device. For each time instance, a position of a center of corneal curvature is estimated using an image captured at that time instance, a position of a pupil center is estimated using an image captured at that time instance, and a line is determined through the estimated corneal curvature center position and the estimated pupil center position. A first estimated position of a center of the eye is computed based on the lines determined for time instances in a first time period. A second estimated position of the center of the eye is computed based on the lines determined for time instances in a second time period. Relocation of the head-mounted device relative to a user's head is detected based on the first and second estimated positions of the eye center.
Type: Application
Filed: October 31, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Tiesheng Wang, Pravin Kumar Rana, Yimu Wang, Mark Ryan
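If each time instance yields a line, the eye-center estimate for a period is naturally the least-squares point closest to that bundle of lines. A sketch, with an arbitrary slippage threshold:

```python
import numpy as np

def nearest_point_to_lines(origins, directions):
    """Least-squares 3D point closest to a bundle of lines.

    Each line passes through origins[i] with unit direction directions[i];
    here the lines join estimated corneal-curvature and pupil centers.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projector onto the line's normal plane
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def detect_slippage(center_period1, center_period2, threshold_mm=2.0):
    """Flag relocation of the headset if the eye-center estimate has moved."""
    return np.linalg.norm(center_period1 - center_period2) > threshold_mm
```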
-
Publication number: 20200192473
Abstract: Images of an eye are captured by a camera. For each of the images, gaze data is obtained and a position of a pupil center is estimated in the image. The gaze data indicates a gaze point and/or gaze direction of the eye when the image was captured. A mapping is calibrated using the obtained gaze data and the estimated positions of the pupil center. The mapping maps positions of the pupil center in images captured by the camera to gaze points at a surface, or to gaze directions. A further image of the eye is captured by the camera. A position of the pupil center is estimated in the further image. Gaze tracking is performed using the calibrated mapping and the estimated position of the pupil center in the further image. These steps may for example be performed at a HMD.
Type: Application
Filed: October 31, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Tiesheng Wang, Gilfredo Remon Salazar, Yimu Wang, Pravin Kumar Rana, Johannes Kron, Mark Ryan, Torbjörn Sundberg
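The calibrated mapping could be as simple as an affine fit from pupil-center coordinates to gaze points; the patent leaves the mapping's form open, so this is one plausible instantiation:

```python
import numpy as np

def calibrate_mapping(pupil_centers, gaze_points):
    """Fit an affine map from pupil-center image positions to gaze points.

    pupil_centers, gaze_points: (N, 2) arrays from the calibration images.
    """
    X = np.hstack([pupil_centers, np.ones((len(pupil_centers), 1))])
    coeffs, *_ = np.linalg.lstsq(X, gaze_points, rcond=None)
    return coeffs                      # (3, 2) affine parameters

def track_gaze(pupil_center, coeffs):
    """Map a pupil center from a further image to an estimated gaze point."""
    return np.append(pupil_center, 1.0) @ coeffs
```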
-
Publication number: 20200192475
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the first input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
Type: Application
Filed: February 20, 2020
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
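A sketch of gaze-conditioned input remapping using plain lookup tables; the binding structure is an illustrative assumption:

```python
def resolve_action(gaze_target, raw_input, default_bindings, object_bindings):
    """Remap an input-device event based on the gazed-at virtual object.

    default_bindings: {input: action} used when no object is gazed at.
    object_bindings: {object_id: {input: action}} gaze-dependent overrides,
    so the same button press interacts with whatever the user looks at.
    """
    if gaze_target is not None and gaze_target in object_bindings:
        overrides = object_bindings[gaze_target]
        if raw_input in overrides:
            return overrides[raw_input]
    return default_bindings.get(raw_input)
```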
-
Publication number: 20200187774
Abstract: A method for controlling illuminators in an eye tracking system and a corresponding system are disclosed. The system includes a plurality of illuminators arranged such that each of the illuminators is located at a respective fixed position in relation to an eye of a user when using the system. The method comprises illuminating the eye of the user by means of the plurality of illuminators, and receiving an image of the eye of the user from an image sensor, the image resulting from the image sensor detecting light from the plurality of illuminators reflected from the eye of the user and reflected from an optic arrangement located between the plurality of illuminators and the eye of the user.
Type: Application
Filed: July 9, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Pravin Rana, Daniel Tornéus, Jonas Andersson
-
Patent number: 10686972
Abstract: A method for rotating a field of view represented by a displayed image is disclosed. The method may include displaying a first image representing a first field of view. The method may also include determining a gaze direction of a user toward the first image. The method may further include identifying a subject in the first image at which the gaze direction is directed, wherein the subject is in a first direction from a center of the first image. The method may further include receiving a directional input in a second direction. The method may additionally include, based at least in part on the second direction being substantially the same as the first direction, displaying a second image representing a second field of view, wherein the subject is centered in the second image.
Type: Grant
Filed: March 30, 2018
Date of Patent: June 16, 2020
Assignee: Tobii AB
Inventor: Denny Rönngren
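The directional comparison could be an angular test between the subject's offset from the image center and the input direction; the tolerance is a placeholder:

```python
import numpy as np

def should_center_subject(subject_offset, input_direction,
                          angle_tolerance_deg=30.0):
    """Check whether a directional input points roughly toward the subject.

    subject_offset: 2D vector from the image center to the gazed-at subject.
    input_direction: 2D vector of the directional input (e.g. a joystick).
    """
    a = subject_offset / np.linalg.norm(subject_offset)
    b = input_direction / np.linalg.norm(input_direction)
    angle = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
    # If the directions are substantially the same, re-render the field of
    # view with the subject centered.
    return angle <= angle_tolerance_deg
```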
-
Publication number: 20200183490
Abstract: Circuitry of a gaze/eye tracking system obtains one or more images of a left eye and one or more images of a right eye, determines a gaze direction of the left eye based on at least one obtained image of the left eye, determines a gaze direction of the right eye based on at least one obtained image of the right eye, determines a first confidence value based on the one or more obtained images of the left eye, determines a second confidence value based on the one or more obtained images of the right eye, and determines a final gaze direction based at least in part on the first confidence value and the second confidence value. The first and second confidence values represent indications of the reliability of the determined gaze directions of the left eye and the right eye, respectively. Corresponding methods and computer-readable media are also provided.
Type: Application
Filed: September 11, 2017
Publication date: June 11, 2020
Applicant: Tobii AB
Inventors: Andres Klingström, Mattias Kuldkepp, Mårten Skogö
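One plausible way to base the final gaze direction on the two confidence values is a confidence-weighted blend of the per-eye directions; the patent only requires that both values be taken into account:

```python
import numpy as np

def combine_gaze(dir_left, conf_left, dir_right, conf_right):
    """Confidence-weighted combination of per-eye gaze directions.

    conf_* are reliability indications in [0, 1]; the blended direction
    leans toward the eye whose images yielded the more reliable estimate.
    """
    total = conf_left + conf_right
    if total == 0:
        return None                    # neither eye is trustworthy
    combined = (conf_left * np.asarray(dir_left)
                + conf_right * np.asarray(dir_right)) / total
    return combined / np.linalg.norm(combined)
```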
-
Publication number: 20200184933
Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
Type: Application
Filed: December 6, 2018
Publication date: June 11, 2020
Applicant: Tobii AB
Inventors: Daan Pieter Nijs, Fredrik Lindh
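The modification rule can be read as clamping outlier high-resolution pixels toward their low-resolution samples. A sketch, assuming the low-resolution image has been upsampled so pixels and samples align:

```python
import numpy as np

def reduce_aliasing(high_res, low_res_upsampled, threshold):
    """Pull outlier high-res pixels toward the corresponding low-res samples.

    high_res, low_res_upsampled: float arrays of identical shape (the low
    resolution image upsampled so each pixel has a corresponding sample).
    """
    diff = high_res - low_res_upsampled
    exceed = np.abs(diff) >= threshold
    out = high_res.copy()
    # Clamp offending pixels so the updated difference is no larger in
    # magnitude than the threshold (and hence than the original difference).
    out[exceed] = low_res_upsampled[exceed] + np.sign(diff[exceed]) * threshold
    return out
```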
-
Patent number: 10678897
Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Grant
Filed: December 30, 2016
Date of Patent: June 9, 2020
Assignee: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö, Mattias Gustavsson
-
Publication number: 20200174561
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen-like manner using a combination of gaze-based input and gesture-based user commands. A solution for touch-screen-like interaction uses gaze input and gesture-based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze- and gesture-based interaction with graphical user interfaces can be used to achieve a touch-screen-like environment in computer systems without a traditional touch-screen, in computer systems having a touch-screen arranged ergonomically unfavorably for the user, or in computer systems having a touch-screen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touch-screen.
Type: Application
Filed: June 18, 2019
Publication date: June 4, 2020
Applicant: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö