Patents Assigned to TOBII AB
-
Publication number: 20210256715
Abstract: A computer-implemented method of selecting a sequence of images of a user's eye for an eye tracking application wherein each image is captured when a stationary stimulus point is displayed to the user, the method comprising: for a plurality of different pairs of the images: comparing the pair of images with each other to determine an image-score that represents a degree of difference between the compared images; and calculating a sequence-score based on the image-scores for the plurality of pairs of images.
Type: Application
Filed: September 30, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Mark Ryan, Oscar Lundqvist, Oscar Nyman
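The pairwise scoring described above lends itself to a short sketch. Here the image-score is taken to be the mean absolute pixel difference and the sequence-score the sum of the pairwise scores, with the lowest-scoring candidate selected; these metric and selection choices are assumptions, since the abstract does not fix them.

```python
import itertools
import numpy as np

def image_score(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Degree of difference between two eye images (here: mean absolute
    pixel difference; the abstract does not prescribe a specific metric)."""
    return float(np.mean(np.abs(img_a.astype(float) - img_b.astype(float))))

def sequence_score(images: list[np.ndarray]) -> float:
    """Aggregate the image-scores of all image pairs in a candidate sequence.
    Summing the pairwise scores is an assumed aggregation."""
    return sum(image_score(a, b) for a, b in itertools.combinations(images, 2))

def select_sequence(candidates: list[list[np.ndarray]]) -> list[np.ndarray]:
    # Pick the candidate sequence whose images differ least from each other,
    # e.g. to favour stable captures for a stationary stimulus point.
    return min(candidates, key=sequence_score)
```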
-
Publication number: 20210255698
Abstract: A system, a head-mounted device, a computer program, a carrier, and a method for a head-mounted device comprising an eye tracking sensor, for updating an eye tracking model in relation to an eye are disclosed. First sensor data in relation to the eye are obtained by means of the eye tracking sensor. After obtaining the first sensor data, the eye tracking sensor is moved in relation to the eye. After moving the eye tracking sensor, second sensor data in relation to the eye are obtained by means of the eye tracking sensor. The eye tracking model in relation to the eye is then updated based on the first sensor data and the second sensor data.
Type: Application
Filed: September 30, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Pravin Kumar Rana, Gerald Bianchi
-
Patent number: 11089254
Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Grant
Filed: February 20, 2020
Date of Patent: August 10, 2021
Assignee: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
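A minimal sketch of the ROI read-out idea follows. It clamps a fixed-size window around the detected eye-element position to the active sensor area and returns only those pixels; the window size and clamping behaviour are illustrative assumptions.

```python
import numpy as np

def roi_around(position: tuple[int, int], roi_size: tuple[int, int],
               sensor_shape: tuple[int, int]) -> tuple[slice, slice]:
    """Define a region of interest centred on a detected eye element
    (eye, pupil, iris or glint), clamped to the active sensor area."""
    (row, col), (h, w) = position, roi_size
    top = int(np.clip(row - h // 2, 0, sensor_shape[0] - h))
    left = int(np.clip(col - w // 2, 0, sensor_shape[1] - w))
    return slice(top, top + h), slice(left, left + w)

def read_out_roi(frame: np.ndarray, position: tuple[int, int],
                 roi_size: tuple[int, int] = (120, 160)) -> np.ndarray:
    # Reading out only the ROI pixels (rather than the full frame) is what
    # reduces read-out time and power; the ROI size here is an arbitrary example.
    rows, cols = roi_around(position, roi_size, frame.shape[:2])
    return frame[rows, cols]
```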
-
Patent number: 11073908
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
Type: Grant
Filed: February 20, 2020
Date of Patent: July 27, 2021
Assignee: Tobii AB
Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
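The gaze-dependent remapping can be pictured as a small dispatch table keyed on the gazed-at object and the received input. The table layout and the function names below are illustrative, not the patented API.

```python
def handle_input(gaze_target_id, first_input, default_actions, object_actions):
    """Dispatch an input either to its default action or, when the user's gaze
    is directed at a virtual object, to an interaction with that object.
    'default_actions' maps inputs to callables; 'object_actions' maps
    (object id, input) pairs to callables. Both are assumed structures."""
    if gaze_target_id is not None and (gaze_target_id, first_input) in object_actions:
        # Gaze is on a virtual object: the first input is remapped to an
        # interaction with that object.
        object_actions[(gaze_target_id, first_input)]()
    else:
        # No gazed-at object registered for this input: fall back to the default.
        default_actions.get(first_input, lambda: None)()
```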
-
Patent number: 11061471
Abstract: The present invention relates to a method for establishing the position of an object in relation to a camera in order to enable gaze tracking with a user watching the object, where the user is in view of the camera. The method comprises the steps of showing a known pattern, consisting of a set of stimulus points (s1, s2, ..., sN), on the object, detecting gaze rays (g1, g2, ..., gN) from an eye of the user as the user looks at the stimulus points (s1, s2, ..., sN), and finding, by means of an optimizer, a position and orientation of the object in relation to the camera such that the gaze rays (g1, g2, ..., gN) approach the stimulus points (s1, s2, ..., sN).
Type: Grant
Filed: December 11, 2019
Date of Patent: July 13, 2021
Assignee: Tobii AB
Inventor: Erik Lindén
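The abstract only requires "an optimizer"; a plain least-squares refinement over a 6-DoF pose is one way to realise it, sketched below. The parameterization (rotation vector plus translation) and the ray-to-point residual are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def ray_point_distance(origin, direction, point):
    """Shortest distance from a 3D point to a gaze ray."""
    d = direction / np.linalg.norm(direction)
    v = point - origin
    return np.linalg.norm(v - np.dot(v, d) * d)

def fit_object_pose(stimulus_points_obj, gaze_origins, gaze_directions):
    """Find the object's position and orientation in camera coordinates such
    that the stimulus points come close to the detected gaze rays."""
    def residuals(x):
        rot, t = Rotation.from_rotvec(x[:3]), x[3:]
        pts_cam = rot.apply(stimulus_points_obj) + t   # stimulus points in camera frame
        return [ray_point_distance(o, d, p)
                for o, d, p in zip(gaze_origins, gaze_directions, pts_cam)]

    result = least_squares(residuals, x0=np.zeros(6))
    return result.x  # rotation vector (3 values) followed by translation (3 values)
```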
-
Patent number: 11061473
Abstract: A method of updating a cornea model for a cornea of an eye is disclosed, as well as a corresponding system and storage medium. The method comprises controlling a display to display a stimulus at a first depth, wherein the display is capable of displaying objects at different depths, receiving first sensor data obtained by an eye tracking sensor while the stimulus is displayed at the first depth by the display, controlling the display to display a stimulus at a second depth, wherein the second depth is different than the first depth, receiving second sensor data obtained by the eye tracking sensor while the stimulus is displayed at the second depth by the display, and updating the cornea model based on the first sensor data and the second sensor data.
Type: Grant
Filed: March 30, 2020
Date of Patent: July 13, 2021
Assignee: Tobii AB
Inventors: Mark Ryan, Jonas Sjöstrand, Erik Lindén, Pravin Rana
-
Publication number: 20210208403
Abstract: A method for controlling the transparency level of a transparent displaying device arranged to display one or more virtual objects, the method comprising the steps of: obtaining a gaze point of a user; obtaining a position of at least one virtual object displayed by the displaying device; determining whether the attention of the user is directed to the virtual object based on the obtained gaze point; and if so, adjusting a transparency level of the displaying device. A system operative for controlling the transparency level in a displaying device, as well as a displaying device comprising such a system, is also disclosed.
Type: Application
Filed: December 17, 2020
Publication date: July 8, 2021
Applicant: Tobii AB
Inventor: Henrik Andersson
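A compact sketch of the control loop is shown below. The direction of the adjustment (less transparent while attending), the alpha values, and the `bounds.contains` / `set_transparency` interfaces are all assumptions for illustration.

```python
def update_display_transparency(gaze_point, virtual_objects, display,
                                focused_alpha=0.2, default_alpha=0.8):
    """Adjust a see-through display's transparency depending on whether the
    user's gaze point indicates attention on any displayed virtual object."""
    # 'obj.bounds.contains' and 'display.set_transparency' are hypothetical
    # interfaces standing in for the hit test and the display driver call.
    attending = any(obj.bounds.contains(gaze_point) for obj in virtual_objects)
    display.set_transparency(focused_alpha if attending else default_alpha)
```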
-
Patent number: 11054903
Abstract: An eyetracker obtains a digital image representing at least one eye of a subject. The eyetracker then searches for pupil candidates in the digital image according to a search algorithm and, based on the searching, determines a position for the at least one eye in the digital image. The eyetracker also obtains light-intensity information expressing an estimated amount of light energy exposing the at least one eye when registering the digital image. In response to the light-intensity information, the eyetracker determines a range of pupil sizes. The search algorithm applies the range of pupil sizes in such a manner that a detected pupil candidate must have a size within the range of pupil sizes to be accepted by the search algorithm as a valid pupil of the subject.
Type: Grant
Filed: June 26, 2020
Date of Patent: July 6, 2021
Assignee: Tobii AB
Inventor: Richard Andersson
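The light-dependent acceptance test can be illustrated as follows. The thresholds and the pupil-diameter ranges are rough physiological assumptions, not values taken from the patent.

```python
HIGH_LIGHT = 1.0   # example light-energy thresholds (arbitrary units)
LOW_LIGHT = 0.2

def pupil_size_range(light_energy: float) -> tuple[float, float]:
    """Map estimated light energy on the eye to a plausible pupil-diameter
    range in millimetres: the brighter the scene, the smaller the pupil."""
    if light_energy > HIGH_LIGHT:
        return 2.0, 4.0
    if light_energy > LOW_LIGHT:
        return 3.0, 6.0
    return 4.0, 8.0

def accept_pupil_candidate(candidate_diameter_mm: float, light_energy: float) -> bool:
    # A detected pupil candidate is only accepted as a valid pupil if its size
    # falls inside the light-dependent range.
    low, high = pupil_size_range(light_energy)
    return low <= candidate_diameter_mm <= high
```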
-
Publication number: 20210199957
Abstract: An eye-tracking system for performing a pupil-detection process, the eye-tracking system configured to: receive image-data comprising a plurality of pixel-arrays, each pixel-array having a plurality of pixel locations and an intensity-value at each of the pixel locations; for each pixel location of a region of pixel locations: define an intensity-value-set comprising the intensity-values at the pixel location for two or more of the plurality of pixel-arrays; and determine the pixel location to be an excluded pixel location if the intensity-value-set does not satisfy an intensity condition; and exclude the excluded pixel locations from the pupil-detection process.
Type: Application
Filed: December 31, 2019
Publication date: July 1, 2021
Applicant: Tobii AB
Inventors: Mikael Rosell, Simon Johansson, Johannes Kron
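A vectorised sketch of the exclusion step follows. The specific intensity condition (per-pixel standard deviation and mean across frames) and its thresholds are assumptions; the abstract only requires that some intensity condition be applied to each intensity-value-set.

```python
import numpy as np

def excluded_pixel_mask(pixel_arrays: np.ndarray, max_std: float = 5.0,
                        min_mean: float = 10.0) -> np.ndarray:
    """Given image-data as a stack of pixel-arrays with shape
    (frames, rows, cols), mark pixel locations whose intensity-value-set
    across the frames fails the intensity condition."""
    stack = pixel_arrays.astype(float)
    too_noisy = stack.std(axis=0) > max_std    # e.g. flickering reflections
    too_dark = stack.mean(axis=0) < min_mean   # locations with no useful signal
    return too_noisy | too_dark                # True = excluded from pupil detection
```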
-
Patent number: 11042205
Abstract: A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub-profile mode associated with the determined actual viewing condition.
Type: Grant
Filed: October 16, 2017
Date of Patent: June 22, 2021
Assignee: Tobii AB
Inventors: John Mikael Elvesjö, Anders Kingbäck, Gunnar Troili, Mårten Skogö, Henrik Eskilsson, Peter Blixt
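The profile-bound sub-modes amount to per-viewer configuration. The table below is a hypothetical example of such settings; the profile names, keys and values are invented for illustration only.

```python
PROFILE_SUBMODES = {
    # Hypothetical per-profile eye-tracking and power-management settings.
    "alice": {"sampling_hz": 120, "idle_timeout_s": 60,  "glasses": True},
    "bob":   {"sampling_hz": 60,  "idle_timeout_s": 300, "glasses": False},
    "guest": {"sampling_hz": 30,  "idle_timeout_s": 120, "glasses": False},
}

def select_submode(profile_name: str, wearing_glasses: bool) -> dict:
    """Pick the personalized active sub-mode for an identified viewer and
    adapt it to the sensed actual viewing condition (e.g. viewing aids)."""
    settings = dict(PROFILE_SUBMODES.get(profile_name, PROFILE_SUBMODES["guest"]))
    settings["glasses"] = wearing_glasses
    return settings
```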
-
Publication number: 20210173477
Abstract: A method for determining gaze calibration parameters for gaze estimation of a viewer using an eye-tracking system. The method comprises obtaining a set of data points including gaze tracking data of the viewer and position information of at least one target visual; selecting a first subset of the data points and determining gaze calibration parameters using said first subset. A score for the gaze calibration parameters is determined by using the gaze calibration parameters with a second subset of data points, wherein at least one data point of the subset is not included in the first subset. The score is indicative of the capability of the gaze calibration parameters to reflect position information of the at least one target visual based on the gaze tracking data. The score is compared to a candidate score and, if it is higher, the candidate calibration parameters are set to the calibration parameters and the candidate score to the score.
Type: Application
Filed: November 16, 2020
Publication date: June 10, 2021
Applicant: Tobii AB
Inventors: Patrik Barkman, David Molin
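The fit-on-one-subset, score-on-another loop resembles RANSAC-style model selection; the sketch below is one way to realise it under that assumption. The `fit` and `score` helpers, the iteration count, and the subset size are caller-supplied stand-ins, and data points are taken to be hashable (gaze sample, target position) tuples.

```python
import random

def calibrate(data_points, fit, score, iterations=50, subset_size=5):
    """Select gaze calibration parameters by repeatedly fitting on a first
    subset of data points and scoring on a second subset that contains at
    least one point not used for fitting, keeping the best-scoring candidate."""
    best_params, best_score = None, float("-inf")
    for _ in range(iterations):
        first = random.sample(data_points, subset_size)
        second = [p for p in data_points if p not in first] or data_points
        params = fit(first)          # candidate calibration parameters
        s = score(params, second)    # how well target positions are reflected
        if s > best_score:           # keep the better candidate
            best_params, best_score = params, s
    return best_params, best_score
```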
-
Publication number: 20210173479
Abstract: A method for detecting an eye event of a user using an eye tracking system, the method comprising capturing a first image of a first eye of a user, capturing an image of a second eye of the user a first period after capturing the first image of the first eye and a second period before capturing a next image of the first eye, capturing a second image of the first eye the second period after capturing the image of the second eye, determining that an eye event has occurred based on a difference between the first and second images of the first eye, and performing at least one action if it is determined that an eye event has occurred.
Type: Application
Filed: December 10, 2020
Publication date: June 10, 2021
Applicant: Tobii AB
Inventor: Andreas Klingström
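The interleaved capture order and the difference test can be sketched as below. The camera accessors, the difference function, the threshold, and the action callback are all caller-supplied stand-ins; the thresholding rule itself is an assumption.

```python
def run_interleaved_capture(capture_first_eye, capture_second_eye,
                            diff, threshold, on_event) -> bool:
    """Capture the first eye, then the second eye, then the first eye again,
    and flag an eye event when the two first-eye images differ enough."""
    first_image = capture_first_eye()          # first image of the first eye
    _second_eye_image = capture_second_eye()   # captured between the two first-eye images
    second_image = capture_first_eye()         # second image of the first eye
    if diff(first_image, second_image) > threshold:
        on_event()   # e.g. mark a blink/saccade or adapt downstream filtering
        return True
    return False
```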
-
Patent number: 11023040
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding if a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
Type: Grant
Filed: September 19, 2018
Date of Patent: June 1, 2021
Assignee: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
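The two-user attention rule reduces to a small state update, sketched below. The `content` object with `play()`/`pause()` methods is an illustrative interface, not the described device's API.

```python
def update_shared_content(first_user_on_first_display: bool,
                          second_user_on_second_display: bool,
                          content) -> None:
    """Keep shared content updating only while both users look at their
    respective displays; pause it when the second user looks away."""
    if first_user_on_first_display and second_user_on_second_display:
        content.play()    # both users attentive: continuously update the content
    elif not second_user_on_second_display:
        content.pause()   # remote user looked away: pause on the first display
```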
-
Publication number: 20210141451
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: January 22, 2021
Publication date: May 13, 2021
Applicant: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
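The two-step zoom-then-select interaction can be modelled as a tiny state machine. The zoom factor and the `display` methods (`zoom_at`, `unzoom_point`, `reset_zoom`) are hypothetical names used only to illustrate the flow.

```python
class GazeZoomSelector:
    """Two-step gaze selection: a first action enlarges the area around the
    current gaze target, a second action performs the computer function
    (e.g. a click) at the gaze position inside the enlarged view."""

    def __init__(self, display, zoom: float = 3.0):
        self.display, self.zoom, self.zoomed = display, zoom, False

    def on_action(self, gaze_point, perform_click) -> None:
        if not self.zoomed:
            # First action (e.g. touchpad press): enlarge around the gaze target.
            self.display.zoom_at(gaze_point, self.zoom)
            self.zoomed = True
        else:
            # Second action: map the gaze point in the enlarged view back to
            # screen coordinates and trigger the function with finer precision.
            target = self.display.unzoom_point(gaze_point)
            perform_click(target)
            self.display.reset_zoom()
            self.zoomed = False
```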
-
Patent number: 11003936
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling an eye tracking system to optimize eye tracking performance under different lighting conditions, by obtaining a first image captured using a camera associated with the eye tracking system, the first image comprising at least part of an iris and at least part of a pupil of an eye illuminated by an illuminator associated with the eye tracking system at a current power of illumination selected from a set of predetermined power levels; determining a contrast value between the iris and the pupil in the image; and, if the contrast value deviates less than a preset deviation threshold value from a preset minimum contrast value, setting the current power of illumination of the illuminator to the other predetermined power level in the set of predetermined power levels.
Type: Grant
Filed: June 15, 2020
Date of Patent: May 11, 2021
Assignee: Tobii AB
Inventors: Viktor Åberg, Anna Redz, Niklas Ollesson, Dineshkumar Muthusamy, Magnus Ivarsson
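The contrast check and power toggle can be sketched as below. The contrast metric (difference of mean intensities over iris and pupil masks), the numeric thresholds, and the `illuminator.power` / `set_power` interface are assumptions.

```python
import numpy as np

def adjust_illumination(image: np.ndarray, iris_mask: np.ndarray,
                        pupil_mask: np.ndarray, illuminator,
                        power_levels=(0.5, 1.0), min_contrast: float = 30.0,
                        deviation_threshold: float = 5.0) -> None:
    """Switch the illuminator to the other predetermined power level when the
    iris/pupil contrast comes within the deviation threshold of the preset
    minimum contrast value."""
    contrast = float(image[iris_mask].mean() - image[pupil_mask].mean())
    if abs(contrast - min_contrast) < deviation_threshold:
        # Contrast is too close to the minimum: use the other power level.
        other = next(p for p in power_levels if p != illuminator.power)
        illuminator.set_power(other)
```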
-
Patent number: 10996751
Abstract: A gaze tracking model is adapted to predict a gaze ray using an image of the eye. The model is trained using training data which comprises a first image of an eye, reference gaze data indicating a gaze point towards which the eye was gazing when the first image was captured, and images of an eye captured by first and second cameras at a point in time. The training comprises forming a distance between the gaze point and a gaze ray predicted by the model using the first image, forming a consistency measure based on a gaze ray predicted by the model using the image captured by the first camera and a gaze ray predicted by the model using the image captured by the second camera, forming an objective function based on at least the formed distance and the consistency measure, and training the model using the objective function.
Type: Grant
Filed: December 16, 2019
Date of Patent: May 4, 2021
Assignee: Tobii AB
Inventors: David Mohlin, Erik Lindén
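The objective described above combines a supervised distance term with a cross-camera consistency term; a simple weighted sum is one possible form, sketched here. The helper functions, the angle-based consistency measure, and the weighting are assumptions.

```python
def training_objective(model, first_image, gaze_point,
                       camera1_image, camera2_image,
                       ray_point_distance, ray_angle, weight: float = 1.0) -> float:
    """Objective combining (a) the distance between the reference gaze point
    and the gaze ray predicted from the first image and (b) a consistency
    measure between gaze rays predicted from two cameras' simultaneous images."""
    predicted_ray = model(first_image)
    distance_term = ray_point_distance(predicted_ray, gaze_point)

    ray_cam1 = model(camera1_image)
    ray_cam2 = model(camera2_image)
    consistency_term = ray_angle(ray_cam1, ray_cam2)   # small angle = consistent

    return distance_term + weight * consistency_term   # minimised during training
```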
-
Patent number: 10983359
Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
Type: Grant
Filed: December 11, 2019
Date of Patent: April 20, 2021
Assignee: Tobii AB
Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
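The switching decision hinges on a ray/region intersection test; one simple realisation projects the pointing ray onto the display plane and checks a rectangle, as sketched below. The plane placement, the rectangle representation, and the `pointing_ray()` accessor are assumptions.

```python
def ray_hit_on_plane(origin, direction, plane_z: float):
    """Intersect a pointing ray with the display plane z = plane_z and return
    the (x, y) hit point, or None if the ray does not reach the plane."""
    oz, dz = origin[2], direction[2]
    if dz == 0:
        return None
    t = (plane_z - oz) / dz
    if t < 0:
        return None
    return origin[0] + t * direction[0], origin[1] + t * direction[1]

def select_input_modality(eye_tracker, second_modality, region, plane_z: float = 0.0):
    """Switch to the second (non-gaze) modality while its pointing ray
    intersects the interactable region; 'region' is (xmin, ymin, xmax, ymax)
    on the display plane."""
    origin, direction = second_modality.pointing_ray()   # hypothetical accessor
    hit = ray_hit_on_plane(origin, direction, plane_z)
    if hit is not None:
        xmin, ymin, xmax, ymax = region
        if xmin <= hit[0] <= xmax and ymin <= hit[1] <= ymax:
            return second_modality
    return eye_tracker   # otherwise keep gaze as the selected input modality
```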
-
Publication number: 20210112107
Abstract: Data packets containing gaze data are streamed from an eyetracker to a client via a driver unit by receiving, repeatedly, gaze data packets in a first interface; and providing, repeatedly, via a second interface, gaze data packets. The client sends a request message to the driver unit. The request message defines a delivery point in time in a first time frame structure at which delivery point in time, in each frame of the first time frame structure, the gaze data packet shall be provided to the client via the second interface. An offset is calculated between a reception point in time and the delivery point in time. The reception point in time indicates when a gaze data packet is received from the eyetracker relative to the first time frame structure. An adjusted data acquisition instance is assigned based on the offset. The adjusted data acquisition instance represents a modified point in time in a second time frame structure when at least one future gaze data packet shall be produced by the eyetracker.
Type: Application
Filed: August 31, 2020
Publication date: April 15, 2021
Applicant: Tobii AB
Inventors: Anders Clausen, Daniel Tornéus
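The timing relationship can be sketched as simple modular arithmetic within a frame: measure the offset between reception and requested delivery, then shift future acquisitions accordingly. The wrap-around handling and the processing margin are assumptions, not details from the abstract.

```python
def adjusted_acquisition_instance(reception_time_in_frame: float,
                                  delivery_time_in_frame: float,
                                  frame_length: float,
                                  processing_margin: float = 0.0) -> float:
    """Compute the offset between when a gaze data packet is received and when
    the client wants it delivered, and derive an adjusted acquisition instant
    for future packets so they arrive just in time for delivery."""
    offset = (delivery_time_in_frame - reception_time_in_frame) % frame_length
    # Shift the eyetracker's future data acquisition by the offset, minus any
    # time reserved for processing in the driver unit.
    return (offset - processing_margin) % frame_length
```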
-
Publication number: 20210109591
Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
Type: Application
Filed: February 15, 2019
Publication date: April 15, 2021
Applicant: Tobii AB
Inventors: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
-
Publication number: 20210099631
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling an eye tracking system (200) to obtain a first image (500) captured under active illumination by at least one infrared, IR, illuminator (112, 113) associated with the eye tracking system (200) and at a current exposure setting, using the image sensor device (110); and, if at least one eye (100, 120) is depicted in the first image (500): define at least one region of interest, ROI, (501, 502, 503, 504) from the first image (500) comprising a group of pixels in the first image (500) representing at least a part of the depicted at least one eye (100, 120); determine a respective intensity value for each of the at least one ROI (501, 502, 503, 504); determine a second exposure setting by adjusting at least one exposure parameter of the image sensor device (110) based on the determined intensity value, or values, for the at least one ROI (501, 502, 503, 504); and set the current exposure setting to the second exposure setting.
Type: Application
Filed: September 30, 2020
Publication date: April 1, 2021
Applicant: Tobii AB
Inventors: Magnus Ivarsson, Niklas Ollesson, Viktor Åberg, Anna Redz
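The ROI-driven exposure update can be sketched as follows. Using the mean ROI intensity, a fixed target intensity, and a multiplicative exposure step are assumptions; the abstract only requires that an exposure parameter be adjusted based on the ROI intensity values.

```python
import numpy as np

def second_exposure_setting(image: np.ndarray, rois: list[tuple[slice, slice]],
                            current_exposure: float, target_intensity: float = 128.0,
                            step: float = 1.2) -> float:
    """Derive a second exposure setting from the intensity of the eye ROIs in
    an actively illuminated image; each ROI is given as (row slice, col slice)."""
    mean_intensity = float(np.mean([image[r, c].mean() for r, c in rois]))
    if mean_intensity < target_intensity:
        return current_exposure * step    # ROIs too dark: expose longer
    if mean_intensity > target_intensity:
        return current_exposure / step    # ROIs too bright: expose shorter
    return current_exposure
```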