Patents Assigned to TOBII AB
-
Publication number: 20210255462
Abstract: Computer-generated image data is presented on first and second displays of a binocular headset presuming that a user's left and right eyes are located at first and second positions relative to the first and second displays respectively. At least one updated version of the image data is presented, which is rendered presuming that at least one of the user's left and right eyes is located at a position different from the first and second positions respectively in at least one spatial dimension. In response thereto, a user-generated feedback signal is received expressing either: a quality measure of the updated version of the computer-generated image data relative to computer-generated image data presented previously; or a confirmation command. The steps of presenting the updated version of the computer-generated image data and receiving the user-generated feedback signal are repeated until the confirmation command is received.
Type: Application
Filed: December 21, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Geoffrey Cooper, Rickard Lundahl, Erik Lindén, Maria Gordon
-
Publication number: 20210255699
Abstract: An eyetracker obtains input signal components (SCR, SP) describing a respective position of each of at least one glint in a subject's eye and a position of a pupil of said eye. Based on the input signal components (SCR, SP), the eyetracker determines if a saccade is in progress, i.e. if the gaze point of the subject's eye moves rapidly from a first point (GP1) to a second point (GP2) where the gaze point is fixed. During the saccade, the eyetracker generates a tracking signal describing the gaze point of the eye based on a subset (SCR) of the input signal components, which subset (SCR) describes a cornea reference point for a subject's eye (E). After the saccade, however, the tracking signal is preferably again based on all the input signal components (SCR, SP).
Type: Application
Filed: September 30, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventor: Richard Andersson
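The signal selection described in this abstract can be sketched in a few lines. All names, the velocity-based saccade test, and the simple averaging of components are illustrative assumptions; the abstract only states that the tracking signal falls back to the cornea-reference subset (SCR) while a saccade is in progress.

```python
# Hedged sketch of saccade-aware signal selection (20210255699).
# Thresholds and the velocity heuristic are assumptions, not from the patent.

def is_saccade(prev_gaze, curr_gaze, dt, velocity_threshold=3.0):
    """Flag a saccade when the gaze point moves faster than a threshold
    (units are arbitrary here, e.g. degrees per second)."""
    dx = curr_gaze[0] - prev_gaze[0]
    dy = curr_gaze[1] - prev_gaze[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return speed > velocity_threshold

def tracking_signal(s_cr, s_p, saccade_in_progress):
    """During a saccade use only the cornea-reference (glint) component
    SCR; otherwise combine the glint and pupil components."""
    if saccade_in_progress:
        return s_cr
    return ((s_cr[0] + s_p[0]) / 2.0, (s_cr[1] + s_p[1]) / 2.0)
```

The rationale is that the pupil signal is less reliable during rapid eye movement, so the glint-only subset gives a more stable estimate mid-saccade.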
-
Publication number: 20210255700
Abstract: The present invention provides improved methods and systems for assisting a user when interacting with a graphical user interface by combining gaze based input with gesture based user commands. The present invention provides systems, devices and methods that enable a user of a computer system without a traditional touch-screen to interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. Furthermore, the present invention offers a solution for touch-screen like interaction using gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen, for instance in situations where interaction with the regular touch-screen is cumbersome or ergonomically challenging.
Type: Application
Filed: October 23, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
-
Publication number: 20210256980
Abstract: Method for voice-based interactive communication using a digital assistant, wherein the method comprises an attention detection step, in which the digital assistant detects a user attention and as a result is set into a listening mode; a speaker detection step, in which the digital assistant detects the user as a current speaker; a speech sound detection step, in which the digital assistant detects and records speech uttered by the current speaker, which speech sound detection step further comprises a lip movement detection step, in which the digital assistant detects a lip movement of the current speaker; a speech analysis step, in which the digital assistant parses said recorded speech and extracts speech-based verbal informational content from said recorded speech; and a subsequent response step, in which the digital assistant provides feedback to the user based on said recorded speech.
Type: Application
Filed: December 21, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Erland George-Svahn, Sourabh Pateriya, Onur Kurt, Deepak Akkil
-
Publication number: 20210258464
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling the exposure settings of a rolling shutter image sensor device with global reset. This is achieved by obtaining a first image captured by the image sensor device at a current exposure setting that comprises a partial readout parameter representing a number of image parts for partial readout by the image sensor device; determining an intensity value of the first image; and comparing the intensity value of the first image to a desired intensity value. If the intensity values differ by more than an allowed deviation, an updated number of image parts for partial readout is determined based on the current number of image parts and the intensity value of the first image. Thereafter, the current exposure setting is updated by setting the value of the partial readout parameter to the updated number of image parts.
Type: Application
Filed: December 21, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Viktor Åberg, Niklas Ollesson, Anna Redz, Magnus Ivarsson
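The control loop above can be sketched as a simple feedback update. The proportional scaling rule and all parameter names here are assumptions for illustration; the abstract only states that the new partial-readout count is derived from the current count and the measured intensity.

```python
# Illustrative exposure-control update (20210258464). The proportional
# rule is an assumption; the patent does not specify the update formula.

def update_partial_readout(num_parts, intensity, target, tolerance,
                           min_parts=1, max_parts=64):
    """Return an updated number of image parts for partial readout.
    More parts -> longer effective integration -> brighter image."""
    if abs(intensity - target) <= tolerance:
        return num_parts  # within the allowed deviation: keep the setting
    scaled = round(num_parts * target / max(intensity, 1e-6))
    return max(min_parts, min(max_parts, scaled))
```

For example, an image measured at half the desired intensity doubles the number of readout parts, while one at twice the desired intensity halves it, subject to the sensor's limits.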
-
Publication number: 20210256353
Abstract: Techniques for using a deep generative model to generate synthetic data sets that can be used to boost the performance of a discriminative model are described. In an example, an autoencoding generative adversarial network (AEGAN) is trained to generate the synthetic data sets. The AEGAN includes an autoencoding network and a generative adversarial network (GAN) that share a generator. The generator learns how to generate synthetic data sets based on a data distribution from a latent space. Upon training the AEGAN, the generator generates the synthetic data sets. In turn, the synthetic data sets are used to train a predictive model, such as a convolutional neural network for gaze prediction.
Type: Application
Filed: May 13, 2019
Publication date: August 19, 2021
Applicant: Tobii AB
Inventor: Mårten Nilsson
-
Publication number: 20210255698
Abstract: A system, a head-mounted device, a computer program, a carrier, and a method for a head-mounted device comprising an eye tracking sensor, for updating an eye tracking model in relation to an eye are disclosed. First sensor data in relation to the eye are obtained by means of the eye tracking sensor. After obtaining the first sensor data, the eye tracking sensor is moved in relation to the eye. After moving the eye tracking sensor, second sensor data in relation to the eye are obtained by means of the eye tracking sensor. The eye tracking model in relation to the eye is then updated based on the first sensor data and the second sensor data.
Type: Application
Filed: September 30, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Pravin Kumar Rana, Gerald Bianchi
-
Publication number: 20210256715
Abstract: A computer-implemented method of selecting a sequence of images of a user's eye for an eye tracking application wherein each image is captured when a stationary stimulus point is displayed to the user, the method comprising: for a plurality of different pairs of the images: comparing the pair of images with each other to determine an image-score that represents a degree of difference between the compared images; and calculating a sequence-score based on the image-scores for the plurality of pairs of images.
Type: Application
Filed: September 30, 2020
Publication date: August 19, 2021
Applicant: Tobii AB
Inventors: Mark Ryan, Oscar Lundqvist, Oscar Nyman
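The pairwise scoring scheme can be sketched as follows. Using mean absolute pixel difference as the image-score and the mean over all pairs as the sequence-score is my assumption; the abstract fixes neither metric nor aggregation.

```python
# Minimal sketch of sequence scoring (20210256715): per-pair image-scores
# aggregated into a sequence-score. Metric choices are illustrative.
from itertools import combinations

def image_score(img_a, img_b):
    """Mean absolute difference between two equally sized grayscale
    images given as flat lists of pixel intensities."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b)) / len(img_a)

def sequence_score(images):
    """Aggregate the image-scores over all distinct pairs of images."""
    pairs = list(combinations(images, 2))
    return sum(image_score(a, b) for a, b in pairs) / len(pairs)
```

A higher sequence-score under this metric indicates a more varied set of calibration images, which is the kind of quantity such a selection step could rank candidate sequences by.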
-
Patent number: 11089254
Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Grant
Filed: February 20, 2020
Date of Patent: August 10, 2021
Assignee: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
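Defining an ROI around a detected position reduces to a small clamping computation. The fixed ROI size and the clamp-to-sensor behaviour are illustrative assumptions; the patent abstract only says the ROI is defined around the eye-element position.

```python
# Sketch of ROI placement for confined sensor readout (11089254).
# Sizes and clamping policy are assumptions for illustration.

def roi_around(x, y, roi_w, roi_h, sensor_w, sensor_h):
    """Return (left, top, width, height) of an ROI centred on (x, y),
    shifted as needed so it stays inside the active sensor area."""
    left = min(max(x - roi_w // 2, 0), sensor_w - roi_w)
    top = min(max(y - roi_h // 2, 0), sensor_h - roi_h)
    return left, top, roi_w, roi_h
```

Reading out only these pixels instead of the full frame is what yields the shorter readout time and lower power consumption the abstract claims.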
-
Patent number: 11073908
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
Type: Grant
Filed: February 20, 2020
Date of Patent: July 27, 2021
Assignee: Tobii AB
Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
-
Patent number: 11061473
Abstract: A method of updating a cornea model for a cornea of an eye is disclosed, as well as a corresponding system and storage medium. The method comprises controlling a display to display a stimulus at a first depth, wherein the display is capable of displaying objects at different depths, receiving first sensor data obtained by an eye tracking sensor while the stimulus is displayed at the first depth by the display, controlling the display to display a stimulus at a second depth, wherein the second depth is different than the first depth, receiving second sensor data obtained by the eye tracking sensor while the stimulus is displayed at the second depth by the display, and updating the cornea model based on the first sensor data and the second sensor data.
Type: Grant
Filed: March 30, 2020
Date of Patent: July 13, 2021
Assignee: Tobii AB
Inventors: Mark Ryan, Jonas Sjöstrand, Erik Lindén, Pravin Rana
-
Patent number: 11061471
Abstract: The present invention relates to a method for establishing the position of an object in relation to a camera in order to enable gaze tracking with a user watching the object, where the user is in view of the camera. The method comprises the steps of showing a known pattern, consisting of a set of stimulus points (s1, s2, ..., sN), on the object, detecting gaze rays (g1, g2, ..., gN) from an eye of the user as the user looks at the stimulus points (s1, s2, ..., sN), and finding, by means of an optimizer, a position and orientation of the object in relation to the camera such that the gaze rays (g1, g2, ..., gN) approach the stimulus points (s1, s2, ..., sN).
Type: Grant
Filed: December 11, 2019
Date of Patent: July 13, 2021
Assignee: Tobii AB
Inventor: Erik Lindén
-
Publication number: 20210208403
Abstract: A method for controlling the transparency level of a transparent displaying device arranged to display one or more virtual objects, the method comprising the steps of: obtaining a gaze point of a user; obtaining a position of at least one virtual object displayed by the displaying device; determining whether the attention of the user is directed to the virtual object based on the obtained gaze point; and if so, adjusting a transparency level of the displaying device. A system operative for controlling the transparency level in a displaying device, as well as a displaying device comprising such a system, is also disclosed.
Type: Application
Filed: December 17, 2020
Publication date: July 8, 2021
Applicant: Tobii AB
Inventor: Henrik Andersson
-
Patent number: 11054903
Abstract: An eyetracker obtains a digital image representing at least one eye of a subject. The eyetracker then searches for pupil candidates in the digital image according to a search algorithm and, based on the searching, determines a position for the at least one eye in the digital image. The eyetracker also obtains light-intensity information expressing an estimated amount of light energy exposing the at least one eye when registering the digital image. In response to the light-intensity information, the eyetracker determines a range of pupil sizes. The search algorithm applies the range of pupil sizes in such a manner that a detected pupil candidate must have a size within the range of pupil sizes to be accepted by the search algorithm as a valid pupil of the subject.
Type: Grant
Filed: June 26, 2020
Date of Patent: July 6, 2021
Assignee: Tobii AB
Inventor: Richard Andersson
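The light-adaptive validation step can be sketched as below. The linear mapping from light level to pupil-size range and all constants are purely illustrative assumptions; the abstract only says the range is determined in response to the light-intensity information.

```python
# Sketch of light-adaptive pupil validation (11054903). The mapping and
# millimetre constants are assumptions, not values from the patent.

def pupil_size_range(light_energy, min_light=0.0, max_light=1.0):
    """Brighter light -> smaller expected pupil. Returns (lo, hi) in mm."""
    t = (light_energy - min_light) / (max_light - min_light)
    t = max(0.0, min(1.0, t))
    hi = 8.0 - 4.0 * t   # dark: up to 8 mm; bright: up to 4 mm
    lo = 4.0 - 2.0 * t   # dark: at least 4 mm; bright: at least 2 mm
    return lo, hi

def accept_pupil_candidate(diameter_mm, light_energy):
    """Accept a candidate only if its size fits the expected range."""
    lo, hi = pupil_size_range(light_energy)
    return lo <= diameter_mm <= hi
```

The design intuition is that a large "pupil" detected in bright light is probably a false positive (e.g. a shadow), so the size gate discards it cheaply before further processing.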
-
Publication number: 20210199957
Abstract: An eye-tracking system for performing a pupil-detection process, the eye-tracking system configured to: receive image-data comprising a plurality of pixel-arrays, each pixel-array having a plurality of pixel locations and an intensity-value at each of the pixel locations; for each pixel location of a region of pixel locations: define an intensity-value-set comprising the intensity-values at the pixel location for two or more of the plurality of pixel-arrays; and determine the pixel location to be an excluded pixel location if the intensity-value-set does not satisfy an intensity condition; and exclude the excluded pixel locations from the pupil-detection process.
Type: Application
Filed: December 31, 2019
Publication date: July 1, 2021
Applicant: Tobii AB
Inventors: Mikael Rosell, Simon Johansson, Johannes Kron
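The per-location exclusion test can be sketched directly from the abstract. The specific intensity condition used here, "the intensity must vary by at least some minimum range across frames", is my assumption; the abstract leaves the condition open.

```python
# Sketch of pixel-location exclusion (20210199957). The variation-based
# intensity condition is an illustrative assumption.

def excluded_locations(pixel_arrays, min_range=10):
    """pixel_arrays: list of equally shaped 2-D lists (one per frame).
    Returns the set of (row, col) locations whose intensity-value-set
    fails the condition and should be excluded from pupil detection."""
    rows, cols = len(pixel_arrays[0]), len(pixel_arrays[0][0])
    excluded = set()
    for r in range(rows):
        for c in range(cols):
            values = [frame[r][c] for frame in pixel_arrays]
            if max(values) - min(values) < min_range:
                excluded.add((r, c))
    return excluded
```

Under this condition, static regions such as sensor defects or fixed reflections are dropped, leaving the detector to search only locations whose intensity behaves plausibly across frames.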
-
Patent number: 11042205
Abstract: A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub-profile mode associated with the determined actual viewing condition.
Type: Grant
Filed: October 16, 2017
Date of Patent: June 22, 2021
Assignee: Tobii AB
Inventors: John Mikael Elvesjö, Anders Kingbäck, Gunnar Troili, Mårten Skogö, Henrik Eskilsson, Peter Blixt
-
Publication number: 20210173479
Abstract: A method for detecting an eye event of a user using an eye tracking system, the method comprising capturing a first image of a first eye of a user, capturing an image of a second eye of the user a first period after capturing the first image of the first eye and a second period before capturing a next image of the first eye, capturing a second image of the first eye the second period after capturing the image of the second eye, determining that an eye event has occurred based on a difference between the first and second images of the first eye, and performing at least one action if it is determined that an eye event has occurred.
Type: Application
Filed: December 10, 2020
Publication date: June 10, 2021
Applicant: Tobii AB
Inventor: Andreas Klingström
-
Publication number: 20210173477
Abstract: A method for determining gaze calibration parameters for gaze estimation of a viewer using an eye-tracking system. The method comprises obtaining a set of data points including gaze tracking data of the viewer and position information of at least one target visual; selecting a first subset of the data points and determining gaze calibration parameters using said first subset. A score for the gaze calibration parameters is determined by using the gaze calibration parameters with a second subset of data points, wherein at least one data point of the second subset is not included in the first subset. The score is indicative of the capability of the gaze calibration parameters to reflect position information of the at least one target visual based on the gaze tracking data. The score is compared to a candidate score and, if it is higher, the candidate calibration parameters are set to the calibration parameters and the candidate score to the score.
Type: Application
Filed: November 16, 2020
Publication date: June 10, 2021
Applicant: Tobii AB
Inventors: Patrik Barkman, David Molin
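The fit-on-one-subset, score-on-another pattern resembles held-out validation and can be sketched with a toy calibration model. The one-dimensional offset model and the negative-mean-absolute-error score are stand-ins I chose for illustration; the actual gaze model and scoring function are not specified in the abstract.

```python
# Sketch of subset-based calibration selection (20210173477), using a
# toy 1-D offset "calibration" in place of the real gaze model.

def fit_offset(data_points):
    """Fit a single offset mapping raw gaze to target positions.
    data_points: list of (gaze, target) pairs."""
    return sum(t - g for g, t in data_points) / len(data_points)

def score(offset, data_points):
    """Higher is better: negative mean absolute error on the subset."""
    return -sum(abs(g + offset - t) for g, t in data_points) / len(data_points)

def select_calibration(subsets):
    """subsets: list of (fit_subset, eval_subset) pairs. Keep the
    candidate whose parameters score best on the held-out subset."""
    best_offset, best_score = None, float("-inf")
    for fit_set, eval_set in subsets:
        offset = fit_offset(fit_set)
        s = score(offset, eval_set)
        if s > best_score:
            best_offset, best_score = offset, s
    return best_offset, best_score
```

Scoring on points not used for fitting penalizes parameters that merely overfit their own subset, which is the point of requiring the second subset to contain at least one fresh data point.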
-
Patent number: 11023040
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding if a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
Type: Grant
Filed: September 19, 2018
Date of Patent: June 1, 2021
Assignee: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
-
Publication number: 20210141451
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: January 22, 2021
Publication date: May 13, 2021
Applicant: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö