Patents Assigned to TOBII AB
  • Patent number: 11156831
    Abstract: An eye-tracking system for performing a pupil-detection process, the eye-tracking system configured to: receive image-data comprising a plurality of pixel-arrays, each pixel-array having a plurality of pixel locations and an intensity-value at each of the pixel locations; for each pixel location of a region of pixel locations: define an intensity-value-set comprising the intensity-values at the pixel location for two or more of the plurality of pixel-arrays; and determine the pixel location to be an excluded pixel location if the intensity-value-set does not satisfy an intensity condition; and exclude the excluded pixel locations from the pupil-detection process.
    Type: Grant
    Filed: December 31, 2019
    Date of Patent: October 26, 2021
    Assignee: Tobii AB
    Inventors: Mikael Rosell, Simon Johansson, Johannes Kron
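The claimed per-location exclusion step can be sketched as follows. The specific intensity condition is left open by the claim; low temporal variance of the intensity-value-set (suggesting a static reflection rather than a moving pupil) is used here purely as a hypothetical stand-in.

```python
from statistics import pvariance

def excluded_pixel_locations(frames, var_threshold=5.0):
    """Boolean mask (same shape as one frame) of pixel locations to
    exclude from pupil detection.

    frames: list of 2-D lists of intensity values (the pixel-arrays).
    The intensity condition used here -- population variance of the
    intensity-value-set below ``var_threshold`` -- is illustrative.
    """
    rows, cols = len(frames[0]), len(frames[0][0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Intensity-value-set: this location's value in every frame.
            values = [frame[r][c] for frame in frames]
            mask[r][c] = pvariance(values) < var_threshold
    return mask
```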
  • Patent number: 11144755
    Abstract: Methods and corresponding systems for controlling illuminators in an eye tracking system are disclosed. The system includes a first image sensor, a second image sensor, a first close illuminator arranged to enable capture of bright pupil images by the first image sensor, a second close illuminator arranged to enable capture of bright pupil images by the second image sensor, and one or more far illuminators arranged to enable capture of dark pupil images by the first image sensor and the second image sensor. In the methods, main and support illuminators are controlled during exposure of the first and second image sensors to produce enhanced contrast and glint positions for eye/gaze tracking.
    Type: Grant
    Filed: March 28, 2019
    Date of Patent: October 12, 2021
    Assignee: Tobii AB
    Inventors: Jonas Sjöstrand, Anders Dahl, Mattias I Karlsson
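One way to picture the illuminator roles above is a cyclic exposure plan. The strict three-phase alternation below is an assumption for illustration; the patent covers richer main/support control during exposure.

```python
def illuminator_schedule(n_frames):
    """Cyclic exposure plan: the first close illuminator yields a
    bright-pupil image on the first sensor, the second close
    illuminator a bright-pupil image on the second sensor, and the
    far illuminator dark-pupil images on both sensors.
    The three-phase cycle is illustrative only."""
    cycle = [
        {"on": "close_1", "mode": "bright_pupil", "sensors": [1]},
        {"on": "close_2", "mode": "bright_pupil", "sensors": [2]},
        {"on": "far", "mode": "dark_pupil", "sensors": [1, 2]},
    ]
    return [dict(cycle[i % 3], frame=i) for i in range(n_frames)]
```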
  • Patent number: 11138428
    Abstract: According to the invention, an image sensor is disclosed. The image sensor may include a plurality of pixels. Each pixel of a first portion of the plurality of pixels may include a near-infrared filter configured to block red, green, and blue light; and pass near-infrared light. Each pixel of a second portion of the plurality of pixels may be configured to receive at least one of red, green, or blue light; and receive near-infrared light.
    Type: Grant
    Filed: September 23, 2019
    Date of Patent: October 5, 2021
    Assignee: Tobii AB
    Inventors: Mårten Skogö, Peter Blixt, Henrik Jönsson
  • Patent number: 11138429
    Abstract: An eye-tracking system (e.g., a virtual reality or augmented reality headset) can be used for eye tracking and for iris recognition. Illuminators used to illuminate eyes of a user during eye tracking can be selectively powered on and off in connection with capturing image information in order to obtain image information that suitably depicts an iris region of an eye of the user. This image information can be used to recognize the iris region and by so doing authenticate and/or identify the user.
    Type: Grant
    Filed: March 11, 2019
    Date of Patent: October 5, 2021
    Assignee: Tobii AB
    Inventors: Henrik Eskilsson, Mårten Skogö
  • Publication number: 20210303062
    Abstract: A system for determining a gaze point of a user, the system comprising at least one sensor configured to determine at least one signal representative of a variation in a volume of the interior of a user's ear, and a processor configured to determine a direction of eye movement of the user based on the determined signal, and determine a gaze point of the user based on the direction of eye movement. Further, the disclosure relates to a corresponding method.
    Type: Application
    Filed: March 30, 2020
    Publication date: September 30, 2021
    Applicant: Tobii AB
    Inventor: Andrew Muehlhausen
  • Patent number: 11129530
    Abstract: An eye tracking system having circuitry configured to perform a method is disclosed. An estimated radius (r) from an eyeball center to a pupil center in an eye is obtained, an estimated eyeball center position (e) in the eye in relation to an image sensor for capturing images of the eye is determined, an image of the eye is captured by means of the image sensor, and a position of a representation of the pupil center in the eye in the obtained image is identified. An estimated pupil center position (p′) is then determined based on the estimated eyeball center position (e), the estimated radius (r), and the identified position of the representation of the pupil center in the obtained image.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: September 28, 2021
    Assignee: Tobii AB
    Inventors: Simon Johansson, Mark Ryan
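Geometrically, the combination above amounts to placing the pupil center on the sphere of radius (r) around the eyeball center (e), along the camera ray through the observed pupil position. A minimal pinhole-camera sketch, with the camera at the origin and the ray direction standing in for the identified image position:

```python
import math

def estimate_pupil_center(e, r, ray_dir):
    """Intersect a camera ray (from the origin, direction ``ray_dir``)
    with the sphere of radius ``r`` centred on the eyeball centre ``e``,
    returning the nearer intersection as the estimated pupil centre.
    A simplified sketch of the claimed combination; the actual method
    may resolve the geometry differently."""
    ex, ey, ez = e
    norm = math.sqrt(sum(d * d for d in ray_dir))
    dx, dy, dz = (d / norm for d in ray_dir)
    # Solve |t*d - e|^2 = r^2 for the ray parameter t (a quadratic).
    b = -2.0 * (dx * ex + dy * ey + dz * ez)
    c = ex * ex + ey * ey + ez * ez - r * r
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the eyeball sphere
    t = (-b - math.sqrt(disc)) / 2.0  # nearer intersection
    return (t * dx, t * dy, t * dz)
```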
  • Publication number: 20210287443
    Abstract: A system, a head-mounted device, a computer program, a carrier and a method for positioning of a virtual object in an extended reality view of at least one user are disclosed. In the method gaze points in world space and respective gaze durations for the gaze points are determined for the at least one user by means of gaze-tracking over a duration of time. Furthermore, gaze heatmap data are determined based on the determined gaze points and respective gaze durations, and the virtual object is positioned in the extended reality view in world space based on the determined gaze heatmap data.
    Type: Application
    Filed: June 29, 2020
    Publication date: September 16, 2021
    Applicant: Tobii AB
    Inventor: Sourabh PATERIYA
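The heatmap step above can be sketched by binning gaze durations into a world-space grid. The grid binning and the place-at-hottest-cell rule are assumptions; the publication only says positioning is "based on" the heatmap data.

```python
from collections import defaultdict

def heatmap_from_gaze(gaze_points, durations, cell=1.0):
    """Accumulate gaze duration per world-space grid cell."""
    heat = defaultdict(float)
    for (x, y, z), dur in zip(gaze_points, durations):
        key = (round(x / cell), round(y / cell), round(z / cell))
        heat[key] += dur
    return heat

def place_virtual_object(gaze_points, durations, cell=1.0):
    """Position the virtual object at the centre of the most-dwelt-on
    cell -- one plausible reading of 'based on the heatmap data'."""
    heat = heatmap_from_gaze(gaze_points, durations, cell)
    (i, j, k), _ = max(heat.items(), key=lambda kv: kv[1])
    return (i * cell, j * cell, k * cell)
```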
  • Publication number: 20210286427
    Abstract: A system, a head-mounted device, a computer program, a carrier and a method for adding a virtual object to an extended reality view based on gaze-tracking data for a user are disclosed. In the method, one or more volumes of interest in world space are defined. Furthermore, a position of the user in world space is obtained, and a gaze direction and a gaze convergence distance of the user are determined. A gaze point in world space of the user is then determined based on the determined gaze direction and gaze convergence distance of the user. On condition that the determined gaze point in world space is consistent with a volume of interest of the defined one or more volumes of interest in world space, a virtual object is added to the extended reality view.
    Type: Application
    Filed: June 29, 2020
    Publication date: September 16, 2021
    Applicant: Tobii AB
    Inventor: Sourabh PATERIYA
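The gaze-point test above reduces to a point-in-volume check. A minimal sketch, assuming a unit-length gaze direction and axis-aligned boxes as the volumes of interest (the publication does not constrain the volume shape):

```python
def gaze_point(position, gaze_dir, convergence_distance):
    """World-space gaze point: user position plus gaze direction
    scaled to the convergence distance (direction assumed unit-length)."""
    return tuple(p + convergence_distance * d
                 for p, d in zip(position, gaze_dir))

def maybe_add_object(position, gaze_dir, distance, volumes, scene):
    """Append a (hypothetical) virtual-object record to ``scene`` when
    the gaze point falls inside any axis-aligned volume of interest,
    each given as a (min_corner, max_corner) pair."""
    gp = gaze_point(position, gaze_dir, distance)
    for lo, hi in volumes:
        if all(l <= g <= h for g, l, h in zip(gp, lo, hi)):
            scene.append({"object": "virtual", "at": gp})
            return True
    return False
```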
  • Publication number: 20210278678
    Abstract: Techniques for distributed foveated rendering based on user gaze are described. In an example, an end user device is communicatively coupled with a remote computer and presents images on a display based on gaze data. The user device receives a low resolution background image and high resolution foreground image from the remote computer based on the gaze data. The foreground image is constrained to a foveated region according to the gaze data. The end user device generates a composite image by scaling up the background image and overlaying the foreground image. The composite image is then presented on the display.
    Type: Application
    Filed: July 20, 2019
    Publication date: September 9, 2021
    Applicant: Tobii AB
    Inventor: Ritchie Brannan
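The compositing step on the end user device can be sketched directly. Nearest-neighbour upscaling and a top-left anchor for the foveated region are assumptions; the publication only requires scaling up the background and overlaying the foreground.

```python
def composite_foveated(background, foreground, fovea_origin, scale=2):
    """Scale up the low-resolution background by ``scale`` using
    nearest-neighbour sampling, then overlay the high-resolution
    foreground at ``fovea_origin`` (row, col). Images are 2-D lists
    of pixel values."""
    up = [[background[r // scale][c // scale]
           for c in range(len(background[0]) * scale)]
          for r in range(len(background) * scale)]
    r0, c0 = fovea_origin
    for r, row in enumerate(foreground):
        for c, px in enumerate(row):
            up[r0 + r][c0 + c] = px  # foreground wins inside the fovea
    return up
```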
  • Publication number: 20210255462
    Abstract: Computer-generated image data is presented on first and second displays of a binocular headset presuming that a user's left and right eyes are located at first and second positions relative to the first and second displays respectively. At least one updated version of the image data is presented, which is rendered presuming that at least one of the user's left and right eyes is located at a position different from the first and second positions respectively in at least one spatial dimension. In response thereto, a user-generated feedback signal is received expressing either: a quality measure of the updated version of the computer-generated image data relative to computer-generated image data presented previously; or a confirmation command. The steps of presenting the updated version of the computer-generated image data and receiving the user-generated feedback signal are repeated until the confirmation command is received.
    Type: Application
    Filed: December 21, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Geoffrey Cooper, Rickard Lundahl, Erik Lindén, Maria Gordon
  • Publication number: 20210255698
    Abstract: A system, a head-mounted device, a computer program, a carrier, and a method for a head-mounted device comprising an eye tracking sensor, for updating an eye tracking model in relation to an eye are disclosed. First sensor data in relation to the eye are obtained by means of the eye tracking sensor. After obtaining the first sensor data, the eye tracking sensor is moved in relation to the eye. After moving the eye tracking sensor, second sensor data in relation to the eye are obtained by means of the eye tracking sensor. The eye tracking model in relation to the eye is then updated based on the first sensor data and the second sensor data.
    Type: Application
    Filed: September 30, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Pravin Kumar Rana, Gerald Bianchi
  • Publication number: 20210256980
    Abstract: Method for voice-based interactive communication using a digital assistant, wherein the method comprises: an attention detection step, in which the digital assistant detects a user attention and as a result is set into a listening mode; a speaker detection step, in which the digital assistant detects the user as a current speaker; a speech sound detection step, in which the digital assistant detects and records speech uttered by the current speaker, which speech sound detection step further comprises a lip movement detection step, in which the digital assistant detects a lip movement of the current speaker; a speech analysis step, in which the digital assistant parses said recorded speech and extracts speech-based verbal informational content from said recorded speech; and a subsequent response step, in which the digital assistant provides feedback to the user based on said recorded speech.
    Type: Application
    Filed: December 21, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Sourabh PATERIYA, Onur Kurt, Deepak Akkil
  • Publication number: 20210256353
    Abstract: Techniques for using a deep generative model to generate synthetic data sets that can be used to boost the performance of a discriminative model are described. In an example, an autoencoding generative adversarial network (AEGAN) is trained to generate the synthetic data sets. The AEGAN includes an autoencoding network and a generative adversarial network (GAN) that share a generator. The generator learns how to generate the synthetic data sets based on a data distribution from a latent space. Upon training the AEGAN, the generator generates the synthetic data sets. In turn, the synthetic data sets are used to train a predictive model, such as a convolutional neural network for gaze prediction.
    Type: Application
    Filed: May 13, 2019
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventor: Mårten Nilsson
  • Publication number: 20210256715
    Abstract: A computer-implemented method of selecting a sequence of images of a user's eye for an eye tracking application wherein each image is captured when a stationary stimulus point is displayed to the user, the method comprising: for a plurality of different pairs of the images: comparing the pair of images with each other to determine an image-score that represents a degree of difference between the compared images; and calculating a sequence-score based on the image-scores for the plurality of pairs of images.
    Type: Application
    Filed: September 30, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Mark Ryan, Oscar Lundqvist, Oscar Nyman
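The scoring scheme above can be sketched in a few lines. Mean absolute pixel difference as the image-score and the mean over all pairs as the sequence-score are illustrative choices; the application leaves both metrics open.

```python
from itertools import combinations

def image_score(img_a, img_b):
    """Degree of difference between two equally-sized images:
    mean absolute pixel difference (an illustrative metric)."""
    flat_a = [p for row in img_a for p in row]
    flat_b = [p for row in img_b for p in row]
    return sum(abs(a - b) for a, b in zip(flat_a, flat_b)) / len(flat_a)

def sequence_score(images):
    """Sequence-score aggregated from the pairwise image-scores;
    the mean over all pairs is one plausible aggregation."""
    scores = [image_score(a, b) for a, b in combinations(images, 2)]
    return sum(scores) / len(scores)
```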
  • Publication number: 20210255700
    Abstract: The present invention provides improved methods and systems for assisting a user when interacting with a graphical user interface by combining gaze based input with gesture based user commands. The present invention provides systems, devices and methods that enable a user of a computer system without a traditional touch-screen to interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. Furthermore, the present invention offers a solution for touch-screen like interaction using gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen, such as for instance in situations where interaction with the regular touch-screen is cumbersome or ergonomically challenging.
    Type: Application
    Filed: October 23, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Publication number: 20210258464
    Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for controlling the exposure settings of a rolling-shutter image sensor device with global reset. This is achieved by obtaining a first image captured by the image sensor device at a current exposure setting that comprises a partial readout parameter representing a number of image parts for partial readout by the image sensor device; determining an intensity value of the first image; and comparing the intensity value of the first image to a desired intensity value. If the intensity values differ by more than an allowed deviation, an updated number of image parts for partial readout is determined based on the current number of image parts and the intensity value of the first image. Thereafter, the current exposure setting is updated by setting the value of the partial readout parameter to the updated number of image parts.
    Type: Application
    Filed: December 21, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventors: Viktor Åberg, Niklas Ollesson, Anna Redz, Magnus Ivarsson
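One control step of the loop above might look as follows. The proportional update rule (scale the part count by target/measured intensity) and the clamping bounds are assumptions; the publication only states that the new count is based on the current count and the measured intensity.

```python
def update_partial_readout(current_parts, intensity, target,
                           allowed_deviation=10.0,
                           min_parts=1, max_parts=64):
    """One step of the exposure-control loop: if the measured image
    intensity deviates from the target by more than the allowed
    deviation, rescale the number of image parts for partial readout
    (assuming more parts brightens the image); otherwise keep it."""
    if abs(intensity - target) <= allowed_deviation:
        return current_parts  # within tolerance: keep the setting
    updated = round(current_parts * target / max(intensity, 1e-9))
    return max(min_parts, min(max_parts, updated))
```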
  • Publication number: 20210255699
    Abstract: An eyetracker obtains input signal components (SCR, SP) describing a respective position of each of at least one glint in a subject's eye and a position of a pupil of said eye. Based on the input signal components (SCR, SP), the eyetracker determines if a saccade is in progress, i.e. if the gaze point of the subject's eye moves rapidly from a first point (GP1) to a second point (GP2) where the gaze point is fixed. During the saccade, the eyetracker generates a tracking signal describing the gaze point of the eye based on a subset (SCR) of the input signal components, which subset (SCR) describes a cornea reference point for a subject's eye (E). After the saccade, however, the tracking signal is preferably again based on all the input signal components (SCR, SP).
    Type: Application
    Filed: September 30, 2020
    Publication date: August 19, 2021
    Applicant: Tobii AB
    Inventor: Richard Andersson
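The signal-selection logic above can be sketched as follows. The speed-threshold saccade detector and the averaging of the two components outside saccades are illustrative stand-ins; the application does not fix either rule.

```python
def gaze_signal(cornea_ref, pupil_pos, in_saccade):
    """Tracking signal from the input components: during a saccade use
    only the cornea-reference subset (S_CR); otherwise combine it with
    the pupil component (S_P). Averaging is an illustrative rule."""
    if in_saccade:
        return cornea_ref
    return tuple((c + p) / 2 for c, p in zip(cornea_ref, pupil_pos))

def saccade_in_progress(prev_gaze, gaze, dt, speed_threshold=300.0):
    """Flag a saccade when gaze-point speed (units/s) exceeds a
    threshold -- a common, simple detector; the publication does not
    specify one."""
    dist = sum((a - b) ** 2 for a, b in zip(prev_gaze, gaze)) ** 0.5
    return dist / dt > speed_threshold
```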
  • Patent number: 11089254
    Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye element is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
    Type: Grant
    Filed: February 20, 2020
    Date of Patent: August 10, 2021
    Assignee: Tobii AB
    Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
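The ROI definition and confined readout can be sketched as below. A square ROI of fixed half-size, clamped to the active area, is an assumption; on real hardware the readout itself is what saves time and power, which is simulated here on a frame held in memory.

```python
def roi_around(position, half_size, sensor_width, sensor_height):
    """Axis-aligned ROI around the detected eye-element position,
    clamped to the active sensor area; returns (x0, y0, x1, y1)."""
    x, y = position
    x0 = max(0, x - half_size)
    y0 = max(0, y - half_size)
    x1 = min(sensor_width, x + half_size)
    y1 = min(sensor_height, y + half_size)
    return x0, y0, x1, y1

def read_out_roi(frame, roi):
    """Read out only the pixels confined to the ROI, producing the
    smaller ROI image."""
    x0, y0, x1, y1 = roi
    return [row[x0:x1] for row in frame[y0:y1]]
```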
  • Patent number: 11073908
    Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the first input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
    Type: Grant
    Filed: February 20, 2020
    Date of Patent: July 27, 2021
    Assignee: Tobii AB
    Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
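The gaze-modified mapping above can be pictured as a lookup that lets object-specific bindings override a default. The dictionary-of-bindings structure and the names used are illustrative only.

```python
def resolve_action(input_event, gaze_target, bindings):
    """Map an input event to an action, modified by gaze: when the
    user's gaze rests on a virtual object, an object-specific binding
    overrides the default mapping, so the same input becomes an
    interaction with the gazed-at object. ``bindings`` maps
    (gaze_target_or_None, input_event) -> action name."""
    if gaze_target is not None and (gaze_target, input_event) in bindings:
        return bindings[(gaze_target, input_event)]
    return bindings.get((None, input_event), "no_op")
```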
  • Patent number: 11061471
    Abstract: The present invention relates to a method for establishing the position of an object in relation to a camera in order to enable gaze tracking with a user watching the object, where the user is in view of the camera. The method comprises the steps of showing a known pattern, consisting of a set of stimulus points (s1, s2, . . . , sN), on the object, detecting gaze rays (g1, g2, . . . , gN) from an eye of the user as the user looks at the stimulus points (s1, s2, . . . , sN), and finding, by means of an optimizer, a position and orientation of the object in relation to the camera such that the gaze rays (g1, g2, . . . , gN) approach the stimulus points (s1, s2, . . . , sN).
    Type: Grant
    Filed: December 11, 2019
    Date of Patent: July 13, 2021
    Assignee: Tobii AB
    Inventor: Erik Lindén
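The optimization step above can be sketched in miniature. For brevity this solves for translation only, with a derivative-free pattern search minimizing the squared distances from the (translated) stimulus points to the gaze rays; the patented method also recovers orientation and does not specify the optimizer.

```python
def point_ray_dist2(p, origin, direction):
    """Squared distance from point p to the line (origin, unit direction)."""
    v = [pi - oi for pi, oi in zip(p, origin)]
    t = sum(vi * di for vi, di in zip(v, direction))
    closest = [oi + t * di for oi, di in zip(origin, direction)]
    return sum((pi - ci) ** 2 for pi, ci in zip(p, closest))

def fit_translation(stimulus_points, rays, steps=60):
    """Pattern search for the object translation that brings the gaze
    rays closest to the stimulus points. ``rays`` is a list of
    (origin, unit_direction) pairs. Translation-only sketch."""
    def cost(t):
        return sum(point_ray_dist2([s + dt for s, dt in zip(sp, t)], o, d)
                   for sp, (o, d) in zip(stimulus_points, rays))
    t, step = [0.0, 0.0, 0.0], 1.0
    for _ in range(steps):
        improved = False
        for axis in range(3):
            for delta in (step, -step):
                cand = list(t)
                cand[axis] += delta
                if cost(cand) < cost(t):
                    t, improved = cand, True
        if not improved:
            step /= 2.0  # shrink the search step once stuck
    return t
```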