Patents Assigned to TOBII AB
  • Patent number: 10591990
    Abstract: A gaze tracking system, upon leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, for the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at a lower rate. The first gaze point value is computed memorylessly based on the initial burst and no additional imagery, while subsequent values may be computed recursively to account for previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using a different sensor. From the gaze point values, the system may derive a control signal to a computer device with a visual display.
    Type: Grant
    Filed: December 26, 2018
    Date of Patent: March 17, 2020
    Assignee: Tobii AB
    Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
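The two-phase estimation this abstract describes can be sketched in Python. This is a hypothetical illustration, not Tobii's implementation: the function names and the smoothing factor `alpha` are assumptions, and the recursive step is modeled here as simple exponential smoothing.

```python
def first_gaze_point(burst):
    """Memoryless estimate: the mean of the initial burst of gaze measurements."""
    xs, ys = zip(*burst)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def update_gaze_point(prev, measurement, alpha=0.3):
    """Recursive estimate blending the new measurement with the previous value."""
    return (alpha * measurement[0] + (1 - alpha) * prev[0],
            alpha * measurement[1] + (1 - alpha) * prev[1])
```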
  • Patent number: 10594974
    Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
    Type: Grant
    Filed: May 10, 2018
    Date of Patent: March 17, 2020
    Assignee: Tobii AB
    Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
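The ROI readout idea above can be sketched as follows; a minimal Python model (frames as nested lists, a square ROI clamped to the sensor area), with names that are assumptions rather than anything from the patent:

```python
def define_roi(center, half_size, width, height):
    """Clamp a square region of interest around the detected eye position
    to the sensor's active area."""
    cx, cy = center
    x0 = max(0, cx - half_size)
    x1 = min(width, cx + half_size)
    y0 = max(0, cy - half_size)
    y1 = min(height, cy + half_size)
    return (x0, y0, x1, y1)

def read_out_roi(frame, roi):
    """Read out only the pixels inside the ROI (each row is a list of values)."""
    x0, y0, x1, y1 = roi
    return [row[x0:x1] for row in frame[y0:y1]]
```

Reading out only the ROI rows and columns is what reduces both readout time and power relative to a full-frame readout.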
  • Patent number: 10585277
    Abstract: According to the invention, a system for tracking a gaze of a user across a multi-display arrangement is disclosed. The system may include a first display, a first eye tracking device, a second display, a second eye tracking device, and a processor. The first eye tracking device may be configured to determine a user's gaze direction while the user is gazing at the first display. The second eye tracking device may be configured to determine the user's gaze direction while the user is gazing at the second display. The processor may be configured to determine that the user's gaze has moved away from the first display in a direction of the second display, and in response to determining that the user's gaze has moved away from the first display in the direction of the second display, deactivate the first eye tracking device, and activate the second eye tracking device.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: March 10, 2020
    Assignee: Tobii AB
    Inventors: Farshid Bagherpour, Mårten Skogö
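The activate/deactivate handover between the two eye tracking devices can be modeled as a tiny state machine; a hypothetical sketch (class and method names are assumptions), where each display name stands in for its tracker:

```python
class MultiDisplayTracker:
    """Toy model: only the tracker for the display the user's gaze moves
    toward is active at any time."""
    def __init__(self, initial="display1"):
        self.active = initial

    def on_gaze_leaves(self, display, toward):
        # Gaze moved away from `display` in the direction of `toward`:
        # deactivate the current tracker and activate the other one.
        if display == self.active:
            self.active = toward
        return self.active
```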
  • Publication number: 20200076998
    Abstract: A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
    Type: Application
    Filed: June 19, 2019
    Publication date: March 5, 2020
    Applicant: Tobii AB
    Inventors: Simon Gustafsson, Anders Kingbäck, Markus Cederlund

  • Patent number: 10579142
    Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
    Type: Grant
    Filed: January 24, 2019
    Date of Patent: March 3, 2020
    Assignee: Tobii AB
    Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennstrom, Andreas Edling
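The bitmap lookup described in this abstract can be sketched directly: rasterize each zone's projected footprint into a bitmap whose pixels store the zone's property, then test the gaze pixel. A hypothetical Python illustration (rectangular zones and these function names are assumptions; the patent covers arbitrary projected shapes):

```python
def build_zone_bitmap(width, height, zones):
    """Rasterize each zone's rectangle (x0, y0, x1, y1) into a bitmap whose
    pixels store that zone's property; unclaimed pixels stay None."""
    bitmap = [[None] * width for _ in range(height)]
    for prop, (x0, y0, x1, y1) in zones.items():
        for y in range(y0, y1):
            for x in range(x0, x1):
                bitmap[y][x] = prop
    return bitmap

def zone_at_gaze(bitmap, gaze):
    """Constant-time test: which zone property, if any, the gaze pixel hits."""
    x, y = gaze
    return bitmap[y][x]
```

The appeal of this scheme is that gaze testing becomes a single array lookup, regardless of how complex the 3D scene is.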
  • Patent number: 10572008
    Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames therefrom, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
    Type: Grant
    Filed: December 20, 2017
    Date of Patent: February 25, 2020
    Assignee: Tobii AB
    Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
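The repeating mixed-resolution sequence can be sketched as cycling through one period of resolutions; a hypothetical illustration (the particular resolutions and function name are assumptions):

```python
import itertools

def repeating_frame_sequence(period, n_frames):
    """Model a sensor whose output repeats one period of (width, height)
    resolutions, e.g. one low-resolution frame then one high-resolution frame."""
    cycle = itertools.cycle(period)
    return [next(cycle) for _ in range(n_frames)]
```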
  • Patent number: 10565446
    Abstract: A system for determining a gaze direction of a user of a wearable device is disclosed. The system may include a primary lens, an illuminator, an image sensor, and an interface. The illuminator may include a light guide, may be disposed at least partially on or in the primary lens, and may be configured to illuminate at least one eye of a user. The image sensor may be disposed on or in the primary lens, and may be configured to detect light reflected by the at least one eye of the user. The interface may be configured to provide data from the image sensor to a processor for determining a gaze direction of the user based at least in part on light detected by the image sensor.
    Type: Grant
    Filed: July 17, 2017
    Date of Patent: February 18, 2020
    Assignee: Tobii AB
    Inventors: Simon Gustafsson, Anders Kingbäck, Peter Blixt, Richard Hainzl, Mårten Skogö
  • Publication number: 20200050267
    Abstract: The present invention relates to control of a computer system, which includes a data processing unit, a display and an eye tracker adapted to register a user's gaze point with respect to the display. The data processing unit is adapted to present graphical information on the display, which includes feedback data reflecting the user's commands entered into the unit. The data processing unit is adapted to present the feedback data such that during an initial phase, the feedback data is generated based on an absolute position of the gaze point. An imaging device of the system is also adapted to register image data representing movements of a body part of the user and to forward a representation of the image data to the data processing unit. Hence, during a phase subsequent to the initial phase, the data is instead generated based on the image data.
    Type: Application
    Filed: October 22, 2019
    Publication date: February 13, 2020
    Applicant: Tobii AB
    Inventors: John Elvesjö, Anders Olsson, Johan Sahlén
  • Patent number: 10558262
    Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change.
    Type: Grant
    Filed: January 20, 2015
    Date of Patent: February 11, 2020
    Assignee: Tobii AB
    Inventor: Rebecka Lannsjö
  • Patent number: 10558895
    Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. A scaled image is generated from 2D image showing a user face based on a rough distance between the user eyes and a camera that generated the 2D image. Image crops at different resolutions are generated from the scaled image and include a crop around each of the user eyes and a crop around the user face. These crops are input to the neural network. In response, the neural network outputs a distance correction and a 2D gaze vector per user eye. A corrected eye-to-camera distance is generated by correcting the rough distance based on the distance correction. A 3D gaze vector for each of the user eyes is generated based on the corresponding 2D gaze vector and the corrected distance.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: February 11, 2020
    Assignee: Tobii AB
    Inventor: Erik Linden
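The last two steps of the pipeline above (distance correction, then lifting a 2D gaze vector to 3D) can be sketched as follows. This is a hypothetical illustration: interpreting the 2D gaze vector as yaw/pitch angles in radians is an assumption, as are the function names; the neural network itself is not modeled.

```python
import math

def corrected_distance(rough_distance, correction):
    """Apply the network-predicted correction to the rough eye-to-camera distance."""
    return rough_distance + correction

def gaze_direction_3d(yaw, pitch):
    """Lift a 2D gaze vector (assumed here to be yaw and pitch angles, in
    radians) to a unit 3D direction vector."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```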
  • Patent number: 10551915
    Abstract: According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user.
    Type: Grant
    Filed: August 22, 2018
    Date of Patent: February 4, 2020
    Assignee: Tobii AB
    Inventors: Per Ola Kristensson, Keith Vertanen, Morten Mjelde
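A decoder mapping an observation sequence to a word can be sketched with a deliberately crude stand-in scoring rule; the patent's actual decoder is not specified here, so this hypothetical Python example scores each vocabulary word by how many observed letters it matches in order:

```python
def subsequence_score(observations, word):
    """Count how many observed letters appear, in order, within the word."""
    i = 0
    for ch in word:
        if i < len(observations) and observations[i] == ch:
            i += 1
    return i

def decode_word(observations, vocabulary):
    """Return the vocabulary word best matching the observation sequence."""
    return max(vocabulary, key=lambda w: subsequence_score(observations, w))
```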
  • Patent number: 10545574
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
    Type: Grant
    Filed: March 3, 2017
    Date of Patent: January 28, 2020
    Assignee: Tobii AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
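The abort window for the visual indicator can be reduced to a simple timing rule; a hypothetical sketch (the function name and 0.2 s threshold are assumptions):

```python
def visual_indicator_action(hold_time, delay=0.2):
    """Move the indicator to the gaze target only if the action (e.g. a
    touchpad touch) is held for the full delay; an earlier release aborts."""
    return "move_to_gaze_target" if hold_time >= delay else "abort"
```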
  • Publication number: 20200026068
    Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining that the eye of the user is closed upon the third sum exceeding a fourth sum equal to the first sum plus a threshold amount, where the threshold amount is equal to the product of a threshold fraction and the difference between the first sum and the second sum.
    Type: Application
    Filed: June 24, 2019
    Publication date: January 23, 2020
    Applicant: Tobii AB
    Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
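The threshold test in this abstract translates almost directly into code. A literal sketch of the stated comparison (the function name and default fraction are assumptions; the abstract does not say which fraction is used):

```python
def eye_is_closed(open_sum, closed_sum, current_sum, fraction=0.5):
    """Literal reading of the abstract's test: the eye is judged closed when
    the current pixel-intensity sum exceeds the open-eye reference sum plus
    fraction * (open_sum - closed_sum)."""
    threshold = fraction * (open_sum - closed_sum)
    return current_sum > open_sum + threshold
```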
  • Patent number: 10540008
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. A solution for touchscreen-like interaction uses gaze input and gesture-based input as a complement or an alternative to touchscreen interactions with a computer device having a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can be used to achieve a touchscreen-like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or with a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Grant
    Filed: March 31, 2016
    Date of Patent: January 21, 2020
    Assignee: Tobii AB
    Inventors: Andreas Klingström, Mårten Skogö, Richard Hainzl, John Elvesjö
  • Publication number: 20200019239
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: June 25, 2019
    Publication date: January 16, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10534526
    Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, and detecting an indicator that the user begins reading the content.
    Type: Grant
    Filed: December 21, 2017
    Date of Patent: January 14, 2020
    Assignee: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö
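The scroll-zone lookup above can be sketched as a point-in-rectangle test over the applicable zones; a hypothetical Python illustration (zone representation and names are assumptions):

```python
def scroll_action(gaze_point, zones):
    """Return the scroll action of the first zone containing the gaze point,
    or None when the gaze falls outside every scroll zone.
    Each zone is ((x0, y0, x1, y1), action)."""
    gx, gy = gaze_point
    for (x0, y0, x1, y1), action in zones:
        if x0 <= gx < x1 and y0 <= gy < y1:
            return action
    return None
```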
  • Patent number: 10534982
    Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images. During the training, calibration parameters are initialized and input to the neural network, and are updated through the training. Accordingly, the network parameters of the neural network are updated based in part on the calibration parameters. Upon completion of the training, the neural network is calibrated for a user. This calibration includes initializing and inputting the calibration parameters along with calibration images showing an eye of the user to the neural network. The calibration includes updating the calibration parameters without changing the network parameters by minimizing the loss function of the neural network based on the calibration images. Upon completion of the calibration, the neural network is used to generate 3D gaze information for the user.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: January 14, 2020
    Assignee: Tobii AB
    Inventor: Erik Linden
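The key idea above, updating only calibration parameters while the network parameters stay frozen, can be shown with a toy model in place of the neural network. This is a hypothetical sketch under a strong simplification: the "network" is a single frozen weight `w`, the calibration parameter is an offset `c`, and the loss is squared error.

```python
def calibrate_user(w, c, samples, lr=0.05, steps=200):
    """Minimize squared error over calibration samples by gradient steps on
    the calibration parameter c only; the network parameter w is frozen."""
    for _ in range(steps):
        for x, target in samples:
            pred = w * x + c
            c -= lr * 2 * (pred - target)  # gradient of (pred - target)**2 w.r.t. c
    return c
```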
  • Patent number: 10528131
    Abstract: A system and techniques for calibrating an eye tracking system are described. The system can update the calibration of personal calibration parameters continuously based on a user's gaze on a user interface, following user interface stimulus events. The system improves continuous calibration techniques by determining an association between the user's eye sequences and the stimulus events, and updates the personal calibration parameters accordingly. A record indicative of a user gaze, including eye sequences, such as eye movements or eye fixations, is maintained over a time period. A user interface stimulus event associated with the user interface and occurring within the time period is detected. An association is determined between the eye sequence and the user interface stimulus event. An interaction observation that includes the eye sequence and a location of the stimulus event is generated. Personal calibration parameters are updated based on the interaction observation.
    Type: Grant
    Filed: May 16, 2018
    Date of Patent: January 7, 2020
    Assignee: Tobii AB
    Inventors: Alexander Davies, Maria Gordon, Per-Edvin Stoltz
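The association step between eye sequences and stimulus events can be sketched with a crude time-window rule; a hypothetical illustration (the 0.5 s latency window and the pairing rule are assumptions, the patent's association logic is more general):

```python
def associate(eye_fixations, stimulus_events, max_latency=0.5):
    """Pair each stimulus event with the first fixation starting within
    max_latency seconds after it, yielding interaction observations of
    (fixation_location, stimulus_location)."""
    observations = []
    for s_time, s_loc in stimulus_events:
        for f_time, f_loc in eye_fixations:
            if 0 <= f_time - s_time <= max_latency:
                observations.append((f_loc, s_loc))
                break
    return observations
```

Each observation pairs where the user actually looked with where the stimulus appeared, which is exactly the data needed to refine personal calibration parameters.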
  • Publication number: 20200004331
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
    Type: Application
    Filed: September 9, 2019
    Publication date: January 2, 2020
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Mårten Skogö
  • Publication number: 20190384941
    Abstract: Computer display privacy and security for computer systems. In one aspect, the invention provides a computer-controlled system for regulating the interaction between a computer and a user of the computer based on the environment of the computer and the user. For example, the computer-controlled system provided by the invention comprises an input-output device including an image sensor configured to collect facial recognition data proximate to the computer. The system also includes a user security parameter database encoding security parameters associated with the user; the database is also configured to communicate with the security processor. The security processor is configured to receive the facial recognition data and the security parameters associated with the user, and is further configured to at least partially control the operation of the data input device and the data output device in response to the facial recognition data and the security parameters associated with the user.
    Type: Application
    Filed: March 19, 2019
    Publication date: December 19, 2019
    Applicant: Tobii AB
    Inventors: William R. Anderson, Steven E. Turner, Steven Pujia