Patents Assigned to TOBII AB
-
Patent number: 10591990
Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, for the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at a lower rate. The first gaze point value is computed memorylessly based on the initial burst results and no additional imagery, while subsequent values may be computed recursively to take into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using a different sensor. From the gaze point values, the system may derive a control signal to a computer device with a visual display.
Type: Grant
Filed: December 26, 2018
Date of Patent: March 17, 2020
Assignee: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
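The two-stage estimation the abstract describes, a memoryless first value from the burst followed by recursive updates, can be sketched roughly as follows; the simple averaging, the smoothing factor, and all names are illustrative assumptions, not details from the patent.

```python
def first_gaze_point(burst):
    # Memoryless estimate: average the (x, y) gaze measurements from the
    # initial high-frame-rate burst; no prior state is used.
    n = len(burst)
    return (sum(x for x, _ in burst) / n, sum(y for _, y in burst) / n)

def update_gaze_point(prev, measurement, alpha=0.3):
    # Recursive estimate: blend the new lower-rate measurement with the
    # previous gaze point value (exponential smoothing).
    px, py = prev
    mx, my = measurement
    return (px + alpha * (mx - px), py + alpha * (my - py))
```

With a larger `alpha` the estimate tracks new measurements more aggressively; with a smaller one it leans on the previous gaze point values.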
-
Patent number: 10594974
Abstract: Techniques for reducing the read-out time and power consumption of an image sensor used for eye tracking are described. In an example, the position of an eye element in the active area of the sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye element is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Grant
Filed: May 10, 2018
Date of Patent: March 17, 2020
Assignee: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
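The ROI read-out could be sketched as follows; the clamping strategy and all names are assumptions for illustration, and plain nested lists stand in for sensor pixel data.

```python
def define_roi(cx, cy, half_w, half_h, sensor_w, sensor_h):
    # Place a rectangular region of interest around the detected eye
    # element, clamped so it stays inside the sensor's active area.
    x0 = max(0, min(cx - half_w, sensor_w - 2 * half_w))
    y0 = max(0, min(cy - half_h, sensor_h - 2 * half_h))
    return (x0, y0, 2 * half_w, 2 * half_h)

def read_out_roi(frame, roi):
    # Read out only the pixels confined to the ROI, producing the
    # (much smaller) ROI image.
    x0, y0, w, h = roi
    return [row[x0:x0 + w] for row in frame[y0:y0 + h]]
```

Reading out only the ROI rows and columns is what saves time and power relative to a full-frame read-out.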
-
Patent number: 10585277
Abstract: According to the invention, a system for tracking a gaze of a user across a multi-display arrangement is disclosed. The system may include a first display, a first eye tracking device, a second display, a second eye tracking device, and a processor. The first eye tracking device may be configured to determine a user's gaze direction while the user is gazing at the first display. The second eye tracking device may be configured to determine the user's gaze direction while the user is gazing at the second display. The processor may be configured to determine that the user's gaze has moved away from the first display in a direction of the second display, and in response to determining that the user's gaze has moved away from the first display in the direction of the second display, deactivate the first eye tracking device, and activate the second eye tracking device.
Type: Grant
Filed: August 31, 2017
Date of Patent: March 10, 2020
Assignee: Tobii AB
Inventors: Farshid Bagherpour, Mårten Skogö
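The deactivate/activate hand-off between the two eye tracking devices might look like this minimal sketch; the class and method names are hypothetical.

```python
class TrackerSwitcher:
    # Minimal sketch: keep only the eye tracking device for the display
    # the user is looking toward active at any one time.
    def __init__(self, display_ids):
        # Start with only the first display's tracker active.
        self.active = {d: False for d in display_ids}
        self.active[display_ids[0]] = True

    def gaze_moved(self, from_display, to_display):
        # The gaze has moved away from one display in the direction of
        # another: deactivate the first tracker, activate the second.
        self.active[from_display] = False
        self.active[to_display] = True
```

Keeping only one tracker powered at a time is presumably what makes the arrangement cheaper to run than tracking on every display simultaneously.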
-
Publication number: 20200076998
Abstract: A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
Type: Application
Filed: June 19, 2019
Publication date: March 5, 2020
Applicant: Tobii AB
Inventors: Simon Gustafsson, Anders Kingbäck, Markus Cederlund
-
Patent number: 10579142
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; and using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Grant
Filed: January 24, 2019
Date of Patent: March 3, 2020
Assignee: Tobii AB
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennstrom, Andreas Edling
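The bitmap lookup can be sketched as follows, with axis-aligned rectangles standing in for the projected zones of interest and the zone id standing in for the stored property (both simplifying assumptions); the point of the technique is that the gaze test reduces to a single pixel lookup.

```python
def build_zone_bitmap(width, height, zones):
    # Rasterize each zone into the bitmap: every pixel to which the
    # zone projects stores that zone's property (here, its id).
    bitmap = [[None] * width for _ in range(height)]
    for zone_id, (x0, y0, x1, y1) in zones.items():
        for y in range(y0, y1):
            for x in range(x0, x1):
                bitmap[y][x] = zone_id
    return bitmap

def zone_at_gaze(bitmap, gaze_x, gaze_y):
    # One lookup decides whether the detected gaze direction hits a
    # zone of interest in the projected view.
    return bitmap[gaze_y][gaze_x]
```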
-
Patent number: 10572008
Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames therefrom, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
Type: Grant
Filed: December 20, 2017
Date of Patent: February 25, 2020
Assignee: Tobii AB
Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
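The repeating mixed-resolution sequence could be scheduled as in this sketch; the particular two-low-one-high pattern is an assumption, since the abstract only requires at least one frame of each resolution per period.

```python
def frame_schedule(n, period=("low", "low", "high")):
    # Repeat the per-period pattern to produce the sequence of frame
    # resolutions the data processing unit receives.  The pattern is
    # illustrative; any period mixing both resolutions would do.
    return [period[i % len(period)] for i in range(n)]
```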
-
Patent number: 10565446
Abstract: A system for determining a gaze direction of a user of a wearable device is disclosed. The system may include a primary lens, an illuminator, an image sensor, and an interface. The illuminator may include a light guide, may be disposed at least partially on or in the primary lens, and may be configured to illuminate at least one eye of a user. The image sensor may be disposed on or in the primary lens, and may be configured to detect light reflected by the at least one eye of the user. The interface may be configured to provide data from the image sensor to a processor for determining a gaze direction of the user based at least in part on light detected by the image sensor.
Type: Grant
Filed: July 17, 2017
Date of Patent: February 18, 2020
Assignee: Tobii AB
Inventors: Simon Gustafsson, Anders Kingbäck, Peter Blixt, Richard Hainzl, Mårten Skogö
-
Publication number: 20200050267
Abstract: The present invention relates to control of a computer system, which includes a data processing unit, a display and an eye tracker adapted to register a user's gaze point with respect to the display. The data processing unit is adapted to present graphical information on the display, which includes feedback data reflecting the user's commands entered into the unit. The data processing unit is adapted to present the feedback data such that during an initial phase, the feedback data is generated based on an absolute position of the gaze point. An imaging device of the system is also adapted to register image data representing movements of a body part of the user and to forward a representation of the image data to the data processing unit. Hence, during a phase subsequent to the initial phase, the feedback data is instead generated based on the image data.
Type: Application
Filed: October 22, 2019
Publication date: February 13, 2020
Applicant: Tobii AB
Inventors: John Elvesjö, Anders Olsson, Johan Sahlén
-
Patent number: 10558262
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change.
Type: Grant
Filed: January 20, 2015
Date of Patent: February 11, 2020
Assignee: Tobii AB
Inventor: Rebecka Lannsjö
-
Patent number: 10558895
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. A scaled image is generated from a 2D image showing a user's face, based on a rough distance between the user's eyes and the camera that generated the 2D image. Image crops at different resolutions are generated from the scaled image and include a crop around each of the user's eyes and a crop around the user's face. These crops are input to the neural network. In response, the neural network outputs a distance correction and a 2D gaze vector per eye. A corrected eye-to-camera distance is generated by correcting the rough distance based on the distance correction. A 3D gaze vector for each of the user's eyes is generated based on the corresponding 2D gaze vector and the corrected distance.
Type: Grant
Filed: March 30, 2018
Date of Patent: February 11, 2020
Assignee: Tobii AB
Inventor: Erik Linden
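One plausible reading of the final step, lifting a per-eye 2D gaze vector plus the corrected distance into 3D, is sketched below; the yaw/pitch interpretation of the 2D vector and the additive distance correction are assumptions, not details from the patent.

```python
import math

def corrected_eye_distance(rough, correction):
    # Refine the rough eye-to-camera distance with the network's
    # predicted correction; an additive correction is assumed here.
    return rough + correction

def gaze_2d_to_3d(yaw, pitch):
    # Interpret the per-eye 2D gaze vector as yaw/pitch angles in
    # radians and lift it to a unit-length 3D direction (the angle
    # convention is an assumption).
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```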
-
Patent number: 10551915
Abstract: According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user.
Type: Grant
Filed: August 22, 2018
Date of Patent: February 4, 2020
Assignee: Tobii AB
Inventors: Per Ola Kristensson, Keith Vertanen, Morten Mjelde
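A toy stand-in for the decoder module is sketched below: it maps an observed letter sequence to the closest vocabulary word by edit distance, whereas the patented decoder is presumably statistical; all names and the vocabulary are hypothetical.

```python
def edit_distance(a, b):
    # Classic single-row Levenshtein distance.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,       # deletion
                                     dp[j - 1] + 1,   # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

def decode(observed, vocabulary):
    # Estimate the intended word as the vocabulary entry closest to
    # the letters observed along the gaze path.
    return min(vocabulary, key=lambda w: edit_distance(observed, w))
```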
-
Patent number: 10545574
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to "abort" movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Grant
Filed: March 3, 2017
Date of Patent: January 28, 2020
Assignee: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
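The delayed move-or-abort behavior can be sketched as follows; the 0.2-second threshold and all names are assumptions for illustration.

```python
def indicator_action(touch_duration, aborted, threshold=0.2):
    # The indicator jumps to the gaze target only after the action
    # (e.g., a touchpad touch) has persisted for the threshold period,
    # giving the user a window in which to abort the move.
    if aborted or touch_duration < threshold:
        return "stay"
    return "move_to_gaze_target"
```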
-
Publication number: 20200026068
Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining, upon the third sum exceeding a fourth sum equal to the first sum plus a threshold amount, that the eye of the user is closed, where the threshold amount is equal to a product of a threshold fraction and a difference between the first sum and the second sum.
Type: Application
Filed: June 24, 2019
Publication date: January 23, 2020
Applicant: Tobii AB
Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
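The closed-eye test spelled out in the abstract translates almost directly into code; the reference sums and the threshold fraction used below are placeholder values, and the comparison follows the abstract's wording literally.

```python
def eye_is_closed(first_sum, second_sum, third_sum, threshold_fraction=0.5):
    # Per the abstract: first_sum is the pixel-intensity sum taken while
    # the eye is open, second_sum while it is closed, and third_sum is
    # the current measurement.  The eye is judged closed when third_sum
    # exceeds first_sum plus threshold_fraction * (first_sum - second_sum).
    threshold = threshold_fraction * (first_sum - second_sum)
    return third_sum > first_sum + threshold
```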
-
Patent number: 10540008
Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen-like manner using a combination of gaze-based input and gesture-based user commands. Gaze input and gesture-based input can also serve as a complement or an alternative to touch-screen interactions on a computer device that has a touch-screen. Combined gaze- and gesture-based interaction with graphical user interfaces can thus provide a touch-screen-like environment in computer systems without a traditional touch-screen, in computer systems having a touch-screen arranged ergonomically unfavorably for the user, or where it is more comfortable for the user to use gesture and gaze for the interaction than the touch-screen.
Type: Grant
Filed: March 31, 2016
Date of Patent: January 21, 2020
Assignee: Tobii AB
Inventors: Andreas Klingström, Mårten Skogö, Richard Hainzl, John Elvesjö
-
Publication number: 20200019239
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: June 25, 2019
Publication date: January 16, 2020
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10534526
Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of the following occurs: a defined period expires, a portion of the content scrolls past a defined position within the window, the gaze point moves outside of the first scroll zone, or an indicator is detected that the user has begun reading the content.
Type: Grant
Filed: December 21, 2017
Date of Patent: January 14, 2020
Assignee: Tobii AB
Inventors: Anders Olsson, Mårten Skogö
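The scroll-zone dispatch can be sketched as follows, with rectangular zones and string-valued actions as simplifying assumptions.

```python
def scroll_action(gaze, zones):
    # Return the scroll action of the first zone containing the gaze
    # point, or None when the gaze is outside every scroll zone (in
    # which case no scrolling is initiated).
    gx, gy = gaze
    for (x0, y0, x1, y1), action in zones:
        if x0 <= gx < x1 and y0 <= gy < y1:
            return action
    return None
```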
-
Patent number: 10534982
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images. During the training, calibration parameters are initialized and input to the neural network, and are updated through the training. Accordingly, the network parameters of the neural network are updated based in part on the calibration parameters. Upon completion of the training, the neural network is calibrated for a user. This calibration includes initializing and inputting the calibration parameters along with calibration images showing an eye of the user to the neural network. The calibration includes updating the calibration parameters without changing the network parameters by minimizing the loss function of the neural network based on the calibration images. Upon completion of the calibration, the neural network is used to generate 3D gaze information for the user.
Type: Grant
Filed: March 30, 2018
Date of Patent: January 14, 2020
Assignee: Tobii AB
Inventor: Erik Linden
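The calibration step, updating only the calibration parameters while the network parameters stay frozen, can be illustrated with a toy one-parameter model; the fixed weight `w` plays the role of the frozen network and the scalar offset plays the role of the per-user calibration parameter (both are illustrative stand-ins, not the patent's architecture).

```python
def calibrate_offset(samples, w, calib=0.0, lr=0.05, steps=500):
    # Gradient descent on the squared loss over the calibration samples.
    # Only `calib` is updated; `w` (the "frozen network") never changes.
    for _ in range(steps):
        grad = sum(2 * (w * x + calib - y) for x, y in samples) / len(samples)
        calib -= lr * grad
    return calib
```

With samples generated as y = x + 3 and the frozen weight w = 1.0, the offset converges to 3.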
-
Patent number: 10528131
Abstract: A system and techniques for calibrating an eye tracking system are described. The system can continuously update personal calibration parameters based on a user's gaze on a user interface, following user interface stimulus events. The system improves continuous calibration techniques by determining an association between the user's eye sequences and the stimulus events, and updates the personal calibration parameters accordingly. A record indicative of a user gaze, including eye sequences such as eye movements or eye fixations, is maintained over a time period. A user interface stimulus event associated with the user interface and occurring within the time period is detected. An association is determined between the eye sequence and the user interface stimulus event. An interaction observation that includes the eye sequence and a location of the stimulus event is generated. Personal calibration parameters are updated based on the interaction observation.
Type: Grant
Filed: May 16, 2018
Date of Patent: January 7, 2020
Assignee: Tobii AB
Inventors: Alexander Davies, Maria Gordon, Per-Edvin Stoltz
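The association between an eye sequence and a stimulus event could be approximated by nearest-in-time matching within spatial and temporal windows, as in this sketch; the thresholds and the (time, x, y) encodings of fixations and events are assumptions.

```python
def associate(fixations, event, max_dt=0.5, max_dist=80.0):
    # Pick the fixation closest in time to the stimulus event among
    # those that also land near the event's screen location; return
    # None when no fixation plausibly matches the event.
    t_e, x_e, y_e = event
    best = None
    for t, x, y in fixations:
        dt = abs(t - t_e)
        dist = ((x - x_e) ** 2 + (y - y_e) ** 2) ** 0.5
        if dt <= max_dt and dist <= max_dist:
            if best is None or dt < abs(best[0] - t_e):
                best = (t, x, y)
    return best
```

The matched (fixation, event location) pair is what the abstract calls an interaction observation, which then feeds the calibration update.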
-
Publication number: 20200004331
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: September 9, 2019
Publication date: January 2, 2020
Applicant: Tobii AB
Inventors: Erland George-Svahn, Mårten Skogö
-
Publication number: 20190384941
Abstract: Computer display privacy and security for computer systems. In one aspect, the invention provides a computer-controlled system for regulating the interaction between a computer and a user of the computer based on the environment of the computer and the user. For example, the computer-controlled system provided by the invention comprises an input-output device including an image sensor configured to collect facial recognition data proximate to the computer. The system also includes a user security parameter database encoding security parameters associated with the user; the database is also configured to communicate with the security processor. The security processor is configured to receive the facial recognition data and the security parameters associated with the user, and is further configured to at least partially control the operation of the data input device and the data output device in response to the facial recognition data and the security parameters associated with the user.
Type: Application
Filed: March 19, 2019
Publication date: December 19, 2019
Applicant: Tobii AB
Inventors: William R. Anderson, Steven E. Turner, Steven Pujia