Patents Assigned to TOBII AB
-
Patent number: 10572008
Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames therefrom, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
Type: Grant
Filed: December 20, 2017
Date of Patent: February 25, 2020
Assignee: Tobii AB
Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
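A minimal sketch of the repeating capture sequence described above, assuming a hypothetical period of one high-resolution frame followed by three low-resolution frames (the abstract leaves the exact period composition, resolutions, and frame counts open):

```python
from itertools import cycle

HIGH = (1280, 960)  # assumed first resolution
LOW = (320, 240)    # assumed second, different resolution

def frame_schedule(low_per_period: int = 3):
    """Yield the resolution of each frame in the repeating sequence:
    every period contains one HIGH frame and low_per_period LOW frames."""
    yield from cycle([HIGH] + [LOW] * low_per_period)

# The downstream data processing unit would consume frames in this order:
schedule = frame_schedule()
for i in range(8):
    print(i, next(schedule))
```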
-
Patent number: 10565446
Abstract: A system for determining a gaze direction of a user of a wearable device is disclosed. The system may include a primary lens, an illuminator, an image sensor, and an interface. The illuminator may include a light guide, may be disposed at least partially on or in the primary lens, and may be configured to illuminate at least one eye of a user. The image sensor may be disposed on or in the primary lens, and may be configured to detect light reflected by the at least one eye of the user. The interface may be configured to provide data from the image sensor to a processor for determining a gaze direction of the user based at least in part on light detected by the image sensor.
Type: Grant
Filed: July 17, 2017
Date of Patent: February 18, 2020
Assignee: Tobii AB
Inventors: Simon Gustafsson, Anders Kingbäck, Peter Blixt, Richard Hainzl, Mårten Skogö
-
Publication number: 20200050267
Abstract: The present invention relates to control of a computer system, which includes a data processing unit, a display and an eye tracker adapted to register a user's gaze point with respect to the display. The data processing unit is adapted to present graphical information on the display, which includes feedback data reflecting the user's commands entered into the unit. The data processing unit is adapted to present the feedback data such that during an initial phase, the feedback data is generated based on an absolute position of the gaze point. An imaging device of the system is also adapted to register image data representing movements of a body part of the user and to forward a representation of the image data to the data processing unit. Hence, during a phase subsequent to the initial phase, the feedback data is instead generated based on the image data.
Type: Application
Filed: October 22, 2019
Publication date: February 13, 2020
Applicant: Tobii AB
Inventors: John Elvesjö, Anders Olsson, Johan Sahlén
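One plausible reading of the two-phase behaviour is sketched below; the phase names, the hand-movement delta, and the rule of anchoring the refinement phase at the last gaze position are illustrative assumptions, not claim language:

```python
def feedback_position(phase: str, gaze_point, anchor, hand_delta):
    """Initial phase: the feedback (e.g. a pointer) tracks the absolute
    gaze point. Subsequent phase: it is steered by the imaged movement of
    a body part, relative to where the initial phase left it (assumed)."""
    if phase == "initial":
        return gaze_point
    return (anchor[0] + hand_delta[0], anchor[1] + hand_delta[1])

print(feedback_position("initial", (400, 300), None, None))      # -> (400, 300)
print(feedback_position("refine", None, (400, 300), (12, -5)))   # -> (412, 295)
```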
-
Patent number: 10558895
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. A scaled image is generated from a 2D image showing the user's face, based on a rough distance between the user's eyes and the camera that generated the 2D image. Image crops at different resolutions are generated from the scaled image and include a crop around each of the user's eyes and a crop around the user's face. These crops are input to the neural network. In response, the neural network outputs a distance correction and a 2D gaze vector per eye. A corrected eye-to-camera distance is generated by correcting the rough distance based on the distance correction. A 3D gaze vector for each eye is generated based on the corresponding 2D gaze vector and the corrected distance.
Type: Grant
Filed: March 30, 2018
Date of Patent: February 11, 2020
Assignee: Tobii AB
Inventor: Erik Linden
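The last two steps of the pipeline, applying the distance correction and lifting each 2D gaze vector to 3D, can be sketched as below. Treating the 2D vector as (yaw, pitch) angles is an assumption, and the network outputs are stand-in constants rather than real predictions:

```python
import numpy as np

def correct_distance(rough_mm: float, correction_mm: float) -> float:
    """Apply the network's predicted correction to the rough eye-to-camera distance."""
    return rough_mm + correction_mm

def gaze_2d_to_3d(yaw: float, pitch: float) -> np.ndarray:
    """Lift a (yaw, pitch) 2D gaze vector (radians, assumed convention)
    to a 3D unit direction vector."""
    return np.array([np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch),
                     np.cos(pitch) * np.cos(yaw)])

# Stand-ins for the neural network outputs on one frame:
distance_correction_mm = -25.0
gaze2d_left, gaze2d_right = (0.10, -0.05), (0.12, -0.05)

distance_mm = correct_distance(650.0, distance_correction_mm)
print(distance_mm)                              # corrected eye-to-camera distance
print(gaze_2d_to_3d(*gaze2d_left).round(3))     # 3D gaze vector, left eye
print(gaze_2d_to_3d(*gaze2d_right).round(3))    # 3D gaze vector, right eye
```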
-
Patent number: 10558262
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change.
Type: Grant
Filed: January 20, 2015
Date of Patent: February 11, 2020
Assignee: Tobii AB
Inventor: Rebecka Lannsjö
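One simple realization of the claimed behaviour, nudging a virtual camera toward the gaze point, is sketched below; the gain, coordinate conventions, and proportional-control rule are all assumptions:

```python
def update_camera(yaw: float, pitch: float, gaze_x: float, gaze_y: float,
                  w: int = 1920, h: int = 1080, gain: float = 0.05):
    """Turn the virtual camera toward the gaze point: the farther the gaze
    sits from screen centre, the larger the perspective change (assumed rule)."""
    yaw += gain * (gaze_x - w / 2) / (w / 2)
    pitch += gain * (gaze_y - h / 2) / (h / 2)
    return yaw, pitch

print(update_camera(0.0, 0.0, 1440, 540))  # gaze right of centre -> (0.025, 0.0)
```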
-
Patent number: 10551915
Abstract: According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user.
Type: Grant
Filed: August 22, 2018
Date of Patent: February 4, 2020
Assignee: Tobii AB
Inventors: Per Ola Kristensson, Keith Vertanen, Morten Mjelde
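A toy stand-in for the decoder module: it scores each lexicon word against the letters traversed during the gaze movement and returns the best match. The real decoder is probabilistic over observation events; the vocabulary and the similarity measure here are assumptions for illustration only:

```python
from difflib import SequenceMatcher

VOCAB = ["hello", "help", "held", "world"]  # toy lexicon (assumed)

def decode(observed_letters: str) -> str:
    """Return the vocabulary word most similar to the sequence of letters
    observed along the gaze path (stand-in for the patent's decoder module)."""
    return max(VOCAB,
               key=lambda w: SequenceMatcher(None, w, observed_letters).ratio())

print(decode("helo"))  # -> 'hello'
```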
-
Patent number: 10545574
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to "abort" movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Grant
Filed: March 3, 2017
Date of Patent: January 28, 2020
Assignee: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
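The delay-then-move behaviour with an abort window can be sketched as a small polling loop; the delay length and the callable-based input interface are assumptions:

```python
import time

DELAY_S = 0.3  # assumed abort window; the patent only requires a predetermined period

def move_indicator_on_hold(touch_active, gaze_target, move_to) -> bool:
    """Move the visual indicator to the gaze target only if the user's
    action (e.g. a touchpad touch) persists for DELAY_S; releasing the
    action earlier aborts the move.
    touch_active: zero-arg callable reporting whether the action persists.
    gaze_target:  zero-arg callable returning the current gaze target.
    move_to:      callable that repositions the visual indicator."""
    start = time.monotonic()
    while time.monotonic() - start < DELAY_S:
        if not touch_active():
            return False  # user aborted during the delay
        time.sleep(0.01)
    move_to(gaze_target())
    return True

# Usage with stand-ins for the real input devices:
print(move_indicator_on_hold(lambda: True, lambda: (640, 360), print))
```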
-
Publication number: 20200026068
Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining that the eye of the user is closed when the third sum exceeds a fourth sum equal to the first sum plus a threshold amount, where the threshold amount is the product of a threshold fraction and the difference between the first sum and the second sum.
Type: Application
Filed: June 24, 2019
Publication date: January 23, 2020
Applicant: Tobii AB
Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
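The closed-eye test at the end of the abstract is explicit enough to state directly in code; only the default threshold fraction is an assumed value:

```python
def eye_is_closed(s_open: float, s_closed: float, s_now: float,
                  threshold_fraction: float = 0.5) -> bool:
    """Classify eye state from summed pixel intensities, per the abstract.

    s_open   -- first sum, recorded while the eye was open
    s_closed -- second sum, recorded while the eye was closed
    s_now    -- third sum, for the current frame
    The eye is reported closed when s_now exceeds s_open plus
    threshold_fraction * (s_open - s_closed)."""
    threshold_amount = threshold_fraction * (s_open - s_closed)
    return s_now > s_open + threshold_amount

# Calibration sums from the open- and closed-eye periods, then a test frame:
print(eye_is_closed(s_open=1000.0, s_closed=1600.0, s_now=1400.0))  # -> True
```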
-
Patent number: 10540008
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or in computer systems where a touchscreen is arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Grant
Filed: March 31, 2016
Date of Patent: January 21, 2020
Assignee: Tobii AB
Inventors: Andreas Klingström, Mårten Skogö, Richard Hainzl, John Elvesjö
-
Publication number: 20200019239
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: June 25, 2019
Publication date: January 16, 2020
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
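A sketch of the scene-matching step, assuming OpenCV template matching as the mechanism for finding the predefined image (the abstract only requires that the scene image be checked against it, not how):

```python
import cv2

def gaze_on_predefined_image(scene_bgr, template_bgr, gaze_xy, min_score=0.8):
    """If the scene image contains the predefined image, express the gaze
    point in the found image's coordinate frame; otherwise return None.
    min_score is an assumed matching threshold."""
    result = cv2.matchTemplate(scene_bgr, template_bgr, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < min_score:
        return None  # predefined image not found in this scene frame
    gx, gy = gaze_xy
    x0, y0 = top_left
    h, w = template_bgr.shape[:2]
    if x0 <= gx < x0 + w and y0 <= gy < y0 + h:
        return (gx - x0, gy - y0)  # gaze point relative to the found image
    return None  # gaze falls outside the found image
```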
-
Patent number: 10534982
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images. During the training, calibration parameters are initialized and input to the neural network, and are updated through the training. Accordingly, the network parameters of the neural network are updated based in part on the calibration parameters. Upon completion of the training, the neural network is calibrated for a user. This calibration includes initializing and inputting the calibration parameters along with calibration images showing an eye of the user to the neural network. The calibration includes updating the calibration parameters without changing the network parameters by minimizing the loss function of the neural network based on the calibration images. Upon completion of the calibration, the neural network is used to generate 3D gaze information for the user.
Type: Grant
Filed: March 30, 2018
Date of Patent: January 14, 2020
Assignee: Tobii AB
Inventor: Erik Linden
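The per-user calibration step, optimizing only the calibration parameters while the trained network weights stay frozen, can be sketched generically. Finite differences stand in for backpropagation here, and the toy loss is an assumption:

```python
import numpy as np

def calibrate_user(loss_fn, calib_params, lr=0.1, steps=100):
    """Update only the calibration parameters, leaving the trained network
    weights untouched, by minimizing the loss on the user's calibration
    images. Gradients are approximated by central finite differences
    (a stand-in for backpropagation)."""
    params = np.asarray(calib_params, dtype=float)
    eps = 1e-4
    for _ in range(steps):
        grad = np.zeros_like(params)
        for i in range(params.size):
            d = np.zeros_like(params)
            d[i] = eps
            grad[i] = (loss_fn(params + d) - loss_fn(params - d)) / (2 * eps)
        params -= lr * grad
    return params

# Toy loss: the user's true calibration offset is (0.3, -0.2);
# calibration should recover it from a zero initialization.
target = np.array([0.3, -0.2])
print(calibrate_user(lambda p: float(np.sum((p - target) ** 2)),
                     [0.0, 0.0]).round(3))
```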
-
Patent number: 10534526
Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, and detecting an indicator that the user begins reading the content.
Type: Grant
Filed: December 21, 2017
Date of Patent: January 14, 2020
Assignee: Tobii AB
Inventors: Anders Olsson, Mårten Skogö
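A minimal sketch of gaze-driven scroll zones; the zone layout (strips along the top and bottom edges, in normalized screen coordinates) and scroll amounts are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ScrollZone:
    """A screen region that triggers a scroll action while gazed at."""
    x0: float
    y0: float
    x1: float
    y1: float
    dy_per_tick: int  # positive scrolls content forward, negative back

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical layout: gazing near the bottom edge scrolls forward,
# near the top edge scrolls back.
ZONES = [ScrollZone(0.0, 0.9, 1.0, 1.0, dy_per_tick=+40),
         ScrollZone(0.0, 0.0, 1.0, 0.1, dy_per_tick=-40)]

def scroll_delta(gaze_x: float, gaze_y: float) -> int:
    """Return the scroll action for the current gaze point, or 0 when
    the gaze point lies outside every scroll zone (scrolling stops)."""
    for zone in ZONES:
        if zone.contains(gaze_x, gaze_y):
            return zone.dy_per_tick
    return 0

print(scroll_delta(0.5, 0.95))  # gaze in the bottom zone -> 40
```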
-
Patent number: 10528131
Abstract: A system and techniques for calibrating an eye tracking system are described. The system can update personal calibration parameters continuously based on a user's gaze on a user interface, following user interface stimulus events. The system improves continuous calibration techniques by determining an association between the user's eye sequences and the stimulus events and updating the personal calibration parameters accordingly. A record indicative of a user gaze, including eye sequences, such as eye movements or eye fixations, is maintained over a time period. A user interface stimulus event associated with the user interface and occurring within the time period is detected. An association is determined between the eye sequence and the user interface stimulus event. An interaction observation that includes the eye sequence and a location of the stimulus event is generated. Personal calibration parameters are updated based on the interaction observation.
Type: Grant
Filed: May 16, 2018
Date of Patent: January 7, 2020
Assignee: Tobii AB
Inventors: Alexander Davies, Maria Gordon, Per-Edvin Stoltz
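The association step can be sketched as picking the fixation that most plausibly responds to the stimulus; the time-window rule below is one illustrative choice, since the abstract only requires that *some* association be determined:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EyeSequence:
    t: float  # timestamp of a fixation, seconds
    x: float  # fixation location on the UI
    y: float

@dataclass
class StimulusEvent:
    t: float  # when the UI stimulus (e.g. a click target) appeared
    x: float
    y: float

def associate(seq: List[EyeSequence], event: StimulusEvent,
              max_lag_s: float = 0.8) -> Optional[EyeSequence]:
    """Return the first fixation within max_lag_s after the stimulus event,
    or None when no fixation plausibly responds to it (assumed rule)."""
    candidates = [s for s in seq if 0.0 <= s.t - event.t <= max_lag_s]
    return min(candidates, key=lambda s: s.t - event.t, default=None)
```

The resulting pair of fixation and stimulus location would form the interaction observation from which the personal calibration parameters are updated.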
-
Publication number: 20200004331
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: September 9, 2019
Publication date: January 2, 2020
Applicant: Tobii AB
Inventors: Erland George-Svahn, Mårten Skogö
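The geometry of mapping the second gaze target back from the enlarged view to display coordinates can be sketched as below; the centred-magnifier layout, zoom factor, and view size are assumptions:

```python
def refine_selection(first_gaze, second_gaze_in_zoom,
                     zoom: float = 4.0, view_size=(1920, 1080)):
    """Map a gaze point picked inside the enlarged view back to display
    coordinates, assuming the enlarged portion is centred on the first
    gaze target and magnified by `zoom`."""
    cx, cy = first_gaze
    vx, vy = view_size
    # Offset of the second gaze point from the view centre, undone by zoom:
    dx = (second_gaze_in_zoom[0] - vx / 2) / zoom
    dy = (second_gaze_in_zoom[1] - vy / 2) / zoom
    return (cx + dx, cy + dy)

print(refine_selection((800, 500), (1000, 540)))  # -> (810.0, 500.0)
```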
-
Publication number: 20190384941
Abstract: Computer display privacy and security for computer systems. In one aspect, the invention provides a computer-controlled system for regulating the interaction between a computer and a user of the computer based on the environment of the computer and the user. For example, the computer-controlled system provided by the invention comprises an input-output device including an image sensor configured to collect facial recognition data proximate to the computer. The system also includes a user security parameter database encoding security parameters associated with the user; the database is also configured to communicate with the security processor. The security processor is configured to receive the facial recognition data and the security parameters associated with the user, and is further configured to at least partially control the operation of the data input device and the data output device in response to the facial recognition data and the security parameters associated with the user.
Type: Application
Filed: March 19, 2019
Publication date: December 19, 2019
Applicant: Tobii AB
Inventors: William R. Anderson, Steven E. Turner, Steven Pujia
-
Publication number: 20190369720
Abstract: A method of identifying scleral reflections in an eye tracking system and a corresponding eye tracking system are disclosed. An image of an eye of a user from an image sensor is received, the image resulting from the image sensor detecting light from one or more illuminators reflected from the eye of the user. A glint is identified in the image, wherein the glint is a representation in the image of a reflection of light from a cornea of the eye of the user or from a sclera of the eye of the user. A first pixel intensity of the glint is determined, a second pixel intensity of neighbor pixels of the glint is determined, and an absolute value of the difference between the first pixel intensity of the glint and the second pixel intensity of the neighbor pixels of the glint is determined. The glint is identified as a representation of a reflection from the sclera of the eye of the user on condition that the determined absolute value of the difference is below a predetermined threshold value.
Type: Application
Filed: May 23, 2019
Publication date: December 5, 2019
Applicant: Tobii AB
Inventors: Pravin Kumar Rana, Yimu Wang
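The classification rule is stated directly in the abstract; only the threshold value would come from tuning:

```python
def is_scleral_reflection(glint_intensity: float,
                          neighbor_intensity: float,
                          threshold: float) -> bool:
    """Return True when the glint is likely a scleral (not corneal)
    reflection. A corneal glint is typically much brighter than its
    surroundings, while a scleral reflection sits on the already-bright
    sclera, so its intensity difference to the neighbor pixels is small."""
    return abs(glint_intensity - neighbor_intensity) < threshold

print(is_scleral_reflection(210.0, 190.0, threshold=40.0))  # -> True (sclera)
print(is_scleral_reflection(250.0, 90.0, threshold=40.0))   # -> False (cornea)
```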
-
Publication number: 20190369719
Abstract: A method and a corresponding eye tracking system for providing an approximate gaze convergence distance of a user in an eye tracking system are disclosed. The method comprises determining calibration data in relation to interpupillary distance between a pupil of a left eye and a pupil of a right eye of a user, and determining, based on the determined calibration data, a gaze convergence function providing an approximate gaze convergence distance of the user based on a determined interpupillary distance of the user. The method further comprises receiving, from one or more imaging devices, one or more images of the left eye and the right eye of the user, determining a current interpupillary distance of the user based on the one or more images and determining a current approximate gaze convergence distance based on the current interpupillary distance and the gaze convergence function.
Type: Application
Filed: December 14, 2018
Publication date: December 5, 2019
Applicant: Tobii AB
Inventors: Andreas Klingström, Per Fogelström
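A minimal sketch of the gaze convergence function, using interpolation over calibration samples taken while the user fixates targets at known distances. The interpolation form and the sample values are assumptions; the abstract does not prescribe a specific function:

```python
import numpy as np

class ConvergenceEstimator:
    """Map a measured interpupillary distance (IPD) to an approximate
    gaze convergence distance. The pupils rotate inward when converging
    on near targets, so the measured IPD shrinks at short distances."""

    def __init__(self, calib_ipd_mm, calib_distance_mm):
        order = np.argsort(calib_ipd_mm)  # np.interp needs increasing x
        self.ipd = np.asarray(calib_ipd_mm, dtype=float)[order]
        self.dist = np.asarray(calib_distance_mm, dtype=float)[order]

    def convergence_distance(self, current_ipd_mm: float) -> float:
        return float(np.interp(current_ipd_mm, self.ipd, self.dist))

# Hypothetical calibration: fixations at 0.3 m, 1 m and "infinity" (10 m).
est = ConvergenceEstimator([58.0, 61.0, 63.0], [300.0, 1000.0, 10000.0])
print(round(est.convergence_distance(60.0), 1))  # approximate distance in mm
```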
-
Patent number: 10488919
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or in computer systems where a touchscreen is arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Grant
Filed: December 31, 2015
Date of Patent: November 26, 2019
Assignee: Tobii AB
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20190354172
Abstract: A system and techniques for calibrating an eye tracking system are described. The system can update personal calibration parameters continuously based on a user's gaze on a user interface, following user interface stimulus events. The system improves continuous calibration techniques by determining an association between the user's eye sequences and the stimulus events and updating the personal calibration parameters accordingly. A record indicative of a user gaze, including eye sequences, such as eye movements or eye fixations, is maintained over a time period. A user interface stimulus event associated with the user interface and occurring within the time period is detected. An association is determined between the eye sequence and the user interface stimulus event. An interaction observation that includes the eye sequence and a location of the stimulus event is generated. Personal calibration parameters are updated based on the interaction observation.
Type: Application
Filed: May 16, 2018
Publication date: November 21, 2019
Applicant: Tobii AB
Inventors: Alexander Davies, Maria Gordon, Per-Edvin Stoltz
-
Publication number: 20190346919
Abstract: According to the invention, a method for changing a display based on a gaze point of a user on the display is disclosed. The method may include determining a gaze point of a user on a display. The method may also include causing a first area of the display to be displayed in a first manner, the first area including the gaze point and a surrounding area. The method may further include causing a second area of the display to be displayed in a second manner, the second area being different than the first area, and the second manner being different than the first manner.
Type: Application
Filed: November 13, 2018
Publication date: November 14, 2019
Applicant: Tobii AB
Inventor: Andreas Klingström
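One plausible reading of the two display manners is foveated rendering, sketched below; the circular first area, its radius, and the quality labels are assumptions, and the claims are not limited to this interpretation:

```python
def render_region_quality(px: int, py: int, gaze_x: int, gaze_y: int,
                          radius: int = 200) -> str:
    """Choose how to render a pixel region: the first area (around the
    gaze point) in a first manner, everything else in a second manner.
    Here the two manners are full vs reduced rendering quality."""
    inside = (px - gaze_x) ** 2 + (py - gaze_y) ** 2 <= radius ** 2
    return "full_quality" if inside else "reduced_quality"

print(render_region_quality(960, 540, gaze_x=1000, gaze_y=500))  # -> full_quality
print(render_region_quality(100, 100, gaze_x=1000, gaze_y=500))  # -> reduced_quality
```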