Patents Assigned to TOBII AB
-
Publication number: 20210073421
Abstract: Computer display privacy and security for computer systems. In one aspect, the invention provides a computer-controlled system for regulating the interaction between a computer and a user of the computer based on the environment of the computer and the user. For example, the computer-controlled system provided by the invention comprises an input-output device including an image sensor configured to collect facial recognition data proximate to the computer. The system also includes a user security parameter database encoding security parameters associated with the user; the database is also configured to communicate with the security processor. The security processor is configured to receive the facial recognition data and the security parameters associated with the user, and is further configured to at least partially control the operation of the data input device and the data output device in response to the facial recognition data and the security parameters associated with the user.
Type: Application
Filed: August 28, 2020
Publication date: March 11, 2021
Applicant: Tobii AB
Inventors: William R. Anderson, Steven E. Turner, Steven Pujia
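The abstract describes a security processor gating the input and output devices from facial recognition results and per-user security parameters. The sketch below is an illustrative reading only; the decision policy, parameter names, and device states are assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the control loop in 20210073421: a security processor
# enables or disables the I/O devices based on which faces are recognised and
# on the user's stored security parameters. All names are illustrative.
from dataclasses import dataclass

@dataclass
class SecurityParameters:
    allow_unknown_faces: bool = False   # blank the display if a stranger appears
    max_faces: int = 1                  # e.g. guard against shoulder-surfing

def control_io(recognized_user_ids, params: SecurityParameters, owner_id: str) -> dict:
    """Return the desired state of the data input and data output devices."""
    owner_present = owner_id in recognized_user_ids
    strangers = [uid for uid in recognized_user_ids if uid != owner_id]
    output_enabled = (owner_present
                      and (params.allow_unknown_faces or not strangers)
                      and len(recognized_user_ids) <= params.max_faces)
    input_enabled = owner_present   # only accept input while the owner is present
    return {"display_on": output_enabled, "keyboard_on": input_enabled}

if __name__ == "__main__":
    params = SecurityParameters()
    print(control_io({"alice"}, params, owner_id="alice"))         # both enabled
    print(control_io({"alice", "bob"}, params, owner_id="alice"))  # display blanked
```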
-
Publication number: 20210076026
Abstract: In one embodiment, a wearable apparatus is provided which may include a holographic film disposed on the wearable apparatus, an eye tracking device including an image sensor and an illuminator, and one or more processors. The processors may be configured to activate the illuminator to illuminate the holographic film and capture, with the image sensor, an image of at least a portion of the holographic film while the holographic film is illuminated. The processors may also be configured to determine a characteristic of the holographic film based on the image and determine, based on the characteristic, a location or an orientation of the image sensor relative to the holographic film. The processors may further be configured to change, based on the location or the orientation of the image sensor relative to the holographic film, at least one calibration parameter of the image sensor.
Type: Application
Filed: June 30, 2020
Publication date: March 11, 2021
Applicant: Tobii AB
Inventors: Daniel Tornéus, Torbjörn Sundberg, Magnus Arvidsson
-
Publication number: 20210064856
Abstract: A method for detecting the physical presence of a person at an expected person location, the method comprising the steps: a) providing a sensor device comprising one or several sensors, the sensor device being arranged to perform at least one high-power type measurement and at least one low-power type measurement, wherein the sensor device comprises at least one image sensor arranged to depict the person by a measurement of said high-power type, and wherein each of said low-power type measurements over time requires less electric power for operation as compared to said high-power type measurement; b) the sensor device detecting said potential presence of the person using at least one of said low-power type measurements; c) using one of said high-power type measurements, the sensor device producing an image depicting the person and detecting the presence of the person based on image analysis of said image; d) using at least one of said low-power type measurements, the sensor device detecting a maintained pr
Type: Application
Filed: September 2, 2020
Publication date: March 4, 2021
Applicant: Tobii AB
Inventors: Dimitrios Koufos, Erland George-Svahn, Magnus Ivarsson
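Steps a) through d) amount to a small power-saving state machine: cheap low-power measurements trigger and later maintain presence, while the expensive image-based measurement only runs to confirm it. The sketch below is a minimal illustrative state machine under that reading; the states and transition conditions are assumptions.

```python
# Illustrative sketch (not the patented implementation) of the presence-detection
# flow in 20210064856: low-power sensing gates the camera-based confirmation and
# then keeps the "present" state alive without re-running the camera.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()        # only low-power sensing active
    CONFIRMING = auto()  # image sensor powered up for one confirmation
    PRESENT = auto()     # presence confirmed; maintained via low-power sensing

def step(state, low_power_hit, image_shows_person=None):
    """Advance the presence state machine by one sensing cycle."""
    if state is State.IDLE:
        return State.CONFIRMING if low_power_hit else State.IDLE
    if state is State.CONFIRMING:
        # High-power measurement: run person detection on a captured image.
        return State.PRESENT if image_shows_person else State.IDLE
    # State.PRESENT: a cheap check maintains the state without the camera.
    return State.PRESENT if low_power_hit else State.IDLE

state = State.IDLE
state = step(state, low_power_hit=True)                            # -> CONFIRMING
state = step(state, low_power_hit=True, image_shows_person=True)   # -> PRESENT
print(state)
```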
-
Patent number: 10936063
Abstract: Disclosed is a method for calibrating an eye-tracking device to suit a user of the eye-tracking device, wherein a calibration setting of the eye-tracking device associated with a user is calculated based on acquired eye data of the user when looking at a set of reference points. The method comprises displaying a reference point of the set to the user; acquiring, by means of at least one camera of the eye-tracking device, eye data for at least one of the eyes of the user when looking at the reference point; comparing the acquired eye data to stored eye data sets related to the reference point, wherein each of the stored eye data sets is associated with a calibration setting of the eye-tracking device; and if the acquired eye data matches one of the stored eye data sets, abandoning the calibration process and loading the calibration setting associated with the matching stored eye data set.
Type: Grant
Filed: November 12, 2019
Date of Patent: March 2, 2021
Assignee: Tobii AB
Inventors: Robin Thunström, Tobias Höglund
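The calibration shortcut is straightforward to picture in code: compare the freshly acquired eye data against stored eye data sets and, on a match, reuse the associated calibration instead of completing the procedure. A minimal sketch follows, assuming eye data can be summarised as a numeric feature vector compared with a distance threshold; the patent does not specify the matching metric.

```python
# Minimal sketch of the matching step in 10936063. The feature representation,
# distance metric, and tolerance are assumptions for illustration.
import math

def try_calibration_shortcut(acquired, stored_profiles, tolerance=0.05):
    """Return a stored calibration setting if the acquired eye data matches a
    previously stored eye data set for this reference point, else None."""
    for profile in stored_profiles:
        if math.dist(acquired, profile["eye_data"]) <= tolerance:
            # Match found: abandon the calibration process and reuse the
            # calibration setting associated with the stored eye data set.
            return profile["calibration"]
    return None  # no match: continue the full calibration sequence

stored = [{"eye_data": (0.12, 0.48, 3.1), "calibration": {"offset_deg": 1.4}}]
print(try_calibration_shortcut((0.13, 0.47, 3.1), stored))  # {'offset_deg': 1.4}
print(try_calibration_shortcut((0.50, 0.10, 2.0), stored))  # None
```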
-
Patent number: 10928895
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
Type: Grant
Filed: September 19, 2018
Date of Patent: February 23, 2021
Assignee: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
-
Patent number: 10928891
Abstract: The present disclosure relates to a method for calibrating a camera of a head-mounted display (HMD). The method comprises providing a calibration target in front of a lens of the HMD. Each of the calibration target and the lens basically extends in a corresponding two-dimensional plane. The method further comprises determining a lateral position of the calibration target. The lateral position relates to a position of the calibration target in the two-dimensional plane. The method even further comprises determining a lateral position of the lens. The lateral position relates to a position of the lens in the two-dimensional plane. The method yet even further comprises determining a calibration target misalignment based on the determined lateral position of the calibration target and based on the determined lateral position of the lens. The method also comprises performing a hardware calibration of the HMD. The hardware calibration is adapted for the calibration target misalignment.
Type: Grant
Filed: May 29, 2019
Date of Patent: February 23, 2021
Assignee: Tobii AB
Inventor: Mikael Rosell
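Read at face value, the misalignment is the lateral offset between the calibration target's position and the lens's position in their shared plane, and the hardware calibration is then compensated by that offset. The sketch below encodes that reading; the subtraction-based compensation, coordinate convention, and units are assumptions.

```python
# Hedged sketch of the misalignment computation in 10928891: both lateral
# positions are treated as 2D points in the shared plane, and the hardware
# calibration is adapted by their offset. Names and units are assumed.
def calibration_target_misalignment(target_xy, lens_xy):
    """Lateral offset (dx, dy) of the calibration target relative to the lens."""
    return (target_xy[0] - lens_xy[0], target_xy[1] - lens_xy[1])

def compensate(nominal_target_xy, misalignment_xy):
    """Shift the target coordinates assumed by the hardware calibration."""
    return (nominal_target_xy[0] - misalignment_xy[0],
            nominal_target_xy[1] - misalignment_xy[1])

mis = calibration_target_misalignment(target_xy=(10.3, 5.1), lens_xy=(10.0, 5.0))
print(mis)                           # (0.3, 0.1), e.g. in millimetres
print(compensate((10.3, 5.1), mis))  # back to the lens-aligned position
```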
-
Patent number: 10928897
Abstract: According to the invention, a method for changing a display based on a gaze point of a user on the display is disclosed. The method may include determining a gaze point of a user on a display. The method may also include causing a first area of the display to be displayed in a first manner, the first area including the gaze point and a surrounding area. The method may further include causing a second area of the display to be displayed in a second manner, the second area being different than the first area, and the second manner being different than the first manner.Type: Grant
Filed: November 13, 2018
Date of Patent: February 23, 2021
Assignee: Tobii AB
Inventor: Andreas Klingström
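The two-area idea maps naturally to foveated rendering: keep the region around the gaze point at full quality and show everything else in a cheaper manner. The NumPy sketch below is one illustrative instance; the circular region, its radius, and downsampling as the "second manner" are assumptions.

```python
# Illustrative sketch of the two-region display in 10928897: pixels near the
# gaze point keep full quality, the rest of the frame is downsampled.
import numpy as np

def foveate(frame: np.ndarray, gaze_xy, radius=64, factor=4) -> np.ndarray:
    """Return a frame where pixels outside the gaze region are downsampled."""
    out = frame[::factor, ::factor].repeat(factor, 0).repeat(factor, 1)
    out = out[: frame.shape[0], : frame.shape[1]]            # crop to original size
    y, x = np.ogrid[: frame.shape[0], : frame.shape[1]]
    near_gaze = (x - gaze_xy[0]) ** 2 + (y - gaze_xy[1]) ** 2 <= radius ** 2
    out[near_gaze] = frame[near_gaze]                         # first area: full quality
    return out

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
print(foveate(frame, gaze_xy=(320, 240)).shape)               # (480, 640, 3)
```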
-
Publication number: 20210042520
Abstract: There is disclosed a computer implemented eye tracking system and corresponding method and computer readable storage medium, for detecting three dimensional, 3D, gaze, by obtaining at least one head pose parameter using a head pose prediction algorithm, the head pose parameter(s) comprising one or more of a head position, pitch, yaw, or roll; and inputting the at least one head pose parameter along with at least one image of a user's eye, generated from a 2D image captured using an image sensor associated with the eye tracking system, into a neural network configured to generate 3D gaze information based on the at least one head pose parameter and the at least one eye image.
Type: Application
Filed: June 15, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventors: David Molin, Tommaso Martini, Maria Gordon, Alexander Davies, Oscar Danielsson
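The key structural point is that the network takes two inputs: an eye image and a vector of head pose parameters. The PyTorch-style sketch below shows one plausible shape of such a model; the framework, layer sizes, and the six-element pose vector (position plus pitch/yaw/roll) are assumptions, not the disclosed architecture.

```python
# A minimal sketch of the 20210042520 idea: eye-image features are concatenated
# with head pose parameters before a small head predicts a 3D gaze vector.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self, n_head_pose_params=6):
        super().__init__()
        self.image_encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + n_head_pose_params, 64), nn.ReLU(),
            nn.Linear(64, 3),   # 3D gaze direction (x, y, z)
        )

    def forward(self, eye_image, head_pose):
        features = self.image_encoder(eye_image)
        return self.head(torch.cat([features, head_pose], dim=1))

net = GazeNet()
gaze = net(torch.randn(2, 1, 64, 64), torch.randn(2, 6))
print(gaze.shape)  # torch.Size([2, 3])
```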
-
Publication number: 20210038072
Abstract: A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
Type: Application
Filed: July 30, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventors: Simon Gusstafsson, Anders Kingbäck, Markus Cederlund
-
Publication number: 20210042015
Abstract: Disclosed is a method for interacting with a selectable object displayed by means of a displaying device, the method comprising the steps of: obtaining a gaze convergence distance and a gaze direction of a user, the gaze direction lying in a field of view defined by the displaying device; determining whether the gaze direction coincides with the selectable object; and if so, detecting a change in the gaze convergence distance; and if the detected change in the gaze convergence distance exceeds a predetermined threshold value, interacting with the selectable object.
Type: Application
Filed: May 15, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventor: Andrew Ratcliff
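The selection logic reduces to: while the gaze direction stays on the object, watch the vergence (convergence) distance and trigger the interaction once it shifts by more than a threshold. The sketch below is a hypothetical reading; the baseline-and-reset behaviour and the threshold value are assumptions.

```python
# Hypothetical sketch of the vergence-based selection in 20210042015.
def should_interact(samples, threshold_m=0.15):
    """samples: list of (gaze_on_object: bool, convergence_distance_m: float)."""
    baseline = None
    for on_object, distance in samples:
        if not on_object:
            baseline = None            # reset when the gaze leaves the object
            continue
        if baseline is None:
            baseline = distance        # distance when the object was first hit
        elif abs(distance - baseline) > threshold_m:
            return True                # deliberate vergence shift -> select
    return False

print(should_interact([(True, 0.80), (True, 0.79), (True, 0.55)]))   # True
print(should_interact([(True, 0.80), (False, 0.80), (True, 0.78)]))  # False
```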
-
Publication number: 20210041945
Abstract: An eye tracking system, a head mounted device, a computer program, a carrier and a method in an eye tracking system for determining a refined gaze point of a user are disclosed. In the method a gaze convergence distance of the user is determined. Furthermore, a spatial representation of at least a part of a field of view of the user is obtained and depth data for at least a part of the spatial representation are obtained. Saliency data for the spatial representation are determined based on the determined gaze convergence distance and the obtained depth data, and a refined gaze point of the user is determined based on the determined saliency data.
Type: Application
Filed: June 19, 2020
Publication date: February 11, 2021
Applicant: Tobii AB
Inventor: Geoffrey Cooper
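One way to picture the saliency step: locations whose depth agrees with the user's convergence distance are more likely to be what the user is actually looking at, so the gaze point is refined towards them. The NumPy sketch below follows that reading; the Gaussian depth weighting, search window, and parameter values are assumptions, not the claimed algorithm.

```python
# Sketchy interpretation of 20210041945: saliency is high where scene depth is
# close to the gaze convergence distance; the refined gaze point is the most
# salient location near the raw gaze estimate.
import numpy as np

def refine_gaze(depth_map, raw_gaze_xy, convergence_m, window=40, sigma=0.3):
    """Pick the pixel near raw_gaze_xy whose depth best matches convergence_m."""
    h, w = depth_map.shape
    x0, x1 = max(0, raw_gaze_xy[0] - window), min(w, raw_gaze_xy[0] + window)
    y0, y1 = max(0, raw_gaze_xy[1] - window), min(h, raw_gaze_xy[1] + window)
    patch = depth_map[y0:y1, x0:x1]
    saliency = np.exp(-((patch - convergence_m) ** 2) / (2 * sigma ** 2))
    dy, dx = np.unravel_index(np.argmax(saliency), saliency.shape)
    return (x0 + dx, y0 + dy)

depth = np.full((480, 640), 2.0)
depth[200:260, 300:360] = 0.8               # a nearby object in the scene
print(refine_gaze(depth, raw_gaze_xy=(290, 210), convergence_m=0.8))  # (300, 200)
```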
-
Publication number: 20210034152
Abstract: A method and a corresponding apparatus for mitigating motion sickness in a virtual reality (VR)/augmented reality (AR) system using a head mounted display (HMD) are disclosed. The method comprises receiving data from a sensor indicating a current orientation of the HMD in real space, and superimposing a visual indication on a display of the HMD. The visual indication provides visual information to a user of the current orientation of the HMD in real space. Furthermore, methods and corresponding apparatuses are disclosed of calculating gaze convergence distance in an eye tracking system, and of gaze based virtual reality (VR)/augmented reality (AR) menu expansion.
Type: Application
Filed: October 19, 2020
Publication date: February 4, 2021
Applicant: Tobii AB
Inventors: Andreas Klingström, Per Fogelström, Andrew Ratcliff
-
Patent number: 10895908
Abstract: According to the invention, techniques for refining a ballistic prediction are described. In an example, an eye tracking system may record images over time of content presented on a display. Saccade data may be received and used as a trigger to retrieve particular ones of the recorded images. The eye tracking system may compare the images to identify a change in the content. The location of this change may correspond to a sub-area of the display. The output of the ballistic prediction may include a landing point that represents an anticipated gaze point. This landing point may be adjusted such that a gaze point is now predicted to fall within the sub-area when the change is significant.
Type: Grant
Filed: November 26, 2018
Date of Patent: January 19, 2021
Assignee: Tobii AB
Inventor: Daan Nijs
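The refinement step can be pictured as: compare a frame recorded before the saccade with one recorded after, find the display sub-area where the content changed significantly, and move the ballistic landing point into that sub-area. The sketch below is a hedged illustration; the tile-based change metric, threshold, and "snap to tile centre" behaviour are assumptions.

```python
# A hedged sketch of the landing-point refinement in 10895908.
import numpy as np

def refine_landing_point(landing_xy, frame_before, frame_after,
                         tile=80, change_threshold=25.0):
    """Return an adjusted landing point if a tile of the display changed a lot."""
    diff = np.abs(frame_after.astype(float) - frame_before.astype(float))
    h, w = diff.shape[:2]
    best, best_score = None, change_threshold
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            score = diff[y:y + tile, x:x + tile].mean()
            if score > best_score:
                best, best_score = (x + tile // 2, y + tile // 2), score
    return best if best is not None else landing_xy   # centre of changed tile

before = np.zeros((240, 320), dtype=np.uint8)
after = before.copy()
after[80:160, 160:240] = 255                 # content change (e.g. a pop-up)
print(refine_landing_point((10, 10), before, after))   # lands in the changed tile
```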
-
Patent number: 10895909
Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device configured to determine a gaze point of a user on a display device. The system may also include a processing device configured to combine a quality map generated based on the gaze point of the user and another quality map generated based on factors independent of the gaze point of the user to generate a combined quality map. The processing device is further configured to cause a rendered image to be displayed on the display device and the quality of each area of the rendered image displayed on the display device is determined based, at least in part, upon the combined quality map.
Type: Grant
Filed: December 3, 2018
Date of Patent: January 19, 2021
Assignee: Tobii AB
Inventors: Daan Nijs, Robin Thunström
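The combination of a gaze-dependent quality map with a gaze-independent one can be shown with a few lines of NumPy. The element-wise maximum used below is one plausible combination (keep the stricter quality demand at each pixel); the abstract leaves the combination function open, and the Gaussian falloff and the fixed high-quality region are assumptions.

```python
# Illustrative sketch of the combined quality map in 10895909.
import numpy as np

def gaze_quality_map(shape, gaze_xy, sigma=80.0):
    y, x = np.ogrid[: shape[0], : shape[1]]
    d2 = (x - gaze_xy[0]) ** 2 + (y - gaze_xy[1]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))           # 1.0 at the gaze point

def combine(gaze_map, static_map):
    return np.maximum(gaze_map, static_map)          # keep the stricter demand

static = np.zeros((480, 640))
static[:, :160] = 0.9                                # e.g. always-sharp HUD strip
combined = combine(gaze_quality_map((480, 640), (400, 240)), static)
print(combined[240, 400], combined[10, 10])          # 1.0 near gaze, 0.9 in HUD
```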
-
Publication number: 20210012559
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Fredrik Lindh
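The method accumulates evidence per scene component from a bundle of line traces cast around the gaze direction and picks the component with the highest confidence. The sketch below is a hypothetical version of that accumulation; weighting each hit by its angular proximity to the gaze direction is an assumption, not the disclosed confidence formula.

```python
# Hypothetical sketch of the confidence accumulation in 20210012559.
from collections import defaultdict

def pick_focus_target(trace_results):
    """trace_results: list of (component_id or None, angular_offset_deg)."""
    confidence = defaultdict(float)
    for component, offset_deg in trace_results:
        if component is not None:
            confidence[component] += 1.0 / (1.0 + offset_deg)  # closer -> more weight
    return max(confidence, key=confidence.get) if confidence else None

traces = [("lamp", 0.0), ("lamp", 0.5), ("table", 1.5), (None, 2.0)]
print(pick_focus_target(traces))   # "lamp"
```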
-
Publication number: 20210011551
Abstract: An eyetracker obtains a digital image representing at least one eye of a subject. The eyetracker then searches for pupil candidates in the digital image according to a search algorithm and, based on the searching, determines a position for the at least one eye in the digital image. The eyetracker also obtains light-intensity information expressing an estimated amount of light energy exposing the at least one eye when registering the digital image. In response to the light-intensity information, the eyetracker determines a range of pupil sizes. The search algorithm applies the range of pupil sizes in such a manner that a detected pupil candidate must have a size within the range of pupil sizes to be accepted by the search algorithm as a valid pupil of the subject.
Type: Application
Filed: June 26, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Richard Andersson
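The acceptance test is easy to illustrate: map the measured light intensity to an expected pupil-size range (brighter scenes imply smaller pupils) and reject candidates outside it. The numeric mapping below is an assumption chosen to stay within physiologically plausible diameters, not the patented model.

```python
# A rough sketch of the size-range filtering in 20210011551.
def pupil_size_range(light_intensity):
    """Map normalised light intensity (0 = dark, 1 = bright) to a pupil diameter
    range in millimetres; brighter scenes imply smaller pupils."""
    max_d = 8.0 - 4.5 * light_intensity
    min_d = 2.0 + 1.0 * (1.0 - light_intensity)
    return min_d, max_d

def accept_candidates(candidate_diameters_mm, light_intensity):
    lo, hi = pupil_size_range(light_intensity)
    return [d for d in candidate_diameters_mm if lo <= d <= hi]

print(accept_candidates([1.5, 3.2, 6.8], light_intensity=0.8))  # [3.2]
```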
-
Publication number: 20210011682
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Application
Filed: April 28, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
-
Publication number: 20210011550
Abstract: The disclosure relates to a method performed by a computer for identifying a space that a user of a gaze tracking system is viewing, the method comprising obtaining gaze tracking sensor data, generating gaze data comprising a probability distribution by processing the sensor data with a trained model, and identifying a space that the user is viewing using the probability distribution.
Type: Application
Filed: June 15, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Patrik Barkman, Anders Dahl, Oscar Danielsson, Tommaso Martini, Mårten Nilsson
-
Publication number: 20210014442
Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
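The saving comes from reading only the pixels inside a window around the tracked eye element instead of the full sensor area. The sketch below shows that windowing in plain Python; the square ROI shape, the margin value, and the clamping behaviour are assumptions for illustration.

```python
# Minimal sketch of the ROI read-out in 20210014442: only the pixels around the
# tracked eye element are read, cutting read-out time and power.
def roi_for_eye_element(position_xy, sensor_width, sensor_height, margin=60):
    """Clamp a square region of interest around the eye element's position."""
    x, y = position_xy
    left, top = max(0, x - margin), max(0, y - margin)
    right, bottom = min(sensor_width, x + margin), min(sensor_height, y + margin)
    return left, top, right, bottom

def read_roi(full_frame, roi):
    left, top, right, bottom = roi
    return [row[left:right] for row in full_frame[top:bottom]]  # ROI image only

roi = roi_for_eye_element((500, 300), sensor_width=1280, sensor_height=800)
print(roi)   # (440, 240, 560, 360): a 120x120 window instead of 1280x800
```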
-
Publication number: 20210012161
Abstract: Techniques for generating 3D gaze predictions based on a deep learning system are described. In an example, the deep learning system includes a neural network. The neural network is trained with training images generated by cameras and showing eyes of users while gazing at stimulus points. Some of the stimulus points are in the planes of the cameras. Remaining stimulus points are not in the planes of the cameras. The training includes inputting a first training image associated with a stimulus point in a camera plane and inputting a second training image associated with a stimulus point outside the camera plane. The training minimizes a loss function of the neural network based on a distance between at least one of the stimulus points and a gaze line.
Type: Application
Filed: June 2, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Erik Linden
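The loss term described is the distance between a stimulus point and the predicted gaze line, which works whether or not the stimulus lies in a camera plane. The NumPy sketch below computes that point-to-line distance; representing the gaze line as an origin plus a direction vector is an assumption about the parameterisation, and this is an illustrative reading rather than the patented training procedure.

```python
# Sketch of a point-to-gaze-line distance, the quantity the 20210012161 loss is
# described as minimising.
import numpy as np

def point_to_gaze_line_distance(stimulus, gaze_origin, gaze_direction):
    """Shortest distance from a 3D stimulus point to the predicted gaze line."""
    d = gaze_direction / np.linalg.norm(gaze_direction)
    v = stimulus - gaze_origin
    return np.linalg.norm(v - np.dot(v, d) * d)   # remove the along-line component

stimulus = np.array([0.0, 0.0, 0.6])              # a point off the camera plane
origin = np.array([0.0, -0.03, 0.0])              # predicted eye position
direction = np.array([0.0, 0.05, 1.0])            # predicted gaze direction
print(point_to_gaze_line_distance(stimulus, origin, direction))  # loss term
```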