Patents Assigned to TOBII AB
-
Publication number: 20220207768
Abstract: An eye tracking system configured to: receive a plurality of right-eye-images of a right eye of a user; receive a plurality of left-eye-images of a left eye of a user, each left-eye-image corresponding to a right-eye-image in the plurality of right-eye-images; detect a pupil and determine an associated pupil-signal, for each of the plurality of right-eye-images and each of the plurality of left-eye-images; calculate a right-eye-pupil-variation of the pupil-signals for the plurality of right-eye-images and a left-eye-pupil-variation of the pupil-signals for the plurality of left-eye-images; and determine a right-eye-weighting and a left-eye-weighting based on the right-eye-pupil-variation and the left-eye-pupil-variation.
Type: Application
Filed: December 29, 2020
Publication date: June 30, 2022
Applicant: Tobii AB
Inventors: Mikael Rosell, Simon Johansson, Pravin Kumar Rana, Yimu Wang, Gilfredo Remon Salazar
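The abstract does not specify how the weightings are derived from the pupil variations; as one hedged illustration, the sketch below assumes inverse-variance weighting, where the eye with the steadier pupil signal is trusted more. The function name and the small regularizer are illustrative, not from the patent.

```python
import numpy as np

def eye_weights(right_pupil_signals, left_pupil_signals):
    """Illustrative weighting: the eye whose pupil signal varies less
    (e.g. because detection is more stable) receives a higher weight."""
    right_var = np.var(right_pupil_signals)
    left_var = np.var(left_pupil_signals)
    # Inverse-variance weighting, normalised so the two weights sum to 1.
    inv = np.array([1.0 / (right_var + 1e-9), 1.0 / (left_var + 1e-9)])
    right_w, left_w = inv / inv.sum()
    return right_w, left_w

# Example: the right eye's pupil estimate is noisier, so it gets less weight.
print(eye_weights([3.0, 3.4, 2.7, 3.6], [3.1, 3.2, 3.1, 3.2]))
```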
-
Publication number: 20220197029
Abstract: The invention is related to a head-mounted display system comprising a display for displaying an image and a lens through which a user views the display. A sensor system detects an eye relief and outputs a signal to one or more processors that modify an output of the display in response to the eye relief signal. An equivalent method is also disclosed.
Type: Application
Filed: December 22, 2021
Publication date: June 23, 2022
Applicant: Tobii AB
Inventor: Daniel Tornéus
-
Patent number: 11366329
Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user, and the second input modality is an input modality other than an eye tracker, configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
Type: Grant
Filed: April 30, 2021
Date of Patent: June 21, 2022
Assignee: Tobii AB
Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
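A minimal sketch of the switching rule described above, assuming the interactable region can be modelled as an axis-aligned box and the pointing ray tested with a standard slab intersection; the class and function names are placeholders, not the patent's terminology.

```python
from dataclasses import dataclass

@dataclass
class Ray:
    origin: tuple     # (x, y, z)
    direction: tuple  # ray direction

@dataclass
class Region:
    min_corner: tuple
    max_corner: tuple

def ray_intersects_region(ray: Ray, region: Region) -> bool:
    """Standard slab test for a ray against an axis-aligned box."""
    t_min, t_max = 0.0, float("inf")
    for o, d, lo, hi in zip(ray.origin, ray.direction, region.min_corner, region.max_corner):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_min, t_max = max(t_min, min(t1, t2)), min(t_max, max(t1, t2))
            if t_min > t_max:
                return False
    return True

def select_modality(current, pointing_ray, interactable_region):
    # Switch away from eye tracking only when the second modality's
    # pointing ray actually hits the interactable region.
    if current == "eye_tracker" and ray_intersects_region(pointing_ray, interactable_region):
        return "second_modality"
    return current
```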
-
Publication number: 20220179487
Abstract: The invention relates to an eye tracking device comprising one or more illuminators, each illuminator comprising a light emitting side and each illuminator being connected to a first circuitry carrier, and an imaging module connected to a second circuitry carrier, wherein the imaging module comprises optical arrangements. The plurality of illuminators, the imaging module and the circuitry carriers are embedded without gaps in a first material. The invention further relates to methods for manufacturing an eye tracking device with over-molded components.
Type: Application
Filed: February 22, 2022
Publication date: June 9, 2022
Applicant: Tobii AB
Inventors: Eli Lundberg, Richard Hainzl, Daniel Torneus
-
Publication number: 20220179207
Abstract: The disclosure relates to an eye tracking device for tracking movements of an eye, comprising a viewing plane for displaying a projection of an image to the eye of a user, an image module placed on the same side of the viewing plane as the eye, at least one illuminator for illuminating the eye, a control unit adapted to receive an image captured by the image module and calculate a viewing angle of the eye, and a holographic optical element (HOE), wherein the HOE is placed between the eye and the viewing plane, wherein the image module is adapted to capture an image of the HOE, and wherein the HOE is adapted to direct at least a first portion of incident light reflected from the eye in a first angle towards the image module, the first angle being different from an angle of incidence of the incident light.
Type: Application
Filed: February 22, 2022
Publication date: June 9, 2022
Applicant: Tobii AB
Inventors: Daniel Torneus, Peter Schef, Magnus Arvidsson, Peter Blixt, Fredrik Mattinson
-
Publication number: 20220180532
Abstract: An eye tracking system comprises at least one illuminator and at least one image sensor configured to produce an image of an eye of a user, the image including illuminator light reflected from the eye of a user. A Fresnel lens is positioned between the image sensor and the eye of the user, through which the image sensor views the eye. Processing circuitry receives an image from the image sensor, identifies glints in the image, assigns an angular position to each glint based on an angular relationship between each glint and a centre of the Fresnel lens and determines how many glints have the same angular position. Glints are classified as false glints if more than a predetermined number of glints have the same angular position.
Type: Application
Filed: December 7, 2021
Publication date: June 9, 2022
Applicant: Tobii AB
Inventors: Joakim Zachrisson, Simon Johansson, Mikael Rosell
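A hedged sketch of the glint classification, assuming "angular position" means the glint's angle about the Fresnel lens centre quantised into bins, and assuming a threshold of one glint per angle; the bin width and threshold are illustrative parameters only.

```python
import math
from collections import Counter

def classify_false_glints(glints, lens_centre, bin_degrees=5.0, max_per_angle=1):
    """Assign each glint an angular position about the lens centre and flag
    glints as false when too many share the same (binned) angle, e.g. glints
    lying along the same radial line from the lens centre."""
    cx, cy = lens_centre
    bins = []
    for gx, gy in glints:
        angle = math.degrees(math.atan2(gy - cy, gx - cx)) % 360.0
        bins.append(round(angle / bin_degrees))
    counts = Counter(bins)
    return [counts[b] > max_per_angle for b in bins]

# Example: three glints lined up radially from the lens centre share one angle
# and are flagged as false; the fourth glint is kept.
print(classify_false_glints([(10, 0), (20, 0), (30, 0), (0, 15)], lens_centre=(0, 0)))
```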
-
Patent number: 11353952
Abstract: Techniques for controlling light sources used in eye tracking are described. In an example, an eye tracking system generates a first image and a second image showing at least a portion of the user eye illuminated by a predetermined set of illuminators of the eye tracking system. The eye tracking system determines a first position of a glint in the first image and a second position of the glint in the second image. Each of the first position and the second position is relative to a pupil edge. The eye tracking system predicts a third position of the glint relative to the pupil edge based on the first position and the second position. Further, the eye tracking system determines, from the predetermined set, an illuminator that corresponds to the glint and determines, based on the third position, whether to power off the illuminator to generate a third image of at least the portion of the user eye.
Type: Grant
Filed: November 26, 2018
Date of Patent: June 7, 2022
Assignee: Tobii AB
Inventors: Daniel Johansson Tornéus, Andreas Klingström, Martin Skärbäck
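The prediction step is not spelled out in the abstract; the sketch below assumes simple linear extrapolation of the glint's offset from the pupil edge across two frames, with an illustrative distance threshold for deciding whether to switch the illuminator off.

```python
def predict_glint_offset(offset1, offset2):
    """Linearly extrapolate the glint's next offset from the pupil edge,
    given its offsets in two consecutive images."""
    return tuple(2 * b - a for a, b in zip(offset1, offset2))

def should_power_off(offset1, offset2, min_distance_px=3.0):
    """Power the illuminator off if its glint is predicted to drift onto
    (or too close to) the pupil edge, where it would disturb pupil detection."""
    px, py = predict_glint_offset(offset1, offset2)
    return (px * px + py * py) ** 0.5 < min_distance_px

# Glint moving towards the pupil edge: 8 px away, then 5 px away, so the
# predicted next offset is 2 px -> switch the illuminator off.
print(should_power_off((8.0, 0.0), (5.0, 0.0)))
```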
-
Publication number: 20220171203
Abstract: The present invention relates to a lens for eye-tracking applications. The lens comprises a first protective layer, arranged to face towards the eye to be tracked when the lens is used for eye-tracking. It also comprises at least one light source, at least partly arranged in the first protective layer, arranged to emit a first light from the first protective layer in a direction towards the eye. Moreover, it comprises at least one image capturing device, at least partly arranged in the first protective layer, arranged to receive the first light within the first protective layer. The lens further comprises an absorptive layer, arranged on the far side of the first protective layer seen from the eye to be tracked when the lens is used for eye-tracking, adapted to be absorptive for wavelengths of the majority of the first light.
Type: Application
Filed: February 16, 2022
Publication date: June 2, 2022
Applicant: Tobii AB
Inventors: Axel Tollin, Daniel Ljunggren
-
Patent number: 11344196
Abstract: A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
Type: Grant
Filed: July 30, 2020
Date of Patent: May 31, 2022
Assignee: Tobii AB
Inventors: Simon Gustafsson, Anders Kingbäck, Markus Cederlund
-
Publication number: 20220147141
Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the first input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
Type: Application
Filed: July 27, 2021
Publication date: May 12, 2022
Applicant: Tobii AB
Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingback, Anders Clausen
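A minimal sketch of the gaze-conditional remapping, under the assumption that the action is chosen by a simple dispatch on whether the gaze target is the virtual object; all names are hypothetical.

```python
def handle_input(gaze_target, virtual_object, first_input, default_action, object_action):
    """If the user's gaze is on the virtual object, remap the input device's
    first input to an interaction with that object; otherwise keep the
    default action."""
    if gaze_target is virtual_object:
        return object_action(virtual_object, first_input)
    return default_action(first_input)

# Example: a trigger press rotates the gazed-at object instead of firing.
rotate = lambda obj, inp: f"rotate {obj} on {inp}"
fire = lambda inp: f"fire on {inp}"
print(handle_input("cube", "cube", "trigger", fire, rotate))
```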
-
Patent number: 11327313
Abstract: The present disclosure relates to a method for displaying an image with a specific depth of field. The method comprises the steps of obtaining information data related to a focal distance adapted to a user gazing at a display, determining a pupil size of said user, estimating a depth of field of said user's eyes based on said focal distance and said pupil size, and rendering an image based on said depth of field to be displayed on said display. Further, the present disclosure relates to a system, a head-mounted display and a non-transitory computer readable medium.
Type: Grant
Filed: March 30, 2020
Date of Patent: May 10, 2022
Assignee: Tobii AB
Inventor: Denny Rönngren
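The abstract does not give the depth-of-field model; as an illustration, the sketch below uses a standard thin-lens approximation in which depth of field shrinks as the pupil widens. The eye focal length and acceptable retinal blur circle are assumed constants, not values from the patent.

```python
def depth_of_field_m(focal_distance_m, pupil_diameter_m,
                     eye_focal_length_m=0.017, retinal_blur_limit_m=5e-6):
    """Thin-lens approximation of the eye's total depth of field:
    DoF ~ 2 * c * s^2 / (A * f), where s is the focal distance, A the pupil
    diameter, f the eye's focal length and c the acceptable blur circle."""
    return (2.0 * retinal_blur_limit_m * focal_distance_m ** 2
            / (pupil_diameter_m * eye_focal_length_m))

# A larger pupil (dim scene) gives a shallower depth of field, so the renderer
# would blur out-of-focus content more aggressively.
print(depth_of_field_m(0.5, 0.003))   # ~0.05 m
print(depth_of_field_m(0.5, 0.006))   # ~0.025 m
```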
-
Publication number: 20220137704
Abstract: Images of an eye are captured by a camera. For each of the images, gaze data is obtained and a position of a pupil center is estimated in the image. The gaze data indicates a gaze point and/or gaze direction of the eye when the image was captured. A mapping is calibrated using the obtained gaze data and the estimated positions of the pupil center. The mapping maps positions of the pupil center in images captured by the camera to gaze points at a surface, or to gaze directions. A further image of the eye is captured by the camera. A position of the pupil center is estimated in the further image. Gaze tracking is performed using the calibrated mapping and the estimated position of the pupil center in the further image. These steps may for example be performed at an HMD.
Type: Application
Filed: January 13, 2022
Publication date: May 5, 2022
Applicant: Tobii AB
Inventors: Tiesheng Wang, Gilfredo Remon Salazar, Yimu Wang, Pravin Kumar Rana, Johannes Kron, Mark Ryan, Torbjorn Sundberg
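The patent leaves the mapping's form open; one possible form is an affine least-squares fit from pupil-centre pixel coordinates to 2-D gaze points on a surface, and the sketch below illustrates only that assumption.

```python
import numpy as np

def calibrate_mapping(pupil_centers, gaze_points):
    """Least-squares affine fit M mapping homogeneous pupil-centre pixel
    coordinates (x, y, 1) to 2-D gaze points on a surface."""
    P = np.hstack([np.asarray(pupil_centers, float), np.ones((len(pupil_centers), 1))])
    G = np.asarray(gaze_points, float)
    M, *_ = np.linalg.lstsq(P, G, rcond=None)
    return M  # shape (3, 2)

def apply_mapping(M, pupil_center):
    """Map the pupil centre estimated in a further image to a gaze point."""
    x, y = pupil_center
    return np.array([x, y, 1.0]) @ M

# Calibrate on a few image/gaze pairs, then track gaze in a new image.
M = calibrate_mapping([(100, 80), (140, 82), (120, 110)],
                      [(0.2, 0.3), (0.8, 0.3), (0.5, 0.8)])
print(apply_mapping(M, (130, 95)))
```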
-
Patent number: 11320974
Abstract: Visualizable data are obtained that represent a scene with at least one object. The visualizable data describe the scene as seen from a position. First and second measures are determined, which represent extensions of one of the objects in a smallest and a largest dimension respectively. An object aspect ratio is calculated that represents a relationship between the first and second measures. Based on the object aspect ratio, a selection margin is assigned to the object. The selection margin designates a zone outside of the object within which zone the object is validly selectable for manipulation in addition to an area of the object shown towards a view thereof as seen from the position. Thus, it is made easier to manipulate the visualizable data in response to user input, for instance in the form of gaze-based selection commands.
Type: Grant
Filed: March 31, 2021
Date of Patent: May 3, 2022
Assignee: Tobii AB
Inventors: Robin Thunström, Staffan Widegarn Åhlvik
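A hedged sketch of the margin assignment, assuming the selection margin grows linearly with the object aspect ratio so elongated (thin) objects get a larger selectable zone; the base margin and scale factor are illustrative values only.

```python
def selection_margin(smallest_extent, largest_extent, base_margin=2.0, scale=1.0):
    """Assign a larger selection margin to elongated objects: the aspect
    ratio (largest / smallest extent) inflates the zone around the object
    within which a gaze-based selection still counts as a hit."""
    aspect_ratio = largest_extent / smallest_extent
    return base_margin + scale * (aspect_ratio - 1.0)

# A thin slider (2 x 40 units) gets a much wider margin than a square button.
print(selection_margin(2.0, 40.0))   # 21.0
print(selection_margin(30.0, 30.0))  # 2.0
```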
-
Publication number: 20220129067
Abstract: A system and a method for determining a gaze direction of a user viewing a scene is provided. The system comprises a camera for obtaining an image of at least one of the user's eyes, a depth information detection device for obtaining depth data related to the image, and a processing unit. The processing unit is configured to define a surface at a predetermined position relative to the user's eye based on the depth data, obtain a normalized image by projecting the image onto the surface and determine a normalized gaze direction based on the normalized image. The processing unit is further configured to determine a gaze direction based on the normalized gaze direction and the depth data.
Type: Application
Filed: March 29, 2018
Publication date: April 28, 2022
Applicant: Tobii AB
Inventors: Oscar Danielsson, Daniel Johansson Torneus
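The projection and gaze estimation steps need real image data, so the sketch below only illustrates the final de-normalization step: rotating a gaze direction estimated in the normalized frame back into camera coordinates using the eye's direction, which is known from the depth data. The Rodrigues helper and all names are assumptions, not the patent's method.

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues form)."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    v, c = np.cross(a, b), float(np.dot(a, b))
    if np.isclose(c, -1.0):
        raise ValueError("opposite vectors: a 180-degree axis must be chosen explicitly")
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def denormalize_gaze(normalized_gaze, eye_direction_from_camera):
    """The normalized image places the eye on the camera's optical axis; rotate
    the normalized gaze estimate back using the eye's true direction."""
    R = rotation_between(np.array([0.0, 0.0, 1.0]),
                         np.asarray(eye_direction_from_camera, float))
    return R @ np.asarray(normalized_gaze, float)

# Eye located 20 degrees off the camera axis; a gaze straight "into" the
# normalized image is rotated back into camera coordinates.
print(denormalize_gaze([0.0, 0.0, -1.0],
                       [np.sin(np.radians(20)), 0.0, np.cos(np.radians(20))]))
```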
-
Publication number: 20220130107
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Application
Filed: January 11, 2022
Publication date: April 28, 2022
Applicant: Tobii AB
Inventor: Fredrik Lindh
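A minimal sketch of the trace-and-vote idea, assuming the line traces are rays jittered around the gaze direction and the confidence value is simply the number of traces hitting each component; scene intersection is abstracted behind a callable, and all parameters are illustrative.

```python
import random

def focus_target(gaze_origin, gaze_direction, trace_scene, n_traces=16, spread=0.02):
    """Fire several line traces scattered around the gaze direction; the
    component intersected by the most traces is taken as the focus target.
    `trace_scene(origin, direction)` should return the hit component or None."""
    confidence = {}
    for _ in range(n_traces):
        jittered = tuple(d + random.uniform(-spread, spread) for d in gaze_direction)
        hit = trace_scene(gaze_origin, jittered)
        if hit is not None:
            confidence[hit] = confidence.get(hit, 0) + 1
    return max(confidence, key=confidence.get) if confidence else None

# Toy scene: any trace with direction x > 0.01 hits the "lamp", else the "wall".
scene = lambda origin, direction: "lamp" if direction[0] > 0.01 else "wall"
print(focus_target((0.0, 0.0, 0.0), (0.05, 0.0, 1.0), scene))
```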
-
Patent number: 11308321
Abstract: There is provided a method, system, and non-transitory computer-readable storage medium for performing three-dimensional, 3D, position estimation for the cornea center of an eye of a user, using a remote eye tracking system, wherein the position estimation is reliable and robust also when the cornea center moves over time in relation to an imaging device associated with the eye tracking system. This is accomplished by generating, using, and optionally also updating, a cornea movement filter, CMF, in the cornea center position estimation.
Type: Grant
Filed: June 29, 2020
Date of Patent: April 19, 2022
Assignee: Tobii AB
Inventors: David Masko, Magnus Ivarsson, Niklas Ollesson, Anna Redz
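The abstract does not disclose the CMF's internal form; purely as an illustration, the sketch below stands in an alpha-beta tracker that smooths per-frame cornea-centre estimates and damps implausible jumps. The gains and example coordinates are arbitrary.

```python
import numpy as np

class CorneaMovementFilter:
    """Illustrative stand-in for a cornea movement filter: an alpha-beta
    tracker over the 3-D cornea centre, smoothing noisy per-frame estimates."""
    def __init__(self, alpha=0.5, beta=0.1):
        self.alpha, self.beta = alpha, beta
        self.position = None          # filtered 3-D cornea centre
        self.velocity = np.zeros(3)   # metres per frame

    def update(self, measured_center):
        z = np.asarray(measured_center, float)
        if self.position is None:
            self.position = z
            return self.position
        predicted = self.position + self.velocity
        residual = z - predicted
        self.position = predicted + self.alpha * residual
        self.velocity = self.velocity + self.beta * residual
        return self.position

cmf = CorneaMovementFilter()
for frame in [(0.0, 0.0, 0.60), (0.001, 0.0, 0.60), (0.03, 0.0, 0.60)]:
    print(cmf.update(frame))  # the 3 cm jump in the last frame is damped
```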
-
Patent number: 11294460
Abstract: A method for detecting an eye event of a user using an eye tracking system, the method comprising capturing a first image of a first eye of a user, capturing an image of a second eye of the user a first period after capturing the first image of the first eye and a second period before capturing a next image of the first eye, capturing a second image of the first eye the second period after capturing the image of the second eye, determining that an eye event has occurred based on a difference between the first and second images of the first eye, and performing at least one action if it is determined that an eye event has occurred.
Type: Grant
Filed: December 10, 2020
Date of Patent: April 5, 2022
Assignee: Tobii AB
Inventor: Andreas Klingström
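A hedged sketch of the interleaved capture and difference test; the difference metric (mean absolute pixel difference over numpy arrays) and its threshold are assumptions, since the abstract only requires "a difference between the first and second images".

```python
import numpy as np

def detect_eye_event(first_image, second_image, threshold=10.0):
    """Compare two images of the same (first) eye, captured either side of a
    second-eye image, and report an eye event (e.g. a blink or saccade) when
    the mean absolute pixel difference exceeds a threshold."""
    diff = np.mean(np.abs(first_image.astype(float) - second_image.astype(float)))
    return diff > threshold

def run_interleaved(capture_first_eye, capture_second_eye, on_event):
    # Capture order: first eye, second eye, first eye again.
    img1 = capture_first_eye()
    _ = capture_second_eye()      # second-eye frame captured in between
    img2 = capture_first_eye()
    if detect_eye_event(img1, img2):
        on_event()
```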
-
Publication number: 20220100455
Abstract: The present invention generally relates to systems and methods for interaction with devices containing dual displays, and in particular, to systems and methods for enabling or altering the functionality of a secondary display based on a user's attention.
Type: Application
Filed: July 1, 2020
Publication date: March 31, 2022
Applicant: Tobii AB
Inventors: Sourabh Pateriya, Deepak Akkil, Onur Kurt, Erland George-Svahn
-
Publication number: 20220083799
Abstract: An eye tracking system is provided that detects the presence of problematic blobs in an image captured by the system and removes these problematic blobs by switching off illuminators. Problematic blobs may be those obscuring the pupil of the eye of the user. Each blob is detected in a first image by the use of at least one first criterion, and then an illuminator is switched off. After the illuminator is switched off, at least one second criterion is used to identify blobs in a subsequent image. This process may be repeated until the illuminator causing the problematic blob is identified.
Type: Application
Filed: August 27, 2021
Publication date: March 17, 2022
Applicant: Tobii AB
Inventors: Viktor Wase, Erik Ljungzell, Mark Ryan, Chiara Giordano, Rickard Lundahl, Pravin Kumar Rana
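A sketch of the iterative switch-off loop, with blob detection and the two criteria abstracted behind callables; the stopping rule (the culprit is found when the count of pupil-obscuring blobs drops after switching an illuminator off) follows the abstract, while everything else is illustrative.

```python
def find_problematic_illuminator(illuminators, capture_image, detect_blobs_first,
                                 detect_blobs_second, obscures_pupil):
    """Switch illuminators off one at a time; if a problematic blob (one that
    obscures the pupil, per the first criterion) disappears from the follow-up
    image (checked with the second criterion), the illuminator just switched
    off is the one causing it."""
    first_image = capture_image()
    problem_blobs = [b for b in detect_blobs_first(first_image) if obscures_pupil(b)]
    if not problem_blobs:
        return None
    for illuminator in illuminators:
        illuminator.off()
        follow_up = capture_image()
        remaining = [b for b in detect_blobs_second(follow_up) if obscures_pupil(b)]
        if len(remaining) < len(problem_blobs):
            return illuminator          # this illuminator caused a problematic blob
        illuminator.on()                # not the culprit; restore and try the next
    return None
```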
-
Patent number: 11275437
Abstract: A method and a corresponding apparatus for mitigating motion sickness in a virtual reality (VR)/augmented reality (AR) system using a head mounted display (HMD) are disclosed. The method comprises receiving data from a sensor indicating a current orientation of the HMD in real space, and superimposing a visual indication on a display of the HMD. The visual indication provides visual information to a user of the current orientation of the HMD in real space. Furthermore, methods and corresponding apparatuses are disclosed for calculating gaze convergence distance in an eye tracking system, and for gaze-based virtual reality (VR)/augmented reality (AR) menu expansion.
Type: Grant
Filed: October 19, 2020
Date of Patent: March 15, 2022
Assignee: Tobii AB
Inventors: Andreas Klingström, Per Fogelström, Andrew Ratcliff
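The abstract bundles several features; as one concrete piece, the sketch below computes a gaze convergence distance as the distance to the point where the two eyes' gaze rays pass closest to each other. The closest-approach formulation and all names are assumptions, not necessarily the patent's method.

```python
import numpy as np

def gaze_convergence_distance(left_origin, left_dir, right_origin, right_dir):
    """Distance from the midpoint between the eyes to the point where the two
    gaze rays pass closest to each other (they rarely intersect exactly)."""
    o1, d1 = np.asarray(left_origin, float), np.asarray(left_dir, float)
    o2, d2 = np.asarray(right_origin, float), np.asarray(right_dir, float)
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|.
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom if denom > 1e-12 else 0.0
    t2 = (a * e - b * d) / denom if denom > 1e-12 else 0.0
    closest = ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0
    midpoint = (o1 + o2) / 2.0
    return float(np.linalg.norm(closest - midpoint))

# Eyes 6 cm apart, both verging on a point 0.5 m straight ahead.
print(gaze_convergence_distance((-0.03, 0, 0), (0.03, 0, 0.5),
                                (0.03, 0, 0), (-0.03, 0, 0.5)))   # ~0.5
```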