Patents Assigned to TOBII AB
-
Patent number: 11551376
Abstract: There is provided a method and system for determining if a head-mounted device for extended reality (XR) is correctly positioned on a user, and optionally performing a position correction procedure if the head-mounted device is determined to be incorrectly positioned on the user. Embodiments include: performing eye tracking by estimating, based on a first image of a first eye of the user, a position of a pupil in two dimensions; determining whether the estimated position of the pupil of the first eye is within a predetermined allowable area in the first image; and, if the determined position of the pupil of the first eye is inside the predetermined allowable area, concluding that the head-mounted device is correctly positioned on the user; or, if the determined position of the pupil of the first eye is outside the predetermined allowable area, concluding that the head-mounted device is incorrectly positioned on the user.
Type: Grant
Filed: October 25, 2019
Date of Patent: January 10, 2023
Assignee: TOBII AB
Inventors: Joakim Zachrisson, Mikael Rosell, Carlos Pedreira, Mark Ryan, Simon Johansson
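A minimal sketch of the allowable-area check this abstract describes, assuming a simple axis-aligned rectangular area in image pixel coordinates; the names, data types, and numeric values are illustrative, not taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned allowable area in image (pixel) coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def headset_correctly_positioned(pupil_xy: tuple[float, float], allowed: Rect) -> bool:
    """Return True if the estimated 2-D pupil position in the first-eye image
    falls inside the predetermined allowable area."""
    x, y = pupil_xy
    return allowed.contains(x, y)


# Example: a 640x480 eye image with a centered 200x150 allowable area.
allowed_area = Rect(220, 165, 420, 315)
print(headset_correctly_positioned((300.0, 240.0), allowed_area))  # True
print(headset_correctly_positioned((50.0, 240.0), allowed_area))   # False -> trigger position correction
```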
-
Patent number: 11493766
Abstract: A method for controlling the transparency level of a transparent displaying device arranged to display one or more virtual objects, the method comprising the steps of: obtaining a gaze point of a user; obtaining a position of at least one virtual object displayed by the displaying device; determining whether the attention of the user is directed to the virtual object based on the obtained gaze point; and if so, adjusting a transparency level of the displaying device. A system operative for controlling the transparency level in a displaying device, as well as a displaying device comprising such a system, are also disclosed.
Type: Grant
Filed: December 17, 2020
Date of Patent: November 8, 2022
Assignee: Tobii AB
Inventor: Henrik Andersson
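A rough sketch of the gaze-attention test and transparency adjustment, assuming the attention test is a 2-D point-in-bounding-box check and that two fixed transparency levels are used; both assumptions are illustrative simplifications, not the patent's method:

```python
def attention_on_object(gaze_point: tuple[float, float],
                        object_bounds: tuple[float, float, float, float]) -> bool:
    """Attention test: is the 2-D gaze point inside the virtual object's
    on-screen bounding box? (The abstract leaves the exact test open.)"""
    gx, gy = gaze_point
    x0, y0, x1, y1 = object_bounds
    return x0 <= gx <= x1 and y0 <= gy <= y1


def display_transparency(attended: bool,
                         attended_alpha: float = 0.2,
                         idle_alpha: float = 0.8) -> float:
    """Lower the display's transparency while the user attends to the virtual
    object, so it stands out against the real-world background; otherwise
    keep the display mostly transparent. The two levels are illustrative."""
    return attended_alpha if attended else idle_alpha


gaze = (512.0, 300.0)
virtual_object = (480.0, 260.0, 560.0, 340.0)   # x0, y0, x1, y1 in screen pixels
print(display_transparency(attention_on_object(gaze, virtual_object)))  # 0.2
```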
-
Patent number: 11480804
Abstract: Techniques for distributed foveated rendering based on user gaze are described. In an example, an end user device is communicatively coupled with a remote computer and presents images on a display based on gaze data. The user device receives a low resolution background image and a high resolution foreground image from the remote computer based on the gaze data. The foreground image is constrained to a foveated region according to the gaze data. The end user device generates a composite image by scaling up the background image and overlaying the foreground image. The composite image is then presented on the display.
Type: Grant
Filed: July 20, 2019
Date of Patent: October 25, 2022
Assignee: TOBII AB
Inventor: Ritchie Brannan
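A minimal sketch of the compositing step on the end user device, assuming grayscale frames, nearest-neighbour upscaling, and an integer scale factor; all of these are illustrative choices rather than details from the patent:

```python
import numpy as np


def composite_foveated(background: np.ndarray, foreground: np.ndarray,
                       fovea_top_left: tuple[int, int], scale: int) -> np.ndarray:
    """Scale up the low-resolution background to display resolution
    (nearest neighbour for brevity), then overlay the full-resolution
    foreground patch at the foveated region indicated by the gaze data."""
    up = background.repeat(scale, axis=0).repeat(scale, axis=1)
    y, x = fovea_top_left
    h, w = foreground.shape[:2]
    up[y:y + h, x:x + w] = foreground   # paste the sharp foveal patch
    return up


# Toy 8-bit example: 120x160 background scaled 4x, 100x100 foveal patch.
bg = np.zeros((120, 160), dtype=np.uint8)
fg = np.full((100, 100), 255, dtype=np.uint8)
frame = composite_foveated(bg, fg, fovea_top_left=(200, 300), scale=4)
print(frame.shape)  # (480, 640) -> composite image sent to the display
```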
-
Publication number: 20220326536
Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
Type: Application
Filed: May 23, 2022
Publication date: October 13, 2022
Applicant: Tobii AB
Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
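One way to picture the switching logic, assuming the interactable region is an axis-aligned rectangle lying in a plane of constant z; the geometry, names, and coordinates are illustrative assumptions, not the publication's formulation:

```python
import numpy as np


def ray_hits_region(origin: np.ndarray, direction: np.ndarray,
                    region_z: float, region_min: np.ndarray,
                    region_max: np.ndarray) -> bool:
    """Test whether a pointing ray intersects an axis-aligned interactable
    region in the plane z = region_z (an illustrative simplification)."""
    if abs(direction[2]) < 1e-9:          # ray parallel to the region plane
        return False
    t = (region_z - origin[2]) / direction[2]
    if t < 0:                              # region is behind the ray origin
        return False
    hit = origin[:2] + t * direction[:2]
    return bool(np.all(hit >= region_min) and np.all(hit <= region_max))


def select_modality(ray_origin: np.ndarray, ray_dir: np.ndarray, region) -> str:
    """Switch from the eye tracker to the second modality while its
    pointing ray intersects the interactable region."""
    region_z, region_min, region_max = region
    if ray_hits_region(ray_origin, ray_dir, region_z, region_min, region_max):
        return "second_modality"
    return "eye_tracker"


region = (2.0, np.array([-0.5, -0.5]), np.array([0.5, 0.5]))
print(select_modality(np.zeros(3), np.array([0.0, 0.0, 1.0]), region))  # second_modality
```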
-
Publication number: 20220317768
Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
Type: Application
Filed: March 17, 2022
Publication date: October 6, 2022
Applicant: Tobii AB
Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
-
Publication number: 20220313082
Abstract: The invention is related to an eye tracking system for determining reference gaze data of a user in a scene exposing a pupil of the user.
Type: Application
Filed: March 21, 2022
Publication date: October 6, 2022
Applicant: Tobii AB
Inventors: Henrik Andersson, Deepak Akkil
-
Publication number: 20220318953
Abstract: A method for improved compression and filtering of images for foveated transport applications. The improvements include calculating, along at least one axis of the full-resolution image data set, the distance from the foveal point to the edges of the full-resolution image data set; calculating the distance from each edge of the full-resolution data set to the closest point on a foveal region surrounding the foveal point; calculating the distribution of available space in the adjacent peripheral regions of the image using a compression parameter; and calculating a compression curve wherein the compression of both adjacent peripheral regions is such that neither side of the adjacent peripheral regions is compressed more than the other.
Type: Application
Filed: March 25, 2022
Publication date: October 6, 2022
Applicant: Tobii AB
Inventors: Alexey Bezugly, Dennis Rådell, Ritchie Brannan
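A very rough one-axis sketch of the balanced peripheral compression idea, assuming the compression parameter is the fraction of full resolution kept in the periphery and that applying the same ratio to both flanking strips is what keeps them balanced; the names and formula are illustrative, not the publication's:

```python
def peripheral_compression_1d(image_len: int, fovea_start: int, fovea_end: int,
                              compression: float) -> tuple[float, float]:
    """Compressed (transported) widths of the two peripheral strips that
    flank the foveal region along one axis. Using the same compression
    ratio on both sides ensures neither side is compressed more than the
    other; `compression` is the fraction of full resolution kept."""
    left_full = fovea_start                    # pixels left of the foveal region
    right_full = image_len - fovea_end         # pixels right of the foveal region
    return left_full * compression, right_full * compression


# 1920-pixel axis, foveal region spanning [800, 1100), periphery kept at 40%.
print(peripheral_compression_1d(1920, 800, 1100, 0.4))  # (320.0, 328.0)
```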
-
Publication number: 20220301218
Abstract: Head pose information may be determined using information describing a fixed gaze and image data corresponding to a user's eyes. The head pose information may be determined in a manner that disregards facial features with the exception of the user's eyes. The head pose information may be usable to interact with a user device.
Type: Application
Filed: March 23, 2021
Publication date: September 22, 2022
Applicant: Tobii AB
Inventor: Mårten Skogö
-
Patent number: 11443440
Abstract: A computer-implemented method of selecting a sequence of images of a user's eye for an eye tracking application wherein each image is captured when a stationary stimulus point is displayed to the user, the method comprising: for a plurality of different pairs of the images: comparing the pair of images with each other to determine an image-score that represents a degree of difference between the compared images; and calculating a sequence-score based on the image-scores for the plurality of pairs of images.
Type: Grant
Filed: September 30, 2020
Date of Patent: September 13, 2022
Assignee: Tobii AB
Inventors: Mark Ryan, Oscar Lundqvist, Oscar Nyman
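A minimal sketch of the image-score and sequence-score computation, assuming mean absolute pixel difference as the pairwise comparison and the mean as the aggregation; both are illustrative stand-ins for whatever measures the patented method actually uses:

```python
import itertools

import numpy as np


def image_score(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Degree of difference between two eye images (mean absolute pixel
    difference used here purely for illustration)."""
    return float(np.mean(np.abs(img_a.astype(float) - img_b.astype(float))))


def sequence_score(images: list[np.ndarray]) -> float:
    """Aggregate the image-scores over all distinct pairs in the candidate
    sequence; the mean is one plausible aggregation."""
    scores = [image_score(a, b) for a, b in itertools.combinations(images, 2)]
    return float(np.mean(scores))


# Toy example: three 32x32 eye images captured at stationary stimulus points.
rng = np.random.default_rng(0)
candidate = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(3)]
print(round(sequence_score(candidate), 2))
```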
-
Patent number: 11438542
Abstract: A computer-implemented method for controlling read-out from a digital image sensor device, comprising a plurality of pixels, the method comprising the steps of setting a first read-out scheme, based on a first level of pixel binning and/or pixel skipping, reading, based on the first read-out scheme, from the digital image sensor device, a first image, determining an exposure value for the first image, based on the intensity value of each one of the first plurality of regions of the first image, and comparing the exposure value with a predetermined maximum value. A second read-out scheme based on a second level of pixel binning and/or pixel skipping is set. The level of pixel binning and/or pixel skipping in the second read-out scheme is increased compared to the first read-out scheme, if the exposure value is higher than the predetermined maximum value. Based on the second read-out scheme, a subsequent second image is read. A system configured to perform the method is also described.
Type: Grant
Filed: June 19, 2020
Date of Patent: September 6, 2022
Assignee: Tobii AB
Inventors: Niklas Ollesson, Magnus Ivarsson, Viktor Åberg, Anna Redz
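A small sketch of the exposure-driven read-out decision, assuming the exposure value is the mean region intensity and that binning/skipping are stepped up by fixed amounts; the concrete metric and step sizes are illustrative assumptions:

```python
from dataclasses import dataclass


@dataclass
class ReadoutScheme:
    binning: int    # e.g. 1 = none, 2 = 2x2 binning
    skipping: int   # e.g. 0 = none, 1 = skip every other row/column


def exposure_value(region_intensities: list[float]) -> float:
    """One plausible exposure metric: the mean intensity over the regions
    of the first image."""
    return sum(region_intensities) / len(region_intensities)


def next_readout_scheme(current: ReadoutScheme, exposure: float,
                        max_exposure: float) -> ReadoutScheme:
    """If the exposure value exceeds the predetermined maximum, increase the
    level of binning/skipping for the second read-out scheme; otherwise keep
    the current scheme."""
    if exposure > max_exposure:
        return ReadoutScheme(binning=current.binning * 2,
                             skipping=current.skipping + 1)
    return current


scheme = ReadoutScheme(binning=1, skipping=0)
ev = exposure_value([180.0, 210.0, 240.0, 230.0])        # bright first image
print(next_readout_scheme(scheme, ev, max_exposure=200.0))  # binning=2, skipping=1
```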
-
Publication number: 20220265142
Abstract: A portable eye tracker device is disclosed which includes a frame, at least one optics holding member, and a control unit. The frame may be adapted for wearing by a user. The at least one optics holding member may include at least one illuminator configured to selectively illuminate at least a portion of at least one eye of the user, and at least one image sensor configured to capture image data representing images of at least a portion of at least one eye of the user. The control unit may be configured to control the at least one illuminator for the selective illumination of at least a portion of at least one eye of the user, receive the image data from the at least one image sensor, and calibrate at least one illuminator, at least one image sensor, or an algorithm of the control unit.
Type: Application
Filed: May 12, 2022
Publication date: August 25, 2022
Applicant: Tobii AB
Inventors: Simon Gustafsson, Anders Kingbäck, Markus Cederlund
-
Patent number: 11423516
Abstract: There are provided systems, methods and computer program products for generating motion blur on image frames, comprising: obtaining gaze data related to an eye movement between consecutive images, determining movement of at least one object in relation to said gaze data by calculating the difference in position of said at least one object and said gaze data between the image frames, forming a motion blur vector and applying a motion blur on an image frame based on said motion blur vector.
Type: Grant
Filed: March 30, 2020
Date of Patent: August 23, 2022
Assignee: Tobii AB
Inventor: Denny Rönngren
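A minimal sketch of forming the motion blur vector, assuming screen-space pixel positions and that the vector is simply the object's frame-to-frame displacement relative to the gaze displacement; the formula and names are illustrative, not the patent's exact definition:

```python
import numpy as np


def motion_blur_vector(obj_prev: np.ndarray, obj_curr: np.ndarray,
                       gaze_prev: np.ndarray, gaze_curr: np.ndarray) -> np.ndarray:
    """Blur vector for an object between consecutive frames: the change in
    object position minus the change in gaze position, i.e. how the object
    moves relative to where the eye is looking. A blur kernel would then be
    applied along this vector when rendering the frame."""
    return (obj_curr - obj_prev) - (gaze_curr - gaze_prev)


obj_a, obj_b = np.array([100.0, 200.0]), np.array([140.0, 200.0])
gaze_a, gaze_b = np.array([110.0, 205.0]), np.array([115.0, 205.0])
print(motion_blur_vector(obj_a, obj_b, gaze_a, gaze_b))  # [35.  0.] -> blur along x
```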
-
Publication number: 20220261079
Abstract: Techniques for controlling light sources used in eye tracking are described. In an example, an eye tracking system generates a first image and a second image showing at least a portion of the user eye illuminated by a predetermined set of illuminators of the eye tracking system. The eye tracking system determines a first position of a glint in the first image and a second position of the glint in the second image. Each of the first position and the second position is relative to a pupil edge. The eye tracking system predicts a third position of the glint relative to the pupil edge based on the first position and the second position. Further, the eye tracking system determines, from the predetermined set, an illuminator that corresponds to the glint and determines, based on the third position, whether to power off the illuminator to generate a third image of at least the portion of the user eye.
Type: Application
Filed: May 9, 2022
Publication date: August 18, 2022
Applicant: Tobii AB
Inventors: Daniel Johansson Tornéus, Andreas Klingstrom, Martin Skarback
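A small sketch of the prediction and power-off decision, assuming constant-velocity linear extrapolation for the third glint position and a simple distance-to-pupil-edge criterion for switching the illuminator off; both are illustrative assumptions rather than the publication's criteria:

```python
import numpy as np


def predict_glint_position(p1: np.ndarray, p2: np.ndarray) -> np.ndarray:
    """Predict the glint's position (relative to the pupil edge) in the next
    frame from its positions in the two previous frames, assuming constant
    velocity between frames."""
    return p2 + (p2 - p1)


def should_power_off(predicted: np.ndarray, pupil_margin: float) -> bool:
    """Illustrative rule: power off the corresponding illuminator if the
    predicted glint would drift too close to (or onto) the pupil edge, where
    it could disturb pupil detection in the third image."""
    return float(np.linalg.norm(predicted)) < pupil_margin


p_frame1, p_frame2 = np.array([6.0, 2.0]), np.array([4.0, 1.5])
p_frame3 = predict_glint_position(p_frame1, p_frame2)
print(p_frame3, should_power_off(p_frame3, pupil_margin=3.0))  # [2.  1.] True
```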
-
Publication number: 20220253134
Abstract: A computer system can be controlled with non-contact inputs through zonal control. In an embodiment, a non-contact input that is an eye-tracking device is used to track the gaze of a user. A computer's display, and beyond, can be separated into a number of discrete zones according to a configuration. Each zone is associated with a computer function. The zones and/or their functions can, but need not, be indicated to the user. The user can perform the various functions by moving gaze towards the zone associated with that function and providing an activation signal of intent. The activation signal of intent can be a contact-required or non-contact action, such as a button press or dwelling gaze, respectively.
Type: Application
Filed: December 16, 2019
Publication date: August 11, 2022
Applicant: Tobii AB
Inventors: David Figgins Henderek, Anders Olsson, Magnus Carl Olof Sävmarker, Staffan Wingren
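One way to picture zonal control, assuming rectangular zones on the display and a callback per zone; the zone layout, actions, and activation handling are illustrative, not the publication's configuration:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Zone:
    bounds: tuple[int, int, int, int]   # x0, y0, x1, y1 on (or beyond) the display
    action: Callable[[], None]          # computer function associated with the zone


def zone_under_gaze(zones: list[Zone], gaze: tuple[int, int]) -> Zone | None:
    """Return the zone the user's gaze currently falls in, if any."""
    gx, gy = gaze
    for zone in zones:
        x0, y0, x1, y1 = zone.bounds
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return zone
    return None


def on_activation_signal(zones: list[Zone], gaze: tuple[int, int]) -> None:
    """Run the gazed-at zone's function when an activation signal of intent
    (button press, dwelling gaze, etc.) arrives."""
    zone = zone_under_gaze(zones, gaze)
    if zone is not None:
        zone.action()


zones = [Zone((0, 0, 960, 1080), lambda: print("scroll up")),
         Zone((960, 0, 1920, 1080), lambda: print("scroll down"))]
on_activation_signal(zones, gaze=(1200, 400))   # prints "scroll down"
```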
-
Publication number: 20220245288
Abstract: Computer display privacy and security for computer systems. In one aspect, the invention provides a computer-controlled system for regulating the interaction between a computer and a user of the computer based on the environment of the computer and the user. For example, the computer-controlled system provided by the invention comprises an input-output device including an image sensor configured to collect facial recognition data proximate to the computer. The system also includes a user security parameter database encoding security parameters associated with the user; the database is also configured to communicate with the security processor. The security processor is configured to receive the facial recognition data and the security parameters associated with the user, and is further configured to at least partially control the operation of the data input device and the data output device in response to the facial recognition data and the security parameters associated with the user.
Type: Application
Filed: April 22, 2022
Publication date: August 4, 2022
Applicant: Tobii AB
Inventors: William R. Anderson, Steven E. Turner, Steven Pujia
-
Publication number: 20220237849
Abstract: Method for reducing processor load in a system rendering a virtual scene to produce a rendered presentation of said virtual scene, which scene comprises at least one animated object, wherein the system performs said rendering based on said virtual scene which in turn is animated by the system based on a set of predefined animation rules, wherein the method comprises the steps: determining, based on information from a gaze direction detection means, a first zone or point of the virtual scene as a zone to which a gaze of the user is currently directed; determining a relative location or distance of a first object as a location in relation to said first zone or point; and modifying the value of an animation updating frequency of said first object performed by the system as a function of said determined relative location or distance. The invention also relates to a system and to a computer software function.
Type: Application
Filed: June 29, 2020
Publication date: July 28, 2022
Applicant: Tobii AB
Inventor: Fredrik Lindh
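A minimal sketch of modifying the animation updating frequency as a function of distance from the gazed-at zone, assuming a linear falloff from a full rate at the gaze point to a floor in the far periphery; the falloff shape and numeric constants are illustrative choices, not from the publication:

```python
def animation_update_hz(distance_deg: float,
                        max_hz: float = 60.0, min_hz: float = 5.0,
                        falloff_deg: float = 30.0) -> float:
    """Animation updating frequency for an object as a function of its
    angular distance from the user's current gaze point: full rate near the
    gaze point, linearly dropping to a floor in the periphery, which reduces
    processor load for objects the user is not looking at."""
    t = min(max(distance_deg / falloff_deg, 0.0), 1.0)
    return max_hz - t * (max_hz - min_hz)


for d in (0.0, 10.0, 30.0, 50.0):
    print(f"{d:>4} deg -> {animation_update_hz(d):.1f} Hz")
# 0 deg -> 60.0 Hz, 10 deg -> 41.7 Hz, 30+ deg -> 5.0 Hz
```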
-
Publication number: 20220227082
Abstract: A method of spacing an object from a mould for the formation of a multi-layered polymer. The method comprises: attaching at least one spacer to the object; attaching the at least one spacer to the mould; and adding material to the mould to form the multi-layered polymer.
Type: Application
Filed: January 21, 2022
Publication date: July 21, 2022
Applicant: Tobii AB
Inventors: Lutz Körner, Marcin Krajewski, Daniel Piotrowski
-
Publication number: 20220229491
Abstract: A method for detecting an eye event of a user using an eye tracking system, the method comprising capturing a first image of a first eye of a user, capturing an image of a second eye of the user a first period after capturing the first image of the first eye and a second period before capturing a next image of the first eye, capturing a second image of the first eye the second period after capturing the image of the second eye, determining that an eye event has occurred based on a difference between the first and second images of the first eye, and performing at least one action if it is determined that an eye event has occurred.
Type: Application
Filed: April 5, 2022
Publication date: July 21, 2022
Applicant: Tobii AB
Inventor: Andreas Klingstrom
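A minimal sketch of the detection step, assuming the "difference between the first and second images of the first eye" is a mean absolute pixel difference compared against a threshold; both the measure and the threshold are illustrative stand-ins:

```python
import numpy as np


def eye_event_detected(first_img: np.ndarray, second_img: np.ndarray,
                       threshold: float = 10.0) -> bool:
    """Compare the two images of the first eye, captured before and after the
    interleaved second-eye image, and flag an eye event (e.g. a saccade or
    blink) if they differ by more than a threshold."""
    diff = np.mean(np.abs(first_img.astype(float) - second_img.astype(float)))
    return bool(diff > threshold)


rng = np.random.default_rng(1)
frame_a = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(eye_event_detected(frame_a, frame_a.copy()))   # False -> eye unchanged
print(eye_event_detected(frame_a, 255 - frame_a))    # True  -> perform action
```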
-
Publication number: 20220229542
Abstract: Visualizable data (Din) are obtained that represent a scene (S) with at least one object (110, 120, 130). The visualizable data (Din) describe the scene (S) as seen from a position (P). First and second measures (L1; L2) are determined, which represent extensions of one of the objects (110) in a smallest and a largest dimension respectively. An object aspect ratio (R) is calculated that represents a relationship between the first and second measures (L1; L2). Based on the object aspect ratio (R), a selection margin (M) is assigned to the object (110). The selection margin designates a zone outside of the object (110) within which zone the object (110) is validly selectable for manipulation, in addition to an area (A11) of the object (110) shown towards a view (V) thereof as seen from the position (P). Thus, it is made easier to manipulate the visualizable data (Din) in response to user input, for instance in the form of gaze-based selection commands.
Type: Application
Filed: April 5, 2022
Publication date: July 21, 2022
Applicant: Tobii AB
Inventors: Robin Thunstrom, Staffan Widegarn Ahlvik
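A small sketch of assigning the selection margin from the aspect ratio, assuming R = L2 / L1 (largest over smallest extent) and a margin that grows linearly with R so that thin, elongated objects get a wider selectable zone; the formula and base margin are illustrative, not the publication's exact rule:

```python
def selection_margin(smallest_extent: float, largest_extent: float,
                     base_margin: float = 4.0) -> float:
    """Selection margin M assigned from the object aspect ratio R = L2 / L1:
    the thinner the object, the wider the zone outside it that still counts
    as selecting it, which helps gaze-based selection of elongated objects."""
    aspect_ratio = largest_extent / smallest_extent
    return base_margin * aspect_ratio


def is_selected(gaze_distance_to_object: float, margin: float) -> bool:
    """An object is validly selectable when gaze hits its visible area
    (distance 0) or falls within the margin zone around it."""
    return gaze_distance_to_object <= margin


m = selection_margin(smallest_extent=2.0, largest_extent=40.0)   # thin bar
print(m, is_selected(gaze_distance_to_object=50.0, margin=m))    # 80.0 True
```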
-
Patent number: 11386290
Abstract: A method for training an eye tracking model is disclosed, as well as a corresponding system and storage medium. The eye tracking model is adapted to predict eye tracking data based on sensor data from a first eye tracking sensor. The method comprises receiving sensor data obtained by the first eye tracking sensor at a time instance and receiving reference eye tracking data for the time instance generated by an eye tracking system comprising a second eye tracking sensor. The reference eye tracking data is generated by the eye tracking system based on sensor data obtained by the second eye tracking sensor at the time instance. The method comprises training the eye tracking model based on the sensor data obtained by the first eye tracking sensor at the time instance and the generated reference eye tracking data.
Type: Grant
Filed: March 30, 2020
Date of Patent: July 12, 2022
Assignee: Tobii AB
Inventors: Carl Asplund, Patrik Barkman, Anders Dahl, Oscar Danielsson, Tommaso Martini, Mårten Nilsson
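A minimal sketch of the training setup, where a linear model fit by gradient descent stands in for whatever eye tracking model the patent actually trains; the feature layout, learning rate, and synthetic data are illustrative assumptions:

```python
import numpy as np


def train_eye_tracking_model(sensor_samples: np.ndarray,
                             reference_gaze: np.ndarray,
                             lr: float = 0.1, epochs: int = 200) -> np.ndarray:
    """Fit a model mapping first-sensor features to the reference eye
    tracking data produced by the second (reference) system at the same
    time instances. Here: linear least squares via gradient descent."""
    weights = np.zeros((sensor_samples.shape[1], reference_gaze.shape[1]))
    for _ in range(epochs):
        pred = sensor_samples @ weights
        grad = sensor_samples.T @ (pred - reference_gaze) / len(sensor_samples)
        weights -= lr * grad
    return weights


rng = np.random.default_rng(2)
features = rng.normal(size=(500, 4))        # first-sensor data per time instance
true_w = rng.normal(size=(4, 2))
gaze_ref = features @ true_w                # reference data from the second system
w = train_eye_tracking_model(features, gaze_ref)
print(np.allclose(w, true_w, atol=0.05))    # True: model recovered the mapping
```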