Patents Assigned to TOBII AB
  • Publication number: 20190011986
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: September 12, 2018
    Publication date: January 10, 2019
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
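The two-step mapping this abstract describes (project the gaze direction onto the captured scene image, then test it against a detected region of the predefined image) can be sketched as follows. This is an illustrative simplification, not the patented method: it assumes the gaze direction is already expressed as normalized (0..1) coordinates in the scene camera's field of view (a real system would apply the camera's calibration), and `gaze_on_target` assumes the predefined image has already been located, e.g. by template matching, as an axis-aligned bounding box.

```python
def gaze_point_on_scene(gaze_dir_norm, scene_w, scene_h):
    """Map a gaze direction, given as normalized (x, y) coordinates in the
    outward camera's field of view, to a pixel in the scene image."""
    gx = gaze_dir_norm[0] * (scene_w - 1)
    gy = gaze_dir_norm[1] * (scene_h - 1)
    return round(gx), round(gy)

def gaze_on_target(gaze_px, target_bbox):
    """True if the gaze point lies inside the bounding box where the
    predefined image was detected in the particular scene image."""
    x, y = gaze_px
    x0, y0, x1, y1 = target_bbox
    return x0 <= x <= x1 and y0 <= y <= y1
```

A gaze centered in the camera's view maps to the middle pixel of the scene image, and the hit test then reduces to a bounding-box containment check.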
  • Publication number: 20190005735
    Abstract: Disclosed herein is a system comprising a display, at least one imaging device for capturing at least one user image of at least part of the user, and a determination unit connected to the at least one imaging device, said determination unit being used for determining information relating to the user's eye based on the at least one user image. The system further comprises a scene renderer connected to the display and the determination unit, the scene renderer being used to generate a first image on the display, whereby the scene renderer is configured to generate an at least partially modified image relative to the first image in real time, without the user noticing, based on the information relating to the user's eye.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Applicant: TOBII AB
    Inventor: Denny Rönngren
  • Publication number: 20180364802
    Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on the user removing a finger from contact with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.
    Type: Application
    Filed: June 14, 2018
    Publication date: December 20, 2018
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
  • Patent number: 10152122
    Abstract: According to the invention, a method for changing a display based on a gaze point of a user on the display is disclosed. The method may include determining a gaze point of a user on a display. The method may also include causing a first area of the display to be displayed in a first manner, the first area including the gaze point and a surrounding area. The method may further include causing a second area of the display to be displayed in a second manner, the second area being different than the first area, and the second manner being different than the first manner.
    Type: Grant
    Filed: July 18, 2016
    Date of Patent: December 11, 2018
    Assignee: Tobii AB
    Inventor: Andreas Klingström
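A minimal sketch of the two-region display idea above. The function name and parameters are illustrative, and coarse block subsampling stands in for whatever "second manner" an implementation would use (lower resolution, quality, or shading); only the structure (full-quality circle around the gaze point, degraded elsewhere) reflects the abstract.

```python
import numpy as np

def foveate(frame, gaze_xy, radius, block=4):
    """Display a circular first area around the gaze point in the first
    manner (original pixels) and the second area in a second manner
    (here: coarse block subsampling as a stand-in for reduced quality)."""
    h, w = frame.shape[:2]
    small = frame[::block, ::block]  # degrade by subsampling
    degraded = np.repeat(np.repeat(small, block, axis=0), block, axis=1)[:h, :w]
    ys, xs = np.ogrid[:h, :w]
    gx, gy = gaze_xy
    mask = (xs - gx) ** 2 + (ys - gy) ** 2 <= radius ** 2  # first area
    out = degraded.copy()
    out[mask] = frame[mask]  # keep original pixels around the gaze point
    return out
```

Pixels inside the radius are passed through unchanged; every other pixel takes the value of its block's top-left sample.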
  • Publication number: 20180349698
    Abstract: A system for determining a gaze direction of a user is disclosed. The system may include a first illuminator, a first profile sensor, and at least one processor. The first illuminator may be configured to illuminate an eye of a user. The first profile sensor may be configured to detect light reflected by the eye of the user. The processor(s) may be configured to determine a gaze direction of the user based at least in part on light detected by the first profile sensor.
    Type: Application
    Filed: May 18, 2018
    Publication date: December 6, 2018
    Applicant: Tobii AB
    Inventors: Simon Gustafsson, Daniel Johansson Tornéus, Jonas Andersson, Johannes Kron, Mårten Skogö, Ralf Biedert
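The appeal of a profile sensor is that it reads out two 1-D intensity profiles (per-row and per-column sums) instead of a full frame. As a toy sketch of how a gaze-relevant feature could be found in such output (a real pipeline would handle ambient light, multiple glints, and the pupil as well), a single bright corneal reflection appears as a peak in both profiles:

```python
import numpy as np

def glint_from_profiles(row_profile, col_profile):
    """A bright glint shows up as a peak in both 1-D profiles of a
    profile sensor; its (x, y) position is the pair of peak indices."""
    return int(np.argmax(col_profile)), int(np.argmax(row_profile))

# Simulated sensor frame with one bright glint at column 7, row 3.
frame = np.zeros((8, 10))
frame[3, 7] = 255.0
x, y = glint_from_profiles(frame.sum(axis=1), frame.sum(axis=0))
# x == 7, y == 3
```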
  • Patent number: 10146307
    Abstract: A graphics presentation apparatus including a display unit, an eye-tracking module, and a data output module. The eye-tracking module registers image data representing at least one eye of a user of the apparatus. Furthermore, the eye-tracking module determines, based on the registered image data, an orientation of the at least one eye relative to the display unit. Finally, in response thereto, the eye-tracking module generates a control signal controlling the data output module to produce visual content with such orientation on the display unit that a misalignment between the orientation of at least one part of the visual content and the orientation of the at least one eye of the user is minimized.
    Type: Grant
    Filed: June 20, 2017
    Date of Patent: December 4, 2018
    Assignee: Tobii AB
    Inventors: Aron Yu, Mårten Skogö, Robert Gavelin, Per Nystedt
  • Publication number: 20180335838
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. Gaze input and gesture-based input can also serve as a complement or an alternative to touchscreen interactions with a computer device that has a touchscreen. Combined gaze- and gesture-based interaction can thus provide a touchscreen-like environment in computer systems without a traditional touchscreen, in computer systems whose touchscreen is arranged in an ergonomically unfavorable position for the user, or where it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Application
    Filed: May 21, 2018
    Publication date: November 22, 2018
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Publication number: 20180329510
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: May 8, 2018
    Publication date: November 15, 2018
    Applicant: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Publication number: 20180321903
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: July 10, 2018
    Publication date: November 8, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Patent number: 10116846
    Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
    Type: Grant
    Filed: March 8, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: David Figgins Henderek, Mårten Skogö
  • Patent number: 10114459
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 10, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180307905
    Abstract: A method for mapping an input device to a virtual object in virtual space displayed on a display device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include, based at least in part on the gaze direction being directed to a virtual object in virtual space displayed on a display device, modifying an action to be taken by one or more processors in response to receiving a first input from an input device. The method may further include, thereafter, in response to receiving the first input from the input device, causing the action to occur, wherein the action correlates the first input to an interaction with the virtual object.
    Type: Application
    Filed: March 20, 2018
    Publication date: October 25, 2018
    Applicant: Tobii AB
    Inventors: Simon Gustafsson, Alexey Bezugly, Anders Kingbäck, Anders Clausen
  • Publication number: 20180307324
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: April 23, 2018
    Publication date: October 25, 2018
    Applicant: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Publication number: 20180300943
    Abstract: A method for determining a focus target of a user's gaze in a three-dimensional (“3D”) scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
    Type: Application
    Filed: March 30, 2018
    Publication date: October 18, 2018
    Applicant: Tobii AB
    Inventor: Fredrik Lindh
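The confidence-value step above can be illustrated with a toy scoring rule. This is an assumption on my part, not the patent's actual formula: each line trace that intersects a component contributes a weight that decays with the trace's angular offset from the central gaze ray, and the component with the highest accumulated weight is identified as the focus target.

```python
import math

def focus_target(trace_hits, sigma=2.0):
    """trace_hits: (component_id, angle_offset_deg) pairs, one per line
    trace cast in proximity to the gaze direction.  A component's
    confidence is the sum of Gaussian weights of the traces that hit it;
    the focus target is the component with the highest confidence."""
    confidence = {}
    for component, offset in trace_hits:
        weight = math.exp(-(offset ** 2) / (2 * sigma ** 2))
        confidence[component] = confidence.get(component, 0.0) + weight
    return max(confidence, key=confidence.get) if confidence else None
```

Two near-center hits on one component outweigh a single off-axis hit on another, which is the kind of robustness to gaze jitter the abstract is after.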
  • Patent number: 10082870
    Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device for determining a gaze point of a user on a display device. The system may also include a graphics processing device for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics in an area including the gaze point of the user have at least one modified parameter relative to graphics outside the area. The size of the area may be based at least in part on an amount of noise in the gaze point over time and at least one other secondary factor.
    Type: Grant
    Filed: March 31, 2017
    Date of Patent: September 25, 2018
    Assignee: Tobii AB
    Inventors: Robin Thunström, Mårten Skogö, Denny Rönngren, Anders Clausen
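The noise-dependent sizing described above can be sketched minimally. The scatter measure (per-axis standard deviation) and the gain constant are my assumptions, and the abstract's "at least one other secondary factor" is omitted here.

```python
import statistics

def foveal_radius(recent_gaze_points, base_radius=50.0, noise_gain=2.0):
    """Size the high-quality area from the recent gaze-point scatter:
    a noisier gaze signal widens the area so the true gaze point is
    still likely to fall inside it."""
    xs = [p[0] for p in recent_gaze_points]
    ys = [p[1] for p in recent_gaze_points]
    noise = statistics.pstdev(xs) + statistics.pstdev(ys)
    return base_radius + noise_gain * noise
```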
  • Patent number: 10082864
    Abstract: According to the invention, a method for entering text into a computing device using gaze input from a user is disclosed. The method may include causing a display device to display a visual representation of a plurality of letters. The method may also include receiving gaze information identifying a movement of the user's gaze on the visual representation. The method may further include recording an observation sequence of one or more observation events that occur during the movement of the user's gaze on the visual representation. The method may additionally include providing the observation sequence to a decoder module. The decoder module may determine at least one word from the observation sequence representing an estimate of an intended text of the user.
    Type: Grant
    Filed: September 2, 2015
    Date of Patent: September 25, 2018
    Assignee: Tobii AB
    Inventors: Per Ola Kristensson, Keith Vertanen, Morten Mjelde
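The decoder module is described only abstractly. As a toy stand-in (not the patented decoder, which would typically be probabilistic and operate on richer observation events than letters), one can match the sequence of letters nearest the recorded fixations against a lexicon by edit distance:

```python
def edit_distance(a, b):
    """Levenshtein distance via the standard two-row dynamic program."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def decode(observed_letters, lexicon):
    """Estimate the intended word as the lexicon entry closest to the
    noisy observation sequence."""
    return min(lexicon, key=lambda w: edit_distance(observed_letters, w))
```

A fixation sequence that lands one key off still decodes to the intended word as long as no other lexicon entry is closer.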
  • Publication number: 20180270436
    Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye element is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
    Type: Application
    Filed: May 10, 2018
    Publication date: September 20, 2018
    Applicant: Tobii AB
    Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
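The ROI readout above amounts to windowing the sensor around the tracked eye element. A minimal sketch, assuming a fixed window size and simple clamping at the sensor edges (both illustrative choices):

```python
import numpy as np

def read_roi(sensor_frame, center_xy, roi_w, roi_h):
    """Read out only a region of interest around the tracked eye element,
    clamping the window so it stays inside the sensor's active area.
    Returns the ROI image and its origin in sensor coordinates."""
    h, w = sensor_frame.shape[:2]
    cx, cy = center_xy
    x0 = max(0, min(cx - roi_w // 2, w - roi_w))
    y0 = max(0, min(cy - roi_h // 2, h - roi_h))
    return sensor_frame[y0:y0 + roi_h, x0:x0 + roi_w], (x0, y0)
```

Reading only the ROI reduces both the pixels transferred per frame and, on sensors that support windowed readout, the readout time and power.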
  • Publication number: 20180242841
    Abstract: A subject is illuminated whose movements of at least one eye are to be registered by an eye-tracker for producing resulting data. To this end, a coherent light source in light producing means produces at least one well-defined light beam. A diffractive optical element is arranged between the coherent light source and an output from the light producing means. The diffractive optical element is configured to direct a light beam towards at least a first designated area estimated to cover the at least one eye of the subject. Image registering means registers light from the at least one light beam having been reflected against the subject. The resulting data is produced in response to these light reflections, which resulting data represents movements of the at least one eye of the subject.
    Type: Application
    Filed: January 10, 2018
    Publication date: August 30, 2018
    Applicant: Tobii AB
    Inventors: Richard Hainzl, Mattias Kuldkepp, Peter Blixt, Mårten Skogö
  • Publication number: 20180246320
    Abstract: A system for adjusting a position of a lens in a wearable device is provided. The system may include a wearable device and a processor. The wearable device may include a display, a lens movably disposed in front of the display, and an eye tracking device including an illuminator and an image sensor. The processor may receive data from the eye tracking device, and determine, based on the data, a first position of the eye of a user wearing the wearable device, where the first position is relative to a second position which is fixed with respect to the lens. The processor may also be configured to determine a distance between the first position and the second position, and if the distance is greater than a first value, cause information to be presented to the user indicating that the lens should be moved to a more optimal position.
    Type: Application
    Filed: February 9, 2018
    Publication date: August 30, 2018
    Applicant: Tobii AB
    Inventors: Pravin Rana, Daniel Tornéus
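The distance check in this abstract is simple enough to sketch directly. The coordinate units and the threshold value are assumptions for illustration; the patent only requires comparing the eye-to-reference distance against a first value.

```python
import math

def lens_adjustment_needed(eye_pos, lens_ref_pos, max_offset=2.0):
    """Compare the tracked eye position (first position) with a reference
    point fixed with respect to the lens (second position); beyond the
    threshold, the user should be told to move the lens."""
    offset = math.dist(eye_pos, lens_ref_pos)
    return offset > max_offset, offset
```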
  • Publication number: 20180247119
    Abstract: A method for determining eye openness with an eye tracking device is disclosed. The method may include determining, for pixels of an image sensor of an eye tracking device, during a first time period when an eye of a user is open, a first sum of intensity of the pixels. The method may also include determining, during a second time period when the eye of the user is closed, a second sum of intensity of the pixels. The method may further include determining, during a third time period, a third sum of intensity of the pixels. The method may additionally include determining, upon the third sum exceeding a fourth sum equal to the first sum plus a threshold amount, that the eye of the user is closed, where the threshold amount is equal to a product of a threshold fraction and a difference between the first sum and the second sum.
    Type: Application
    Filed: February 9, 2018
    Publication date: August 30, 2018
    Applicant: Tobii AB
    Inventors: Mark Ryan, Torbjörn Sundberg, Pravin Rana, Yimu Wang
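The comparison this abstract describes can be written out directly. The code follows the abstract's formula literally; note that the sign of (first sum − second sum) depends on whether the closed eyelid images brighter or darker than the open eye, which the abstract leaves implicit, and the example values below are purely illustrative.

```python
def eye_closed(open_sum, closed_sum, current_sum, fraction=0.5):
    """Literal reading of the abstract: judge the eye closed when the
    current pixel-intensity sum exceeds the open-eye calibration sum
    plus a threshold of fraction * (open_sum - closed_sum)."""
    threshold = fraction * (open_sum - closed_sum)
    return current_sum > open_sum + threshold
```

With an open-eye sum of 100 and a closed-eye sum of 180 (an infrared-bright eyelid), the decision boundary sits between the two calibration sums.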