Patents by Inventor Mårten Skogö

Mårten Skogö has filed patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10313587
    Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured, in the active mode, to use active eye illumination, which enables tracking of a corneal reflection, and to provide eye tracking data which include eye position and eye orientation; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode.
    Type: Grant
    Filed: December 5, 2017
    Date of Patent: June 4, 2019
    Assignee: Tobii AB
    Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
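The active/ready mode distinction above can be pictured as a small state machine: full illumination and full tracking data in the active mode, reduced illumination and position-only data in the ready mode. A minimal sketch, with illustrative class names and intensity values that are not taken from the patent:

```python
from enum import Enum

class Mode(Enum):
    ACTIVE = "active"
    READY = "ready"

class ImagingDevice:
    """Toy model of the mode-switchable eye tracker in the abstract.
    Intensity values and tracking output are illustrative only."""

    def __init__(self, active_intensity=1.0, ready_intensity=0.2):
        self.mode = Mode.READY
        self.active_intensity = active_intensity
        self.ready_intensity = ready_intensity

    @property
    def illumination_intensity(self):
        # Ready mode reduces intensity from its active-mode value.
        return self.active_intensity if self.mode is Mode.ACTIVE else self.ready_intensity

    def track(self):
        # Active mode: corneal-reflection tracking yields position and orientation.
        if self.mode is Mode.ACTIVE:
            return {"eye_position": (0.0, 0.0, 0.6), "eye_orientation": (0.1, -0.05)}
        # Ready mode: eye position only.
        return {"eye_position": (0.0, 0.0, 0.6)}

dev = ImagingDevice()
assert dev.illumination_intensity == 0.2   # reduced in ready mode
dev.mode = Mode.ACTIVE
assert dev.illumination_intensity == 1.0
```

The later patent 10212343 in this listing covers the complementary ready-mode behavior (position-only data at reduced illumination), which the `track` method mirrors.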
  • Publication number: 20190138090
    Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
    Type: Application
    Filed: July 30, 2018
    Publication date: May 9, 2019
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
  • Patent number: 10282608
    Abstract: At least one image registering unit records at least one series of images representing a subject. A control unit controls an operation sequence for the at least one image registering unit in such a manner that a subsequent data processing unit receives a repeating sequence of image frames there from, wherein each period contains at least one image frame of a first resolution and at least one image frame of a second resolution being different from the first resolution. Based on the registered image frames, the data processing unit produces eye/gaze tracking data with respect to the subject.
    Type: Grant
    Filed: April 5, 2017
    Date of Patent: May 7, 2019
    Assignee: Tobii AB
    Inventors: Mattias Kuldkepp, Mårten Skogö, Mattias Hanqvist, Martin Brogren, Dineshkumar Muthusamy
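The repeating mixed-resolution sequence described above can be sketched as a cyclic frame schedule. The period layout and resolutions below are illustrative assumptions, not values from the patent:

```python
from itertools import cycle, islice

# Hypothetical period: one high-resolution frame (accurate feature detection)
# followed by three low-resolution frames (cheap, fast gaze updates).
PERIOD = [
    ("high", (1280, 960)),
    ("low", (320, 240)),
    ("low", (320, 240)),
    ("low", (320, 240)),
]

def frame_schedule(n_frames):
    """Return n_frames (label, resolution) pairs by repeating the period,
    as the data processing unit would receive them."""
    return list(islice(cycle(PERIOD), n_frames))

frames = frame_schedule(8)
assert frames[0][0] == "high" and frames[4][0] == "high"   # one per period
assert sum(1 for label, _ in frames if label == "low") == 6
```

Each period thus contains at least one frame of each resolution, matching the claim language, while keeping the average bandwidth well below an all-high-resolution stream.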
  • Publication number: 20190121429
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2018
    Publication date: April 25, 2019
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190108383
    Abstract: Circuitry of a gaze/eye tracking system obtains one or more images of a left eye and one or more images of a right eye, determines a gaze direction of the left eye based on at least one obtained image of the left eye, determines a gaze direction of the right eye based on at least one obtained image of the right eye, determines a first confidence value based on the one or more obtained images of the left eye, determines a second confidence value based on the one or more obtained images of the right eye, and determines a final gaze direction based at least in part on the first confidence value and the second confidence value. The first and second confidence values represent indications of the reliability of the determined gaze directions of the left eye and the right eye, respectively. Corresponding methods and computer-readable media are also provided.
    Type: Application
    Filed: April 3, 2018
    Publication date: April 11, 2019
    Applicant: Tobii AB
    Inventors: Andreas Klingström, Mattias Kuldkepp, Mårten Skogö
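One way to combine per-eye directions using confidence values, as the abstract describes, is a confidence-weighted average of the two gaze vectors. This is a minimal sketch of that idea; the weighting scheme is an illustrative assumption, not the patent's exact method:

```python
import math

def fuse_gaze(left_dir, right_dir, conf_left, conf_right):
    """Combine left- and right-eye gaze directions (unit vectors) into a
    final direction, weighting each eye by its confidence value and
    renormalizing the result."""
    total = conf_left + conf_right
    if total == 0:
        raise ValueError("no reliable gaze data from either eye")
    fused = [(conf_left * l + conf_right * r) / total
             for l, r in zip(left_dir, right_dir)]
    norm = math.sqrt(sum(c * c for c in fused))
    return tuple(c / norm for c in fused)

# If the right-eye images are unreliable (confidence 0), the left eye dominates.
g = fuse_gaze((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), conf_left=1.0, conf_right=0.0)
assert g == (0.0, 0.0, 1.0)
```

The degenerate case where both confidences are zero is raised as an error here; a real system might instead fall back to the last known gaze direction.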
  • Publication number: 20190102905
    Abstract: Head pose information may be determined using information describing a fixed gaze and image data corresponding to a user's eyes. The head pose information may be determined in a manner that disregards facial features with the exception of the user's eyes. The head pose information may be usable to interact with a user device.
    Type: Application
    Filed: September 28, 2018
    Publication date: April 4, 2019
    Applicant: Tobii AB
    Inventor: Mårten Skogö
  • Patent number: 10228763
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: September 12, 2018
    Date of Patent: March 12, 2019
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190064513
    Abstract: According to the invention, a system for tracking a gaze of a user across a multi-display arrangement is disclosed. The system may include a first display, a first eye tracking device, a second display, a second eye tracking device, and a processor. The first eye tracking device may be configured to determine a user's gaze direction while the user is gazing at the first display. The second eye tracking device may be configured to determine the user's gaze direction while the user is gazing at the second display. The processor may be configured to determine that the user's gaze has moved away from the first display in a direction of the second display, and in response to determining that the user's gaze has moved away from the first display in the direction of the second display, deactivate the first eye tracking device, and activate the second eye tracking device.
    Type: Application
    Filed: August 31, 2017
    Publication date: February 28, 2019
    Applicant: Tobii AB
    Inventors: Farshid Bagherpour, Mårten Skogö
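The tracker hand-off the abstract describes can be sketched as a simple selection rule: when the gaze crosses from the first display toward the second, the first tracker is deactivated and the second activated. The side-by-side layout and pixel threshold below are illustrative assumptions:

```python
def select_tracker(active, gaze_x, width):
    """Return which eye tracker should be active, given the current one,
    the horizontal gaze coordinate across a combined desktop (pixels),
    and the width of the first display. Hypothetical geometry."""
    if active == "first" and gaze_x >= width:
        # Gaze moved away from the first display toward the second:
        # deactivate the first tracker, activate the second.
        return "second"
    if active == "second" and gaze_x < width:
        return "first"
    return active

assert select_tracker("first", 1950, width=1920) == "second"
assert select_tracker("second", 500, width=1920) == "first"
assert select_tracker("first", 100, width=1920) == "first"
```

Keeping only one tracker active at a time, as claimed, avoids running two illumination and camera pipelines simultaneously.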
  • Patent number: 10212343
    Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode, and to provide eye-tracking data which include eye position but not eye orientation.
    Type: Grant
    Filed: July 6, 2016
    Date of Patent: February 19, 2019
    Assignee: Tobii AB
    Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
  • Patent number: 10203758
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: August 6, 2013
    Date of Patent: February 12, 2019
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Patent number: 10198070
    Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
    Type: Grant
    Filed: September 18, 2017
    Date of Patent: February 5, 2019
    Assignee: Tobii AB
    Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
  • Patent number: 10192109
    Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Grant
    Filed: April 18, 2016
    Date of Patent: January 29, 2019
    Assignee: Tobii AB
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
  • Publication number: 20190026369
    Abstract: The invention generally relates to computer-implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies gaze filter criteria for the respective gaze search parameter. A user can initiate query searches for information on the computer device, or on information hosts connectable to the computer device via networks, by using combinations of gaze search parameters of the gaze search index and non-gaze search parameters of a non-gaze search index.
    Type: Application
    Filed: August 1, 2018
    Publication date: January 24, 2019
    Applicant: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö
  • Publication number: 20190025932
    Abstract: A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to control the first motion sensor to detect first motion information, control the second motion sensor to detect second motion information; and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
    Type: Application
    Filed: July 30, 2018
    Publication date: January 24, 2019
    Applicant: Tobii AB
    Inventors: Mårten Skogö, John Elvesjö, Jan-Erik Lundkvist, Per Lundberg
  • Patent number: 10185394
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: December 11, 2017
    Date of Patent: January 22, 2019
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190011986
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: September 12, 2018
    Publication date: January 10, 2019
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180364802
    Abstract: A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; and determine at least one user generated gesture based control command based on a user removing contact of a finger of the user with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and at least one user generated gesture based control command, wherein the user action is executed at said determined gaze point area.
    Type: Application
    Filed: June 14, 2018
    Publication date: December 20, 2018
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
  • Publication number: 20180349698
    Abstract: A system for determining a gaze direction of a user is disclosed. The system may include a first illuminator, a first profile sensor, and at least one processor. The first illuminator may be configured to illuminate an eye of a user. The first profile sensor may be configured to detect light reflected by the eye of the user. The processor(s) may be configured to determine a gaze direction of the user based at least in part on light detected by the first profile sensor.
    Type: Application
    Filed: May 18, 2018
    Publication date: December 6, 2018
    Applicant: Tobii AB
    Inventors: Simon Gustafsson, Daniel Johansson Tornéus, Jonas Andersson, Johannes Kron, Mårten Skogö, Ralf Biedert
  • Patent number: 10146307
    Abstract: A graphics presentation apparatus including a display unit, an eye-tracking module, and a data output module. The eye-tracking module registers image data representing at least one eye of a user of the apparatus. Furthermore, the eye-tracking module determines, based on the registered image data, an orientation of the at least one eye relative to the display unit. Finally, in response thereto, the eye-tracking module generates a control signal controlling the data output module to produce visual content with such orientation on the display unit that a misalignment between the orientation of the visual content and the orientation of the at least one eye of the user is minimized.
    Type: Grant
    Filed: June 20, 2017
    Date of Patent: December 4, 2018
    Assignee: Tobii AB
    Inventors: Aron Yu, Mårten Skogö, Robert Gavelin, Per Nystedt
  • Publication number: 20180335838
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Application
    Filed: May 21, 2018
    Publication date: November 22, 2018
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö