Patents by Inventor Mårten Skogö

Mårten Skogö has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10035518
    Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
    Type: Grant
    Filed: February 23, 2017
    Date of Patent: July 31, 2018
    Assignee: Tobii AB
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
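The gaze-based vehicle adaptation described in the abstract above can be sketched as follows. This is a minimal illustration only: the function name, the 15-degree threshold, and the "dim the infotainment display" example are assumptions chosen for the sketch, not details from the patent.

```python
# Illustrative sketch: modify a vehicle system based on driver gaze data.
# All names and thresholds are hypothetical.

def modify_vehicle_systems(gaze_yaw_deg, systems):
    """Adjust a vehicle system depending on where the driver is looking.

    gaze_yaw_deg: horizontal gaze angle in degrees, 0 = straight ahead.
    systems: dict of system name -> settings dict (mutated in place).
    """
    looking_at_road = abs(gaze_yaw_deg) < 15.0  # illustrative threshold
    if looking_at_road:
        systems["infotainment"]["brightness"] = 1.0
    else:
        # Example modification: dim the center display to reduce distraction.
        systems["infotainment"]["brightness"] = 0.2
    return systems
```

In this sketch the gaze data reduces to a single yaw angle; a real system would receive a richer gaze-data stream and could modify any number of vehicle systems.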
  • Patent number: 10025381
    Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on the user removing contact of a finger with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.
    Type: Grant
    Filed: January 20, 2015
    Date of Patent: July 17, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
  • Patent number: 10025389
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: December 22, 2011
    Date of Patent: July 17, 2018
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Patent number: 10013053
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. A solution for touchscreen-like interaction uses gaze input and gesture-based input as a complement or an alternative to touchscreen interactions with a computer device having a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can be used to achieve a touchscreen-like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged in an ergonomically unfavorable position for the user, or where a touchscreen is arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Grant
    Filed: October 5, 2012
    Date of Patent: July 3, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Publication number: 20180181272
    Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, and detecting an indicator that the user begins reading the content.
    Type: Application
    Filed: December 21, 2017
    Publication date: June 28, 2018
    Applicant: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö
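The scroll-zone mechanism in the abstract above lends itself to a compact sketch. The function below is a hypothetical illustration, assuming simple top and bottom scroll zones sized as a fraction of the screen; the names and the 15% zone size are not from the patent.

```python
# Illustrative sketch of gaze-driven scrolling with screen-edge scroll zones.

def scroll_action_for_gaze(gaze_y, screen_height, zone_fraction=0.15):
    """Return a scroll action for a vertical gaze coordinate.

    gaze_y is measured in pixels from the top edge of the screen.
    Gazing in the top zone scrolls up, in the bottom zone scrolls down;
    gazing elsewhere (e.g. while reading) triggers no scrolling.
    """
    zone = screen_height * zone_fraction
    if gaze_y < zone:
        return "scroll_up"
    if gaze_y > screen_height - zone:
        return "scroll_down"
    return None
```

A full implementation would also apply the stopping conditions the abstract lists: a timeout, content passing a defined position, the gaze point leaving the zone, or an indication that the user has started reading.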
  • Patent number: 9996159
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: August 6, 2013
    Date of Patent: June 12, 2018
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Publication number: 20180157323
    Abstract: Methods, systems, and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Application
    Filed: November 27, 2017
    Publication date: June 7, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Mårten Skogö, John Elvesjö
  • Patent number: 9977960
    Abstract: A system for determining a gaze direction of a user is disclosed. The system may include a first illuminator, a first profile sensor, and at least one processor. The first illuminator may be configured to illuminate an eye of a user. The first profile sensor may be configured to detect light reflected by the eye of the user. The processor(s) may be configured to determine a gaze direction of the user based at least in part on light detected by the first profile sensor.
    Type: Grant
    Filed: September 26, 2016
    Date of Patent: May 22, 2018
    Assignee: Tobii AB
    Inventors: Simon Gustafsson, Daniel Johansson Tornéus, Jonas Andersson, Johannes Kron, Mårten Skogö, Ralf Biedert
  • Publication number: 20180129280
    Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device, at least one processor, and a graphics processing device. The eye tracking device may be for determining at least one of a gaze point of a user on a display device, or a change in the gaze point of the user on the display device. The processor may be for receiving data from the eye tracking device and modifying use of at least one system resource based at least in part on the data received from the eye tracking device. The graphics processing device may be for causing an image to be displayed on the display device.
    Type: Application
    Filed: May 30, 2017
    Publication date: May 10, 2018
    Applicant: Tobii AB
    Inventors: Mårten Skogö, Anders Olsson, Erik Kenneth Holmgren
  • Publication number: 20180129281
    Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
    Type: Application
    Filed: September 18, 2017
    Publication date: May 10, 2018
    Applicant: Tobii AB
    Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
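The two-phase estimation scheme in the abstract above (a memoryless first estimate from the initial burst, then recursive estimates over subsequent frames) can be sketched with scalar gaze samples. This is a loose analogy only: the abstract concerns eye pictures and sensor image-area restriction, while the sketch below reduces each frame to a single number, and the smoothing recursion and its weight are assumptions.

```python
# Illustrative sketch: memoryless initial gaze estimate from a burst of
# samples, followed by recursive (exponentially smoothed) updates.

def track_gaze(burst_samples, stream_samples, alpha=0.5):
    """Return a list of gaze estimates.

    The first estimate uses only the initial burst (no prior state);
    each later estimate blends the previous estimate with a new sample.
    Samples are scalar gaze coordinates for simplicity.
    """
    estimate = sum(burst_samples) / len(burst_samples)  # memoryless
    estimates = [estimate]
    for sample in stream_samples:
        estimate = alpha * estimate + (1 - alpha) * sample  # recursive
        estimates.append(estimate)
    return estimates
```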
  • Patent number: 9952672
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: August 6, 2013
    Date of Patent: April 24, 2018
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Publication number: 20180103202
    Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured, in the active mode, to use active eye illumination, which enables tracking of a corneal reflection, and to provide eye tracking data which include eye position and eye orientation; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode.
    Type: Application
    Filed: December 5, 2017
    Publication date: April 12, 2018
    Applicant: Tobii AB
    Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
  • Publication number: 20180101228
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2017
    Publication date: April 12, 2018
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9940518
    Abstract: Circuitry of a gaze/eye tracking system obtains one or more images of a left eye and one or more images of a right eye, determines a gaze direction of the left eye based on at least one obtained image of the left eye, determines a gaze direction of the right eye based on at least one obtained image of the right eye, determines a first confidence value based on the one or more obtained images of the left eye, determines a second confidence value based on the one or more obtained images of the right eye, and determines a final gaze direction based at least in part on the first confidence value and the second confidence value. The first and second confidence values represent indications of the reliability of the determined gaze directions of the left eye and the right eye, respectively. Corresponding methods and computer-readable media are also provided.
    Type: Grant
    Filed: September 11, 2017
    Date of Patent: April 10, 2018
    Assignee: Tobii AB
    Inventors: Andreas Klingström, Mattias Kuldkepp, Mårten Skogö
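One natural reading of the abstract above is a confidence-weighted blend of the two per-eye gaze directions. The sketch below shows that interpretation as an assumption; the patent itself only states that the final direction depends on both confidence values, not that it is a weighted average.

```python
# Illustrative sketch: combine per-eye gaze directions by confidence.
# The weighted-average rule is an assumption, not the patented method.

def combine_gaze(left_dir, right_dir, left_conf, right_conf):
    """Blend two gaze direction vectors by their confidence values.

    Each direction is an (x, y, z) tuple; confidences are non-negative.
    A zero confidence effectively falls back to the other eye.
    """
    total = left_conf + right_conf
    if total == 0:
        raise ValueError("no reliable gaze data from either eye")
    wl, wr = left_conf / total, right_conf / total
    return tuple(wl * l + wr * r for l, r in zip(left_dir, right_dir))
```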
  • Publication number: 20180088668
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: November 10, 2017
    Publication date: March 29, 2018
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9924864
    Abstract: A subject (180) is illuminated whose movements of at least one eye are to be registered by an eye-tracker for producing resulting data (DEYE). To this aim, a coherent light source in light producing means (140) produces at least one well-defined light beam (L1, L2). A diffractive optical element is arranged between the coherent light source and an output from the light producing means (140). The diffractive optical element is configured to direct a light beam (L1) towards at least a first designated area (A1) estimated to cover the at least one eye of the subject (180). Image registering means (150) registers light from the at least one light beam (L1, L2) having been reflected against the subject (180). The resulting data (DEYE) are produced in response to these light reflections, which resulting data (DEYE) represent movements of the at least one eye of the subject (180).
    Type: Grant
    Filed: June 14, 2012
    Date of Patent: March 27, 2018
    Assignee: Tobii AB
    Inventors: Richard Hainzl, Mattias Kuldkepp, Peter Blixt, Mårten Skogö
  • Patent number: 9898081
    Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device and a graphics processing device. The eye tracking device may be for determining a gaze point of a user on a display device. The graphics processing device may be for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics are of higher quality in an area including the gaze point of the user, than outside the area.
    Type: Grant
    Filed: September 20, 2016
    Date of Patent: February 20, 2018
    Assignee: Tobii AB
    Inventors: Robin Thunström, Mårten Skogö
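The abstract above describes foveated rendering: higher graphics quality around the gaze point, lower quality elsewhere. A minimal sketch of that decision, assuming a per-tile quality choice and an illustrative foveal radius (neither is specified by the patent):

```python
# Illustrative sketch of a foveated-rendering quality decision per tile.

def render_quality(tile_center, gaze_point, fovea_radius=200.0):
    """Choose a quality level for a screen tile.

    Tiles whose centers fall within fovea_radius pixels of the gaze
    point render at full quality; all other tiles at reduced quality.
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    inside_fovea = dx * dx + dy * dy <= fovea_radius * fovea_radius
    return "high" if inside_fovea else "low"
```

In practice the quality falloff is usually graded over several rings rather than a single high/low cutoff, but the gaze-centered principle is the same.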
  • Publication number: 20180032103
    Abstract: A wearable device is disclosed. The wearable device may include a display, a lens, an illuminator, an image sensor, and at least one processor. The display may be configured to present images. The lens may be configured to provide a view of the display to an eye of a user. The illuminator may be configured to illuminate the eye. The image sensor may be configured to detect illumination reflected by the eye. The at least one processor may be configured to cause the display to present an image, receive data from the image sensor, determine a gaze direction of the eye based at least in part on the data from the image sensor, and cause at least one of the display or the lens to be moved until the gaze direction is directed toward the image.
    Type: Application
    Filed: July 27, 2017
    Publication date: February 1, 2018
    Applicant: Tobii AB
    Inventors: Henrik Eskilsson, Mårten Skogö
  • Patent number: 9870051
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: January 16, 2018
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9864498
    Abstract: Automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the content and to determine a gaze point. At least one scroll zone relative to the display screen and a scroll action associated with each scroll zone may be determined. In response to determining that the gaze point is within a scroll zone, an associated scroll action may be initiated. The scroll action causes the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the scroll zone, and detecting an indicator that the user begins reading the content.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: January 9, 2018
    Assignee: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö