Patents by Inventor Mårten Skogö

Mårten Skogö has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150177833
    Abstract: Methods, systems and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Application
    Filed: December 23, 2014
    Publication date: June 25, 2015
    Inventors: Anders VENNSTRÖM, Mårten SKOGÖ, John ELVESJÖ
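The likelihood-based selection described in the abstract above can be sketched as follows. This is not from the patent itself; the Gaussian gaze-noise model, the element representation, and all names are illustrative assumptions:

```python
import math

def select_element(gaze, elements, sigma=30.0):
    """Pick the element most likely to be the subject of a noisy gaze point.

    gaze: (x, y) estimated gaze position in pixels.
    elements: list of dicts with 'name' and 'center' (x, y) keys.
    sigma: assumed standard deviation of gaze noise in pixels (hypothetical).
    """
    def likelihood(center):
        # Isotropic Gaussian likelihood of the gaze sample given the element
        # centre; the normalising constant cancels when taking the argmax.
        dx, dy = gaze[0] - center[0], gaze[1] - center[1]
        return math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))

    return max(elements, key=lambda e: likelihood(e["center"]))

buttons = [
    {"name": "ok", "center": (100, 100)},
    {"name": "cancel", "center": (300, 100)},
]
print(select_element((120, 95), buttons)["name"])  # nearest element: "ok"
```

With an isotropic noise model this reduces to nearest-centre selection; a real system could weight the likelihood by element size or prior click frequency.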
  • Publication number: 20150145769
    Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode, and to provide eye-tracking data which include eye position but not eye orientation.
    Type: Application
    Filed: January 28, 2015
    Publication date: May 28, 2015
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Henrik ESKILSSON, Mårten Skogö, John Elvesjö, Peter Blixt
  • Publication number: 20150130740
    Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to: acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on a user removing contact of a finger with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.
    Type: Application
    Filed: January 20, 2015
    Publication date: May 14, 2015
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Markus CEDERLUND, Robert GAVELIN, Anders VENNSTRÖM, Anders KAPLAN, Anders OLSSON, Mårten SKOGÖ, Erland GEORGE-SVAHN
  • Publication number: 20150092163
    Abstract: An eye tracker comprising at least one illuminator for illuminating an eye, at least two cameras for imaging the eye and a controller is disclosed. The configuration of the reference illuminator(s) and cameras is such that at least one camera is coaxial with a reference illuminator and at least one camera is non-coaxial with a reference illuminator. On the one hand, the controller is adapted to select one of the cameras to be active with the aim of maximising an image quality metric and avoiding obscuring objects. On the other hand, the eye tracker is operable in a dual-camera mode to improve accuracy. The invention further provides a method and computer-program product for selecting a combination of an active reference illuminator from a plurality of reference illuminators and an active camera from a plurality of cameras.
    Type: Application
    Filed: December 8, 2014
    Publication date: April 2, 2015
    Inventors: Peter Blixt, Gunnar Troili, Anders Kingbäck, John Mikael Elvesjö, Mårten Skogö
  • Patent number: 8976110
    Abstract: A personal computer system includes a visual display, an imaging device adapted to provide eye-tracking data by imaging the face of a viewer of the visual display, and an input device for accepting eye-tracking control data and other input data. The imaging device is switchable between at least an active mode, a ready mode and an idle mode, and the switching time from the idle mode to the active mode is longer than the switching time from the ready mode to the active mode. The eye-tracking data provided in the ready mode may include eye position but not eye orientation. The system may include a dedicated input device for forcing the imaging device into active mode by directly triggering an interrupt; alternatively, the dedicated input device forces it permanently into idle mode.
    Type: Grant
    Filed: October 27, 2011
    Date of Patent: March 10, 2015
    Assignee: Tobii Technology AB
    Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
  • Publication number: 20150063603
    Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than in at least one other direction, and may alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
    Type: Application
    Filed: August 5, 2014
    Publication date: March 5, 2015
    Inventors: David Figgins Henderek, Mårten Skogö
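The gaze-steered directional microphone in the abstract above can be illustrated with a minimal geometry sketch. This is not from the patent; the 2-D layout and the function name are assumptions for illustration:

```python
import math

def mic_direction(gaze_point, mic_pos):
    """Return the azimuth (radians) from the microphone toward the point
    the user is looking at, so a directional microphone's sensitive
    direction can be steered to follow the gaze.
    """
    dx = gaze_point[0] - mic_pos[0]
    dy = gaze_point[1] - mic_pos[1]
    return math.atan2(dy, dx)

# User looks at a point diagonally ahead of a microphone at the origin.
angle = mic_direction((2.0, 2.0), (0.0, 0.0))
print(round(math.degrees(angle)))  # 45
```

A real implementation would feed this angle to a beamformer over a microphone array rather than physically aiming a single capsule.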
  • Patent number: 8944600
    Abstract: An eye tracker includes at least one illuminator for illuminating an eye, at least two cameras for imaging the eye and a controller. The configuration of the reference illuminator and cameras is such that at least one camera is coaxial with a reference illuminator and at least one camera is non-coaxial with a reference illuminator. The controller selects one of the cameras to be active to increase an image quality metric and avoid obscuring objects. The eye tracker is operable in a dual-camera mode to improve accuracy.
    Type: Grant
    Filed: December 17, 2012
    Date of Patent: February 3, 2015
    Assignee: Tobii Technology AB
    Inventors: Peter Blixt, Gunnar Troili, Anders Kingbäck, John Mikael Elvesjö, Mårten Skogö
  • Publication number: 20140354539
    Abstract: A personal computer system provides a gaze-controlled graphical user interface having a bidirectional and a unidirectional interaction mode. In the bidirectional interaction mode, a display shows one or more graphical controls in motion, each being associated with an input operation to an operating system. A gaze tracking system provides gaze point data of a viewer, and a matching module attempts to match a relative gaze movement against a relative movement of one of the graphical controls. The system includes a selector which is preferably controllable by a modality other than gaze. The system initiates a transition from the unidirectional interaction mode to the bidirectional interaction mode in response to an input received at the selector. The display then shows graphical controls in motion in a neighbourhood of the current gaze point, as determined based on current gaze data.
    Type: Application
    Filed: May 29, 2014
    Publication date: December 4, 2014
    Applicant: Tobii Technology AB
    Inventors: Mårten SKOGÖ, John Mikael ELVESJÖ, Anders OLSSON
  • Publication number: 20140313129
    Abstract: A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub profile mode associated with the determined actual viewing condition.
    Type: Application
    Filed: October 26, 2012
    Publication date: October 23, 2014
    Inventors: John Mikael Elvesjö, Anders Kingbäck, Gunnar Troili, Mårten Skogö, Henrik Eskilsson, Peter Blixt
  • Publication number: 20140310256
    Abstract: The invention generally relates to computer implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies gaze filter criteria for each respective gaze search parameter. A user can initiate query searches for information on the computer device, or on information hosts connectable to the computer device via networks, by using combinations of gaze search parameters of the gaze search index and non-gaze search parameters of a non-gaze search index.
    Type: Application
    Filed: October 29, 2012
    Publication date: October 16, 2014
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Anders Olsson, Mårten Skogö
  • Publication number: 20140293226
    Abstract: A subject (180) is illuminated whose movements of at least one eye are to be registered by an eye-tracker for producing resulting data (DEYE). To this aim, a coherent light source in light producing means (140) produces at least one well-defined light beam (L1, L2). A diffractive optical element is arranged between the coherent light source and an output from the light producing means (140). The diffractive optical element is configured to direct a light beam (L1) towards at least a first designated area (A1) estimated to cover the at least one eye of the subject (180). Image registering means (150) registers light from the at least one light beam (L1, L2) having been reflected against the subject (180). The resulting data (DEYE) are produced in response to these light reflections, which resulting data (DEYE) represent movements of the at least one eye of the subject (180).
    Type: Application
    Filed: June 14, 2012
    Publication date: October 2, 2014
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Richard Hainzl, Mattias Kuldkepp, Peter Blixt, Mårten Skogö
  • Publication number: 20140268054
    Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of the following occurs: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, or detecting an indicator that the user has begun reading the content.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Inventors: Anders Olsson, Mårten Skogö
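The scroll-zone mechanism in the abstract above can be sketched as a simple mapping from gaze point to scroll action. This is illustrative only; the zone fraction, coordinate convention, and action names are assumptions, not details from the patent:

```python
def scroll_action(gaze_point, screen_height, zone_frac=0.15):
    """Map a gaze point to a scroll action using top/bottom scroll zones.

    gaze_point: (x, y) with y = 0 at the top of the screen.
    zone_frac: fraction of screen height used for each scroll zone
               (a hypothetical value; the patent does not specify one).
    """
    _, y = gaze_point
    if y < screen_height * zone_frac:
        return "scroll_up"      # gaze in the top zone
    if y > screen_height * (1 - zone_frac):
        return "scroll_down"    # gaze in the bottom zone
    return None                 # gaze in the reading area: no scrolling

print(scroll_action((500, 50), 1080))    # top zone -> scroll_up
print(scroll_action((500, 1050), 1080))  # bottom zone -> scroll_down
print(scroll_action((500, 500), 1080))   # middle -> None
```

The abstract's stop conditions (timer expiry, content position, gaze leaving the zone, reading detection) would be checked in the loop that repeatedly calls such a function.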
  • Publication number: 20140247232
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
    Type: Application
    Filed: March 3, 2014
    Publication date: September 4, 2014
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
  • Publication number: 20140247208
    Abstract: Waking a computing device from a stand-by mode may include determining a wake zone relative to a display device and, when the computing device is in stand-by mode, detecting a gaze point relative to the display device. In response to determining that the gaze point is within the wake zone, a wake command is generated and passed to a program module, such as the operating system, to cause the program module to wake the computing device from the stand-by mode. When the computing device is not in stand-by mode, another gaze point may be detected and, in response to determining that the other gaze point is within the vicinity of a selectable stand-by icon, the stand-by command is generated and passed to the program module to cause the program module to place the computing device into the stand-by mode.
    Type: Application
    Filed: May 14, 2013
    Publication date: September 4, 2014
    Applicant: Tobii Technology AB
    Inventors: David Henderek, Mårten Skogö, John Elvesjö
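The wake-zone and stand-by-icon behaviour in the abstract above amounts to a small state machine, which can be sketched as follows. The class name, zone rectangles, and return values are hypothetical illustrations, not taken from the patent:

```python
class GazeWake:
    """Toy state machine for gaze-based wake and stand-by transitions."""

    def __init__(self, wake_zone, standby_icon):
        self.wake_zone = wake_zone        # (x0, y0, x1, y1) rectangle
        self.standby_icon = standby_icon  # (x0, y0, x1, y1) rectangle
        self.standby = True               # device starts in stand-by mode

    @staticmethod
    def _inside(point, rect):
        x0, y0, x1, y1 = rect
        return x0 <= point[0] <= x1 and y0 <= point[1] <= y1

    def on_gaze(self, point):
        # In stand-by: a gaze inside the wake zone generates a wake command.
        if self.standby and self._inside(point, self.wake_zone):
            self.standby = False
            return "wake"
        # Awake: a gaze on the stand-by icon generates a stand-by command.
        if not self.standby and self._inside(point, self.standby_icon):
            self.standby = True
            return "standby"
        return None

dev = GazeWake(wake_zone=(0, 0, 1920, 1080), standby_icon=(1800, 0, 1920, 80))
print(dev.on_gaze((960, 540)))  # in wake zone while in stand-by -> "wake"
print(dev.on_gaze((1850, 40)))  # on stand-by icon while awake -> "standby"
```

In the patented system the returned command would be passed to a program module such as the operating system, which performs the actual mode switch.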
  • Publication number: 20140247215
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves gaze while continuing the action (e.g., continued holding of the touchpad).
    Type: Application
    Filed: March 3, 2014
    Publication date: September 4, 2014
    Applicant: Tobii Technology AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
  • Publication number: 20140043227
    Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at the normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
    Type: Application
    Filed: August 8, 2013
    Publication date: February 13, 2014
    Applicant: Tobii Technology AB
    Inventors: Mårten SKOGÖ, Anders OLSSON, John Mikael ELVESJÖ, Aron YU
  • Publication number: 20140009390
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: August 6, 2013
    Publication date: January 9, 2014
    Applicant: Tobii Technology AB
    Inventors: Christoffer BJÖRKLUND, Henrik ESKILSSON, Magnus JACOBSON, Mårten SKOGÖ
  • Publication number: 20130321270
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: August 6, 2013
    Publication date: December 5, 2013
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Christoffer BJÖRKLUND, Henrik ESKILSSON, Magnus JACOBSON, Mårten SKOGÖ
  • Publication number: 20130326431
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: August 6, 2013
    Publication date: December 5, 2013
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Christoffer BJÖRKLUND, Henrik ESKILSSON, Magnus JACOBSON, Mårten SKOGÖ
  • Publication number: 20130318457
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: August 6, 2013
    Publication date: November 28, 2013
    Applicant: TOBII TECHNOLOGY AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö