Patents by Inventor Mårten Skogö

Mårten Skogö has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180329510
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: May 8, 2018
    Publication date: November 15, 2018
    Applicant: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
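
The abstract above describes an event engine to which each GUI component sends a "control signal request" naming the subset of non-cursor controlling events it needs, and the engine then delivers only those events to that component. A minimal sketch of that subscribe-and-dispatch pattern in Python follows; the event names, class names, and callback signature are illustrative assumptions, not part of the patent:

from collections import defaultdict

# Hypothetical names for the non-cursor controlling signals the abstract
# alludes to (aspects of the user's ocular activity on the display).
EVENT_TYPES = {"fixation", "dwell", "saccade", "gaze_enter", "gaze_leave"}


class EventEngine:
    """Dispatches eye-tracking events only to components that asked for them."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # event type -> list of components

    def register(self, component, requested_events):
        """A GUI component's 'control signal request': the subset of events it needs."""
        for event in requested_events & EVENT_TYPES:
            self._subscribers[event].append(component)

    def dispatch(self, event, payload):
        """Deliver one non-cursor controlling event to every component that requested it."""
        for component in self._subscribers.get(event, []):
            component.on_gaze_event(event, payload)


class Button:
    def __init__(self, name):
        self.name = name

    def on_gaze_event(self, event, payload):
        print(f"{self.name} received {event}: {payload}")


engine = EventEngine()
engine.register(Button("activate"), {"dwell", "gaze_enter"})
engine.dispatch("dwell", {"x": 412, "y": 300, "ms": 650})

The request step means the engine delivers only the signals each component declared it needs, which is the behavior the abstract describes.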
  • Patent number: 10114459
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 10, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
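
The final step of the method above, turning a gaze direction into a gaze point on the scene image from the outward-facing camera, amounts to projecting a direction through the scene camera's model. A small illustrative sketch, assuming a pinhole camera and a gaze direction expressed as a unit vector in the scene-camera frame (both assumptions; the patent does not fix a camera model):

import numpy as np

def gaze_point_on_scene(gaze_dir, focal_px, image_size):
    """Project a gaze direction (unit vector in the scene-camera frame,
    z pointing out of the camera) onto the scene image, returning pixel coords."""
    x, y, z = gaze_dir
    if z <= 0:
        return None  # gaze is not in front of the scene camera
    w, h = image_size
    u = w / 2 + focal_px * x / z
    v = h / 2 + focal_px * y / z
    if 0 <= u < w and 0 <= v < h:
        return (u, v)
    return None  # gaze point falls outside the captured scene image

print(gaze_point_on_scene(np.array([0.05, -0.02, 0.998]),
                          focal_px=900, image_size=(1280, 720)))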
  • Patent number: 10116846
    Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
    Type: Grant
    Filed: March 8, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: David Figgins Henderek, Mårten Skogö
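
The system above steers the microphone's sensitive direction so that it follows the gaze direction. One common way to make a microphone array more sensitive in a chosen direction is delay-and-sum beamforming; the toy sketch below steers a two-microphone array toward a gaze azimuth. The array geometry, sample rate, and gaze-to-azimuth mapping are assumptions, and np.roll stands in for a proper fractional-delay filter:

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
MIC_SPACING = 0.1       # m, assumed two-microphone array
SAMPLE_RATE = 16000     # Hz

def steer_delay_samples(gaze_azimuth_rad):
    """Inter-microphone delay (in samples) that aligns the array with the gaze azimuth."""
    delay_s = MIC_SPACING * np.sin(gaze_azimuth_rad) / SPEED_OF_SOUND
    return int(round(delay_s * SAMPLE_RATE))

def delay_and_sum(left, right, gaze_azimuth_rad):
    """Sum the two channels after delaying one of them so that sound arriving
    from the gaze direction adds coherently (the 'selected direction')."""
    d = steer_delay_samples(gaze_azimuth_rad)
    if d >= 0:
        right = np.roll(right, d)
    else:
        left = np.roll(left, -d)
    return 0.5 * (left + right)

# Example: steer 20 degrees to the right of straight ahead.
t = np.arange(0, 0.01, 1 / SAMPLE_RATE)
sig = np.sin(2 * np.pi * 440 * t)
out = delay_and_sum(sig, sig, np.deg2rad(20))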
  • Publication number: 20180307324
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Application
    Filed: April 23, 2018
    Publication date: October 25, 2018
    Applicant: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Patent number: 10082870
    Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device for determining a gaze point of a user on a display device. The system may also include a graphics processing device for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics in an area including the gaze point of the user have at least one modified parameter relative to graphics outside the area. The size of the area may be based at least in part on an amount of noise in the gaze point over time and at least one other secondary factor.
    Type: Grant
    Filed: March 31, 2017
    Date of Patent: September 25, 2018
    Assignee: Tobii AB
    Inventors: Robin Thunström, Mårten Skogö, Denny Rönngren, Anders Clausen
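
The claim above makes the size of the full-quality area depend on the noise in the gaze point over time plus at least one secondary factor. A minimal sketch of one way to compute such a radius, using the spread of recent gaze samples and an assumed latency factor (the specific formula is invented for illustration):

import math
from statistics import pstdev

def foveated_region_radius(gaze_x_history, gaze_y_history,
                           base_radius_px=100, noise_gain=3.0, latency_factor=1.0):
    """Radius (px) of the full-quality region around the gaze point.

    The radius grows with the spread (noise) of recent gaze samples and with a
    secondary factor such as system latency, so the high-detail area keeps
    covering where the eye may actually be."""
    noise = math.hypot(pstdev(gaze_x_history), pstdev(gaze_y_history))
    return base_radius_px + noise_gain * noise * latency_factor

print(foveated_region_radius([510, 514, 508, 520], [300, 305, 298, 310]))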
  • Publication number: 20180270436
    Abstract: Techniques for reducing a read out time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
    Type: Application
    Filed: May 10, 2018
    Publication date: September 20, 2018
    Applicant: Tobii AB
    Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
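
The read-out scheme above is straightforward to sketch: clamp a fixed-size region of interest around the detected eye position to the sensor bounds and read only those pixels. The ROI dimensions and the nested-list frame representation below are assumptions for illustration:

def region_of_interest(eye_x, eye_y, sensor_w, sensor_h, roi_w=320, roi_h=240):
    """Clamp a fixed-size ROI around the detected eye element to the sensor area.

    Reading out only these pixels (instead of the full frame) is what reduces
    read-out time and power in the abstract above."""
    x0 = min(max(eye_x - roi_w // 2, 0), sensor_w - roi_w)
    y0 = min(max(eye_y - roi_h // 2, 0), sensor_h - roi_h)
    return x0, y0, roi_w, roi_h

def read_roi(frame, roi):
    """Extract only the ROI pixels from a frame given as a list of rows."""
    x0, y0, w, h = roi
    return [row[x0:x0 + w] for row in frame[y0:y0 + h]]

print(region_of_interest(eye_x=900, eye_y=200, sensor_w=1280, sensor_h=960))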
  • Publication number: 20180242841
    Abstract: A subject is illuminated whose movements of at least one eye are to be registered by an eye-tracker for producing resulting data. To this aim, a coherent light source in light producing means produces at least one well-defined light beam. A diffractive optical element is arranged between the coherent light source and an output from the light producing means. The diffractive optical element is configured to direct a light beam towards at least a first designated area estimated to cover the at least one eye of the subject. Image registering means registers light from the at least one light beam having been reflected against the subject. The resulting data is produced in response to these light reflections, which resulting data represents movements of the at least one eye of the subject.
    Type: Application
    Filed: January 10, 2018
    Publication date: August 30, 2018
    Applicant: Tobii AB
    Inventors: Richard Hainzl, Mattias Kuldkepp, Peter Blixt, Mårten Skogö
  • Publication number: 20180239412
    Abstract: A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub-profile mode associated with the determined actual viewing condition.
    Type: Application
    Filed: October 16, 2017
    Publication date: August 23, 2018
    Applicant: Tobii AB
    Inventors: John Mikael Elvesjö, Anders Kingbäck, Gunnar Troili, Mårten Skogö, Henrik Eskilsson, Peter Blixt
  • Patent number: 10055495
    Abstract: The invention generally relates to computer implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies gaze filter criteria for respective gaze search parameter. A user can initiate query searches for information on the computer device or on information hosts connectable to the computer device via networks by using combinations of gaze search parameters of the gaze search index and non gaze search parameters of a non gaze search index.
    Type: Grant
    Filed: October 29, 2012
    Date of Patent: August 21, 2018
    Assignee: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö
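
The abstract above links gaze search parameters to presented information that satisfied a gaze filter, and then lets queries combine gaze and non-gaze parameters. A minimal sketch with a single dwell-time filter; the index structure, parameter names, and query semantics are illustrative assumptions:

from collections import defaultdict

class GazeSearchIndex:
    """Links gaze search parameters to presented items that satisfied a gaze filter."""

    def __init__(self, min_dwell_ms=300):
        self.min_dwell_ms = min_dwell_ms   # gaze filter criterion
        self._index = defaultdict(set)     # gaze search parameter -> items

    def record(self, item, dwell_ms):
        if dwell_ms >= self.min_dwell_ms:  # only sufficiently gazed-at content is indexed
            self._index["dwelled"].add(item)

    def query(self, gaze_params, non_gaze_params, catalog):
        """Combine gaze search parameters with ordinary (non-gaze) search terms."""
        gazed = set().union(*(self._index[p] for p in gaze_params)) if gaze_params else set(catalog)
        return [item for item in gazed
                if all(term.lower() in item.lower() for term in non_gaze_params)]

index = GazeSearchIndex()
index.record("Quarterly report 2018", dwell_ms=450)
index.record("Holiday photos", dwell_ms=120)
print(index.query({"dwelled"}, ["report"],
                  catalog=["Quarterly report 2018", "Holiday photos"]))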
  • Publication number: 20180232049
    Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device and a graphics processing device. The eye tracking device may be for determining a gaze point of a user on a display device. The graphics processing device may be for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics are of higher quality in an area including the gaze point of the user, than outside the area.
    Type: Application
    Filed: January 8, 2018
    Publication date: August 16, 2018
    Applicant: Tobii AB
    Inventors: Robin Thunström, Mårten Skogö
  • Publication number: 20180232575
    Abstract: A method of presenting gaze-point data of a subject detected by an eye-tracking unit includes presenting a test scene picture acquired by a camera unit, and displaying shapes on the test scene picture. The shapes represent momentary gaze points of the subject.
    Type: Application
    Filed: October 2, 2017
    Publication date: August 16, 2018
    Applicant: Tobii AB
    Inventors: Johan Strömbom, Mårten Skogö, Per Nystedt, Simon Gustafsson, John Mikael Holtz Elvesjö, Peter Blixt
  • Publication number: 20180224933
    Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
    Type: Application
    Filed: November 10, 2017
    Publication date: August 9, 2018
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
  • Patent number: 10037086
    Abstract: A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to control the first motion sensor to detect first motion information, control the second motion sensor to detect second motion information; and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: July 31, 2018
    Assignee: Tobii AB
    Inventors: Mårten Skogö, John Elvesjö, Jan-Erik Lundkvist, Per Lundberg
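
The patent above combines motion readings from a sensor in the base and a sensor in the lid into final motion information that conditions the eye tracker's behavior. A toy sketch of such a fusion step and one possible "predetermined condition"; the per-axis weighted average and the stability threshold are assumptions, not the patented method:

def fuse_motion(base_gyro, lid_gyro, base_weight=0.5):
    """Combine angular-rate readings from the base and lid sensors into one
    'final motion' estimate (here a simple per-axis weighted average)."""
    return tuple(base_weight * b + (1 - base_weight) * l
                 for b, l in zip(base_gyro, lid_gyro))

def device_is_stable(final_motion, threshold=0.05):
    """Example 'predetermined condition': only act on the gaze estimate while
    the device is not being moved around."""
    return all(abs(axis) < threshold for axis in final_motion)

motion = fuse_motion(base_gyro=(0.01, 0.00, 0.02), lid_gyro=(0.03, 0.01, 0.02))
print(device_is_stable(motion))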
  • Patent number: 10035518
    Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
    Type: Grant
    Filed: February 23, 2017
    Date of Patent: July 31, 2018
    Assignee: Tobii AB
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
  • Patent number: 10025381
    Abstract: A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; and determine at least one user generated gesture based control command based on a user removing contact of a finger of the user with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and at least one user generated gesture based control command, wherein the user action is executed at said determined gaze point area.
    Type: Grant
    Filed: January 20, 2015
    Date of Patent: July 17, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
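
The control module above fires when the user lifts a finger from the touchpad, resolves a gaze point area from the gaze data signals, and executes the gesture's action there. A minimal sketch of that release-then-act flow; the gaze-area computation (a simple centroid) and the View/apply interface are invented for illustration:

def on_touchpad_release(gaze_samples, view, action="zoom_in", radius_px=80):
    """When the finger leaves the touchpad, resolve a gaze point area from the
    recent gaze signal and apply the gesture's action there."""
    xs = [x for x, _ in gaze_samples]
    ys = [y for _, y in gaze_samples]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)  # centre of the gaze point area
    view.apply(action, centre=(cx, cy), radius=radius_px)

class View:
    def apply(self, action, centre, radius):
        print(f"{action} at {centre} (r={radius}px)")

on_touchpad_release([(640, 350), (652, 344), (645, 360)], View())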
  • Patent number: 10025389
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: December 22, 2011
    Date of Patent: July 17, 2018
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Patent number: 10013053
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
    Type: Grant
    Filed: October 5, 2012
    Date of Patent: July 3, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Publication number: 20180181272
    Abstract: Disclosed are various embodiments for automatic scrolling of content displayed on a display device in response to gaze detection. Content may be displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action may be initiated. The first scroll action causes the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, and detecting an indicator that the user begins reading the content.
    Type: Application
    Filed: December 21, 2017
    Publication date: June 28, 2018
    Applicant: Tobii AB
    Inventors: Anders Olsson, Mårten Skogö
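
The scrolling scheme above maps the gaze point to an applicable scroll zone, starts the associated scroll action, and stops on one of several conditions (timer expiry, content position, gaze leaving the zone, or reading detected). A small sketch of the zone mapping and stop test; the zone sizes and timeout are assumed values:

def scroll_zone(gaze_y, screen_h, zone_fraction=0.15):
    """Map a vertical gaze position to an applicable scroll zone and its action."""
    if gaze_y > screen_h * (1 - zone_fraction):
        return "bottom", "scroll_down"
    if gaze_y < screen_h * zone_fraction:
        return "top", "scroll_up"
    return None, None

def should_stop(elapsed_ms, gaze_y, screen_h, reading_detected,
                max_ms=4000, zone_fraction=0.15):
    """Stop conditions from the abstract: timer expiry, gaze leaving the zone,
    or the user starting to read. (A fuller version would also track whether
    the content has scrolled past a defined position in the window.)"""
    zone, _ = scroll_zone(gaze_y, screen_h, zone_fraction)
    return elapsed_ms > max_ms or zone is None or reading_detected

print(scroll_zone(gaze_y=1020, screen_h=1080))
print(should_stop(elapsed_ms=1200, gaze_y=1020, screen_h=1080, reading_detected=False))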
  • Patent number: 9996159
    Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
    Type: Grant
    Filed: August 6, 2013
    Date of Patent: June 12, 2018
    Assignee: Tobii AB
    Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
  • Publication number: 20180157323
    Abstract: There is provided methods, systems and computer program products for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Application
    Filed: November 27, 2017
    Publication date: June 7, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Mårten Skogö, John Elvesjö
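
The method above scores each interaction element by the likelihood that it is the subject of the gaze information and identifies an element based on that likelihood. A minimal sketch using a Gaussian falloff with distance from the element centre, weighted by element size; that particular scoring function is an assumption, not taken from the application:

import math

def most_likely_element(gaze_xy, elements, sigma_px=60.0):
    """Score each interaction element by how likely it is to be the subject of
    the gaze signal (Gaussian falloff with distance from its centre, weighted
    by element size) and return the best-scoring one."""
    gx, gy = gaze_xy
    best, best_score = None, 0.0
    for name, (cx, cy, w, h) in elements.items():
        dist2 = (gx - cx) ** 2 + (gy - cy) ** 2
        score = math.exp(-dist2 / (2 * sigma_px ** 2)) * math.sqrt(w * h)
        if score > best_score:
            best, best_score = name, score
    return best, best_score

elements = {"OK": (600, 400, 120, 40), "Cancel": (760, 400, 120, 40)}
print(most_likely_element((615, 395), elements))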