Patents by Inventor Mårten Skogö
Mårten Skogö has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10198070
Abstract: A gaze tracking system, leaving a low power mode in response to an activation signal, captures an initial burst of eye pictures in a short time by restricting the image area of a sensor, with the purpose of enabling an increased frame rate. Subsequent eye pictures are captured at the normal frame rate. The first gaze point value is computed memorylessly based on the initial burst of eye pictures and no additional imagery, while subsequent values may be computed recursively by taking into account previous gaze point values or information from previous eye pictures. The restriction of the image area may be guided by a preliminary overview picture captured using the same or a different sensor. From the gaze point values, the system may derive a control signal to be supplied to a computer device with a visual display.
Type: Grant
Filed: September 18, 2017
Date of Patent: February 5, 2019
Assignee: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, John Mikael Elvesjö, Aron Yu
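The two estimation regimes this abstract describes, a memoryless first fix computed only from the burst and recursive updates that reuse prior state, can be illustrated with a small sketch. This is not Tobii's implementation; the function names, the averaging used for the first fix, and the exponential smoothing used for the recursive update are all assumptions for illustration.

```python
def first_gaze_estimate(burst_points):
    # Memoryless first fix: average the per-frame gaze estimates
    # from the high-frame-rate burst, using no prior state.
    n = len(burst_points)
    return (sum(p[0] for p in burst_points) / n,
            sum(p[1] for p in burst_points) / n)

def recursive_gaze_update(prev_point, new_point, alpha=0.3):
    # Subsequent fixes: exponential smoothing that folds information
    # from the previous gaze point value into each new estimate.
    return ((1 - alpha) * prev_point[0] + alpha * new_point[0],
            (1 - alpha) * prev_point[1] + alpha * new_point[1])

# Hypothetical per-frame gaze estimates (screen pixels) from the burst.
burst = [(100.0, 200.0), (102.0, 198.0), (98.0, 202.0)]
gaze = first_gaze_estimate(burst)
gaze = recursive_gaze_update(gaze, (110.0, 200.0))
```

The smoothing factor `alpha` trades responsiveness against noise; the patent only requires that later values may depend on earlier ones, not this particular filter.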
-
Patent number: 10192109
Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Grant
Filed: April 18, 2016
Date of Patent: January 29, 2019
Assignee: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
-
Publication number: 20190025932
Abstract: A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to control the first motion sensor to detect first motion information, control the second motion sensor to detect second motion information, and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
Type: Application
Filed: July 30, 2018
Publication date: January 24, 2019
Applicant: Tobii AB
Inventors: Mårten Skogö, John Elvesjö, Jan-Erik Lundkvist, Per Lundberg
-
Publication number: 20190026369
Abstract: The invention generally relates to computer implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and on gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies the gaze filter criteria for each respective gaze search parameter. A user can initiate query searches for information on the computer device, or on information hosts connectable to the computer device via networks, by using combinations of gaze search parameters of the gaze search index and non-gaze search parameters of a non-gaze search index.
Type: Application
Filed: August 1, 2018
Publication date: January 24, 2019
Applicant: Tobii AB
Inventors: Anders Olsson, Mårten Skogö
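The gaze search index this abstract describes, links from gaze search parameters to presented items whose gaze data satisfied the matching filter criterion, queryable in combination with non-gaze parameters, might look like this in miniature. The parameter names ("dwelled", "skimmed"), the dwell-time threshold, and the intersection-based query are illustrative assumptions only, not the patent's method.

```python
def build_gaze_index(presented_items, gaze_durations_ms, dwell_threshold_ms=500):
    # Link each gaze search parameter to the presented items whose
    # gaze data satisfied that parameter's filter criterion.
    index = {"dwelled": [], "skimmed": []}
    for item, dur in zip(presented_items, gaze_durations_ms):
        key = "dwelled" if dur >= dwell_threshold_ms else "skimmed"
        index[key].append(item)
    return index

def query(gaze_index, gaze_param, keyword_index, keyword):
    # Combine one gaze search parameter with one non-gaze parameter
    # by intersecting the two result sets.
    return sorted(set(gaze_index.get(gaze_param, [])) &
                  set(keyword_index.get(keyword, [])))
```

A search such as `query(idx, "dwelled", kw_idx, "report")` would then return only items the user both looked at for a while and that match the keyword.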
-
Patent number: 10185394
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: December 11, 2017
Date of Patent: January 22, 2019
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20190011986
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: September 12, 2018
Publication date: January 10, 2019
Applicant: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20180364802
Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on the user removing contact of a finger with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.
Type: Application
Filed: June 14, 2018
Publication date: December 20, 2018
Applicant: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
-
Publication number: 20180349698
Abstract: A system for determining a gaze direction of a user is disclosed. The system may include a first illuminator, a first profile sensor, and at least one processor. The first illuminator may be configured to illuminate an eye of a user. The first profile sensor may be configured to detect light reflected by the eye of the user. The processor(s) may be configured to determine a gaze direction of the user based at least in part on light detected by the first profile sensor.
Type: Application
Filed: May 18, 2018
Publication date: December 6, 2018
Applicant: Tobii AB
Inventors: Simon Gustafsson, Daniel Johansson Tornéus, Jonas Andersson, Johannes Kron, Mårten Skogö, Ralf Biedert
-
Publication number: 20180335838
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged ergonomically unfavorably for the user, or in computer systems where it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: May 21, 2018
Publication date: November 22, 2018
Applicant: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
-
Publication number: 20180329510
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: May 8, 2018
Publication date: November 15, 2018
Applicant: Tobii AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Patent number: 10114459
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: November 10, 2017
Date of Patent: October 30, 2018
Assignee: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10116846
Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
Type: Grant
Filed: March 8, 2017
Date of Patent: October 30, 2018
Assignee: Tobii AB
Inventors: David Figgins Henderek, Mårten Skogö
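One way to model a microphone that is more sensitive in a selected direction, with that direction slaved to the tracked gaze, is a cardioid sensitivity pattern steered to the gaze angle. The sketch below rests on that assumption; the cardioid model and the angle convention are not taken from the patent.

```python
import math

def directional_gain(source_angle_deg, steer_angle_deg):
    # Cardioid pattern: gain 1.0 toward the steered (gaze) direction,
    # falling to 0.0 directly behind it.
    diff = math.radians(source_angle_deg - steer_angle_deg)
    return 0.5 * (1.0 + math.cos(diff))

# Steering follows the gaze: a sound source at 30 degrees is picked up
# at full gain when the user is looking toward 30 degrees.
gain = directional_gain(30.0, steer_angle_deg=30.0)
```

Each new gaze direction from the tracker would simply update `steer_angle_deg`, re-aiming the sensitivity lobe without any mechanical movement.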
-
Publication number: 20180307324
Abstract: A computer apparatus is associated with a graphical display presenting at least one GUI-component adapted to be manipulated based on user-generated commands. An event engine is adapted to receive an eye-tracking data signal that describes a user's point of regard on the display. Based on the signal, the event engine produces a set of non-cursor controlling event output signals, which influence the at least one GUI-component. Each non-cursor controlling event output signal describes a particular aspect of the user's ocular activity in respect of the display. Initially, the event engine receives a control signal request from each of the at least one GUI-component. The control signal request defines a sub-set of the set of non-cursor controlling event output signals which is required by the particular GUI-component. The event engine delivers non-cursor controlling event output signals to the at least one GUI-component in accordance with each respective control signal request.
Type: Application
Filed: April 23, 2018
Publication date: October 25, 2018
Applicant: Tobii AB
Inventors: Christoffer Björklund, Henrik Eskilsson, Magnus Jacobson, Mårten Skogö
-
Publication number: 20180270436
Abstract: Techniques for reducing the readout time and power consumption of an image sensor used for eye tracking are described. In an example, a position of an eye element in an active area of a sensor is determined. The eye element can be any of an eye, a pupil of the eye, an iris of the eye, or a glint at the eye. A region of interest (ROI) around the position of the eye element is defined. The image sensor reads out pixels confined to the ROI, thereby generating an ROI image that shows the eye element.
Type: Application
Filed: May 10, 2018
Publication date: September 20, 2018
Applicant: Tobii AB
Inventors: Magnus Ivarsson, Per-Edvin Stoltz, David Masko, Niklas Ollesson, Mårten Skogö, Peter Blixt, Henrik Jönsson
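The ROI readout described above, reading out only pixels confined to a window around the detected eye element, can be sketched as follows. The window clamping and the list-of-lists frame are illustrative assumptions; a real sensor would expose hardware ROI registers rather than crop a full frame in software.

```python
def roi_bounds(center, roi_size, sensor_w, sensor_h):
    # Clamp a square ROI of roi_size pixels around the detected
    # eye-element position to the sensor's active area.
    cx, cy = center
    half = roi_size // 2
    x0 = max(0, min(cx - half, sensor_w - roi_size))
    y0 = max(0, min(cy - half, sensor_h - roi_size))
    return x0, y0, roi_size, roi_size

def read_roi(frame, bounds):
    # Read out only the pixels confined to the ROI, producing the
    # smaller ROI image that still shows the eye element.
    x0, y0, w, h = bounds
    return [row[x0:x0 + w] for row in frame[y0:y0 + h]]
```

Reading, say, a 4x4 window instead of the full frame is what yields the reduced readout time and power draw the abstract claims.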
-
Publication number: 20180242841
Abstract: A subject is illuminated whose movements of at least one eye are to be registered by an eye-tracker for producing resulting data. To this aim, a coherent light source in light producing means produces at least one well-defined light beam. A diffractive optical element is arranged between the coherent light source and an output from the light producing means. The diffractive optical element is configured to direct a light beam towards at least a first designated area estimated to cover the at least one eye of the subject. Image registering means registers light from the at least one light beam having been reflected against the subject. The resulting data is produced in response to these light reflections, which resulting data represents movements of the at least one eye of the subject.
Type: Application
Filed: January 10, 2018
Publication date: August 30, 2018
Applicant: Tobii AB
Inventors: Richard Hainzl, Mattias Kuldkepp, Peter Blixt, Mårten Skogö
-
Publication number: 20180239412
Abstract: A personal computer system comprises a visual display, an imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer of the visual display, and identifying means for recognizing the viewer with reference to one of a plurality of predefined personal profiles. The personal computer system further comprises an eye-tracking processor for processing the eye-tracking data. According to the invention, the eye-tracking processor is selectively operable in one of a plurality of personalized active sub-modes associated with said personal profiles. The sub-modes may differ with regard to eye-tracking related or power-management related settings. Further, the identifying means may sense an identified viewer's actual viewing condition (e.g., use of viewing aids or wearing of garments), wherein the imaging device is further operable in a sub-profile mode associated with the determined actual viewing condition.
Type: Application
Filed: October 16, 2017
Publication date: August 23, 2018
Applicant: Tobii AB
Inventors: John Mikael Elvesjö, Anders Kingbäck, Gunnar Troili, Mårten Skogö, Henrik Eskilsson, Peter Blixt
-
Patent number: 10055495
Abstract: The invention generally relates to computer implemented systems and methods for utilizing detection of eye movements in connection with interactive graphical user interfaces and, in particular, for utilizing eye tracking data to facilitate and improve information search and presentation in connection with interactive graphical user interfaces. A gaze search index is determined based on information that has been presented on an information presentation area and on gaze data signals. The gaze search index comprises links between gaze search parameters and presented information that satisfies the gaze filter criteria for each respective gaze search parameter. A user can initiate query searches for information on the computer device, or on information hosts connectable to the computer device via networks, by using combinations of gaze search parameters of the gaze search index and non-gaze search parameters of a non-gaze search index.
Type: Grant
Filed: October 29, 2012
Date of Patent: August 21, 2018
Assignee: Tobii AB
Inventors: Anders Olsson, Mårten Skogö
-
Publication number: 20180232049
Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device and a graphics processing device. The eye tracking device may be for determining a gaze point of a user on a display device. The graphics processing device may be for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics are of higher quality in an area including the gaze point of the user than outside the area.
Type: Application
Filed: January 8, 2018
Publication date: August 16, 2018
Applicant: Tobii AB
Inventors: Robin Thunström, Mårten Skogö
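The foveated-rendering idea above, higher quality inside an area containing the gaze point than outside it, reduces in its simplest two-tier form to a per-pixel distance test against the gaze point. A minimal sketch follows; the circular fovea, its radius, and the two discrete quality tiers are assumptions for illustration, not the patent's method.

```python
def quality_level(pixel, gaze_point, fovea_radius):
    # Two-tier foveation: render at full quality inside the circular
    # gaze area, at reduced quality everywhere else.
    dx = pixel[0] - gaze_point[0]
    dy = pixel[1] - gaze_point[1]
    return "high" if dx * dx + dy * dy <= fovea_radius ** 2 else "low"

# Hypothetical use: classify two pixels against a gaze point at (12, 10).
near = quality_level((10, 10), (12, 10), fovea_radius=5)
far = quality_level((100, 10), (12, 10), fovea_radius=5)
```

In practice a GPU would apply this per tile or via variable-rate shading rather than per pixel, but the saving comes from the same observation: visual acuity drops sharply outside the fovea.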
-
Publication number: 20180224933
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of the driver. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: November 10, 2017
Publication date: August 9, 2018
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10037086
Abstract: A portable computing device is disclosed which may include a base element, a lid element, a first motion sensor, a second motion sensor, a processor, and an eye tracking system. The first motion sensor may be disposed in the base element. The second motion sensor may be disposed in the lid element. The processor may be configured to control the first motion sensor to detect first motion information, control the second motion sensor to detect second motion information, and determine final motion information based at least in part on the first motion information and the second motion information. The eye tracking system may be configured to determine a gaze position of a user based at least in part on the final motion information, wherein the processor is further configured to execute one or more control processes based at least in part on the determined gaze position meeting a predetermined condition.
Type: Grant
Filed: June 30, 2017
Date of Patent: July 31, 2018
Assignee: Tobii AB
Inventors: Mårten Skogö, John Elvesjö, Jan-Erik Lundkvist, Per Lundberg