Patents by Inventor Mårten Skogö
Mårten Skogö has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20170177079
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2017
Publication date: June 22, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
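As a rough illustration of the delay-and-abort scheme this abstract describes, the sketch below moves a cursor to the gaze target only after the triggering action has been held for a fixed period; releasing earlier aborts the move. The function name, the delay value, and the pure-function structure are assumptions for illustration, not taken from the patent.

```python
import time

MOVE_DELAY = 0.3  # seconds the action must be held before the cursor jumps (assumed value)

def update_cursor(cursor_pos, gaze_target, action_held, hold_start, now=None):
    """Return the new visual-indicator position under the delayed-move scheme.

    The indicator jumps to the gaze target only after the action (e.g. a
    touchpad touch) has been held for MOVE_DELAY seconds; releasing the
    action before that "aborts" the move and the indicator stays put.
    """
    now = time.monotonic() if now is None else now
    if action_held and hold_start is not None and now - hold_start >= MOVE_DELAY:
        return gaze_target   # move the indicator to where the user is looking
    return cursor_pos        # aborted, or still inside the delay window
```

Passing `now` explicitly makes the delay logic testable without real clock time.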
-
Patent number: 9665171
Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device, at least one processor, and a graphics processing device. The eye tracking device may be for determining at least one of a gaze point of a user on a display device, or a change in the gaze point of the user on the display device. The processor may be for receiving data from the eye tracking device and modifying use of at least one system resource based at least in part on the data received from the eye tracking device. The graphics processing device may be for causing an image to be displayed on the display device.
Type: Grant
Filed: March 4, 2014
Date of Patent: May 30, 2017
Assignee: Tobii AB
Inventors: Mårten Skogö, Anders Olsson, Erik Kenneth Holmgren
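One plausible instance of "modifying use of at least one system resource" based on gaze data is throttling the render rate when the user is not looking at the display. The policy and the rate values below are illustrative assumptions, not the patent's claimed method.

```python
def choose_render_rate(gaze_on_display, full_rate_hz=60, idle_rate_hz=10):
    """Pick a render/refresh rate from eye-tracking data: full rate while the
    user looks at the display, a throttled rate otherwise to save the
    resource (both rates are assumed example values)."""
    return full_rate_hz if gaze_on_display else idle_rate_hz
```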
-
Patent number: 9649029
Abstract: A method includes obtaining an image of an eye in a bright-pupil imaging mode in which a retinal retro-reflection complements the image of the eye in the bright-pupil mode, and obtaining an image of the eye in a dark-pupil imaging mode in which a cornea-scleral retro-reflection complements the image of the eye in the dark-pupil mode. One of the bright-pupil imaging mode and the dark-pupil imaging mode is selected based on quality of obtained images.
Type: Grant
Filed: December 8, 2014
Date of Patent: May 16, 2017
Assignee: Tobii AB
Inventors: Peter Blixt, Gunnar Troili, Anders Kingbäck, John Mikael Elvesjö, Mårten Skogö
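The mode selection based on image quality could be sketched as below, where each mode has a quality score (e.g. pupil contrast) and a hysteresis margin keeps the tracker from oscillating between modes. The scoring and the margin are assumptions for illustration; the patent does not specify this policy.

```python
BRIGHT, DARK = "bright-pupil", "dark-pupil"

def select_imaging_mode(bright_quality, dark_quality, current_mode, margin=0.1):
    """Choose between bright- and dark-pupil imaging from per-mode image
    quality scores. The other mode must beat the current one by `margin`
    (an assumed hysteresis value) before the tracker switches."""
    if current_mode == BRIGHT and dark_quality > bright_quality + margin:
        return DARK
    if current_mode == DARK and bright_quality > dark_quality + margin:
        return BRIGHT
    return current_mode  # no clear winner: stay in the current mode
```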
-
Publication number: 20170109513
Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Application
Filed: December 30, 2016
Publication date: April 20, 2017
Applicant: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
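A common way such an "authentication unit" might compare eye information is to match a fresh feature vector against an enrolled template. The cosine-similarity test and threshold below are generic biometric-matching assumptions, not the patent's disclosed mechanism.

```python
def authenticate(enrolled_template, candidate_features, threshold=0.9):
    """Accept the user if the cosine similarity between the enrolled
    eye-feature vector and the freshly extracted one meets a threshold
    (the threshold is an assumed example value)."""
    dot = sum(a * b for a, b in zip(enrolled_template, candidate_features))
    norm = (sum(a * a for a in enrolled_template) ** 0.5) * \
           (sum(b * b for b in candidate_features) ** 0.5)
    return norm > 0 and dot / norm >= threshold
```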
-
Patent number: 9619020
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Grant
Filed: March 3, 2014
Date of Patent: April 11, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
-
Publication number: 20170090566
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: December 14, 2016
Publication date: March 30, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
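The core idea, a gesture supplying the "what" and the gaze point supplying the "where" of a touch-like event, can be sketched as a simple dispatcher. The gesture names and event mapping are hypothetical; the patent does not enumerate them.

```python
def gaze_gesture_event(gesture, gaze_point):
    """Translate a touchpad gesture into a touch-screen-like event delivered
    at the user's current gaze point instead of a physical touch location.
    The gesture-to-event mapping below is illustrative only."""
    mapping = {
        "tap": "click",             # tap anywhere on the pad -> click at gaze
        "two_finger_swipe": "scroll",
        "pinch": "zoom",
    }
    event = mapping.get(gesture)
    return (event, gaze_point) if event else None  # unknown gestures ignored
```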
-
Publication number: 20170091549
Abstract: A system for determining a gaze direction of a user is disclosed. The system may include a first illuminator, a first profile sensor, and at least one processor. The first illuminator may be configured to illuminate an eye of a user. The first profile sensor may be configured to detect light reflected by the eye of the user. The processor(s) may be configured to determine a gaze direction of the user based at least in part on light detected by the first profile sensor.
Type: Application
Filed: September 26, 2016
Publication date: March 30, 2017
Applicant: Tobii AB
Inventors: Simon Gustafsson, Daniel Johansson Tornéus, Jonas Andersson, Johannes Kron, Mårten Skogö, Ralf Biedert
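A profile sensor outputs one-dimensional row/column sums rather than a full image; locating the reflection peak in such a profile is one building block of this kind of gaze estimation. The sketch below only finds that peak; turning it into a gaze direction would require calibration the abstract does not detail.

```python
def glint_position(profile):
    """Index of the peak in a 1-D light profile (e.g. the column sums a
    profile sensor reports); the peak is taken as the reflection from the
    illuminated eye. A minimal sketch, assuming a single dominant peak."""
    return max(range(len(profile)), key=profile.__getitem__)
```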
-
Patent number: 9596391
Abstract: According to the invention, a system for converting sound to electrical signals is disclosed. The system may include a gaze tracking device and a microphone. The gaze tracking device may determine a gaze direction of a user. The microphone may be more sensitive in a selected direction than at least one other direction and alter the selected direction based at least in part on the gaze direction determined by the gaze tracking device.
Type: Grant
Filed: August 5, 2014
Date of Patent: March 14, 2017
Assignee: Tobii AB
Inventors: David Figgins Henderek, Mårten Skogö
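Steering a directional microphone toward the gaze reduces, at minimum, to converting the gaze vector into a steering angle. The coordinate convention below (z forward, x right, azimuth in degrees) is an assumption for illustration.

```python
import math

def microphone_azimuth(gaze_dir):
    """Convert a gaze-direction vector (x, y, z) into the azimuth angle, in
    degrees, at which to point the microphone's sensitive lobe. Assumes a
    right-handed frame with z pointing forward and x to the right."""
    x, _, z = gaze_dir
    return math.degrees(math.atan2(x, z))  # 0° straight ahead, +90° to the right
```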
-
Patent number: 9580081
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Grant
Filed: January 26, 2015
Date of Patent: February 28, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
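One concrete reading of "modifying operation of at least one system" is making a forward-collision warning fire earlier the longer the driver's gaze has been off the road. The policy and every number below are illustrative assumptions, not the patent's claims.

```python
def warning_threshold(gaze_on_road, off_road_seconds, base_threshold=2.0):
    """Time-to-collision threshold (seconds) at which a warning fires,
    adjusted by gaze data: the longer the driver has looked away from the
    road, the earlier the warning. All values are assumed examples."""
    if gaze_on_road:
        return base_threshold
    return base_threshold + min(off_road_seconds, 3.0)  # cap the extra margin
```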
-
Publication number: 20170038835
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: November 30, 2015
Publication date: February 9, 2017
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
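The last step, mapping a gaze direction onto a scene-camera image, can be sketched as a projection from normalized gaze coordinates to pixel coordinates, assuming the eye tracker has been calibrated to the outward-facing camera's field of view. The normalized-coordinate interface is an assumption.

```python
def gaze_point_on_scene(gaze_norm, image_size):
    """Map a gaze direction expressed as normalized (u, v) coordinates in
    [0, 1] of the scene camera's field of view to a pixel (x, y) on the
    scene image. Assumes prior eye-to-scene-camera calibration."""
    (u, v), (w, h) = gaze_norm, image_size
    return (int(u * (w - 1)), int(v * (h - 1)))
```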
-
Publication number: 20170011492
Abstract: According to the invention, a system for presenting graphics on a display device is disclosed. The system may include an eye tracking device and a graphics processing device. The eye tracking device may be for determining a gaze point of a user on a display device. The graphics processing device may be for causing graphics to be displayed on the display device. The graphics displayed on the display device may be modified such that the graphics are of higher quality in an area including the gaze point of the user, than outside the area.
Type: Application
Filed: September 20, 2016
Publication date: January 12, 2017
Applicant: Tobii AB
Inventors: Robin Thunström, Mårten Skogö
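This is the general idea behind foveated rendering: higher quality inside a region around the gaze point, lower quality outside. A minimal per-tile version, with an assumed circular region and radius, might look like this.

```python
def quality_for_tile(tile_center, gaze_point, fovea_radius=200):
    """Per-tile render quality for gaze-contingent rendering: full quality
    inside a circle around the gaze point, reduced quality outside. The
    radius (in pixels) and the two-level scheme are assumed simplifications."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    return "high" if dx * dx + dy * dy <= fovea_radius ** 2 else "low"
```

Real systems typically use several concentric quality bands rather than two levels.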
-
Publication number: 20160353025
Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured, in the active mode, to use active eye illumination, which enables tracking of a corneal reflection, and to provide eye tracking data which include eye position and eye orientation; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode.Type: Application
Filed: August 10, 2016
Publication date: December 1, 2016
Applicant: TOBII AB
Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
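The two-mode behavior described here (and in the related filings below) amounts to a small state machine: full illumination and full gaze data in the active mode, reduced illumination and position-only data in the ready mode. The switching policy and intensity values below are assumptions for illustration.

```python
ACTIVE, READY = "active", "ready"

def illumination_level(mode, active_level=1.0, ready_level=0.1):
    """Illumination intensity per mode: full active illumination enables
    corneal-reflection tracking (position + orientation); the ready mode
    runs at reduced intensity (both levels are assumed values)."""
    return active_level if mode == ACTIVE else ready_level

def next_mode(eyes_detected, tracking_requested):
    """Switching policy (illustrative, not from the patent): go active when
    an application wants gaze data and eyes are in view; otherwise drop to
    the power-saving ready mode."""
    return ACTIVE if (tracking_requested and eyes_detected) else READY
```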
-
Publication number: 20160314349
Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode, and to provide eye-tracking data which include eye position but not eye orientation.
Type: Application
Filed: July 6, 2016
Publication date: October 27, 2016
Applicant: Tobii AB
Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
-
Publication number: 20160307038
Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Application
Filed: April 18, 2016
Publication date: October 20, 2016
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
-
Patent number: 9442566
Abstract: An imaging device adapted to provide eye-tracking data by imaging at least one eye of a viewer, wherein: the imaging device is switchable between at least an active mode and a ready mode; the imaging device is configured to use active eye illumination in the active mode, which enables tracking of a corneal reflection; and the imaging device is configured, in the ready mode, to reduce an illumination intensity from a value the illumination intensity has in the active mode, and to provide eye-tracking data which include eye position but not eye orientation.
Type: Grant
Filed: January 28, 2015
Date of Patent: September 13, 2016
Assignee: TOBII AB
Inventors: Henrik Eskilsson, Mårten Skogö, John Elvesjö, Peter Blixt
-
Publication number: 20160216761
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: March 31, 2016
Publication date: July 28, 2016
Inventors: Andreas Klingström, Mårten Skogö, Richard Hainzl, John Elvesjö
-
Publication number: 20160116980
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: December 31, 2015
Publication date: April 28, 2016
Inventors: Erland George-Svahn, Mårten Skogö
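The precision gain comes from the geometry of the two-step selection: a second gaze point chosen in the enlarged view maps back to display coordinates divided by the zoom factor, so gaze jitter shrinks proportionally. A sketch, assuming the zoom is centered on the first gaze target:

```python
def refine_selection(first_gaze, second_gaze, zoom_factor=4):
    """Map a second gaze point, picked inside a view enlarged zoom_factor
    times around the first gaze target, back to display coordinates.
    Gaze error in the enlarged view is divided by zoom_factor, which is
    the precision gain (centered-zoom geometry is an assumption)."""
    fx, fy = first_gaze
    sx, sy = second_gaze
    return (fx + (sx - fx) / zoom_factor, fy + (sy - fy) / zoom_factor)
```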
-
Publication number: 20160109947
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: December 31, 2015
Publication date: April 21, 2016
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150234459
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: August 20, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150210292
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: July 30, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö