Patents by Inventor Erland George-Svahn
Erland George-Svahn has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20170109513
Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Application
Filed: December 30, 2016
Publication date: April 20, 2017
Applicant: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
-
Patent number: 9619020
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Grant
Filed: March 3, 2014
Date of Patent: April 11, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
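The timing behavior this abstract describes (move the indicator only after the action is sustained, abort on release) can be sketched as a small state machine. This is an illustrative reconstruction, not the patented implementation; the class name, API, and the 300 ms default delay are all assumptions.

```python
class GazeIndicator:
    """Move a visual indicator to the gaze target only after an action
    (e.g. a touchpad touch) has been held for a delay; releasing the
    action inside that window aborts the pending move."""

    def __init__(self, delay_s=0.3):
        self.delay_s = delay_s           # abort window before the move
        self.position = (0, 0)           # current indicator position
        self._action_started = None      # timestamp when the action began

    def update(self, action_active, gaze_target, now):
        if not action_active:
            self._action_started = None  # action released: pending move aborted
            return self.position
        if self._action_started is None:
            self._action_started = now   # action just began: start the timer
        if now - self._action_started >= self.delay_s:
            self.position = gaze_target  # action held long enough: snap to gaze
        return self.position
```

The fine-grained control mentioned at the end of the abstract could be layered on top by continuing to feed gaze samples through `update` while the action is held.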
-
Publication number: 20170090566
Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen-like manner using a combination of gaze-based input and gesture-based user commands. Gaze and gesture input can also serve as a complement or an alternative to touch-screen interactions on a computer device that has a touch-screen. Combined gaze- and gesture-based interaction can thus provide a touch-screen-like environment in computer systems without a traditional touch-screen, or in systems whose touch-screen is positioned so unfavorably, ergonomically, that gesture and gaze are more comfortable for the user than the touch-screen itself.
Type: Application
Filed: December 14, 2016
Publication date: March 30, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 9580081
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Grant
Filed: January 26, 2015
Date of Patent: February 28, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20170046322
Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images comprising selectable annotation input functions that are actuatable by means of control commands displayed on the display is disclosed.
Type: Application
Filed: August 22, 2016
Publication date: February 16, 2017
Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
-
Publication number: 20160307038
Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Application
Filed: April 18, 2016
Publication date: October 20, 2016
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
-
Patent number: 9423940
Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images comprising selectable annotation input functions that are actuatable by means of control commands displayed on the display is disclosed.
Type: Grant
Filed: July 26, 2013
Date of Patent: August 23, 2016
Assignee: FLIR Systems AB
Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
-
Publication number: 20160116980
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: December 31, 2015
Publication date: April 28, 2016
Inventors: Erland George-Svahn, Mårten Skogö
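The two-step selection above boils down to mapping a gaze point inside the magnified view back to the underlying display coordinate it designates. A minimal sketch, with the function name and zoom-about-center model assumed for illustration:

```python
def magnified_target(region_center, gaze_in_zoom, zoom):
    """Map a gaze point inside a region magnified on screen about
    `region_center` (by factor `zoom`) back to the true display
    coordinate it selects."""
    dx = gaze_in_zoom[0] - region_center[0]   # on-screen offset from center
    dy = gaze_in_zoom[1] - region_center[1]
    # Undo the magnification to recover the underlying location.
    return (region_center[0] + dx / zoom, region_center[1] + dy / zoom)
```

Because the on-screen offset is divided by the zoom factor, a given eye-tracking error translates into a proportionally smaller error in the selected location, which is the precision gain the abstract describes.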
-
Publication number: 20160109947
Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen-like manner using a combination of gaze-based input and gesture-based user commands. Gaze and gesture input can also serve as a complement or an alternative to touch-screen interactions on a computer device that has a touch-screen. Combined gaze- and gesture-based interaction can thus provide a touch-screen-like environment in computer systems without a traditional touch-screen, or in systems whose touch-screen is positioned so unfavorably, ergonomically, that gesture and gaze are more comfortable for the user than the touch-screen itself.
Type: Application
Filed: December 31, 2015
Publication date: April 21, 2016
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20160109946
Abstract: According to the invention, a method for dismissing information from a display device based on a gaze input is disclosed. The method may include determining that an object has been displayed on a display device. The method may also include determining an area on the display device associated with the object. The method may further include determining a gaze location of a user on the display device. The method may additionally include causing the object to not be displayed on the display device, based at least in part on the gaze location being located within the area for at least a first predetermined length of time.
Type: Application
Filed: October 21, 2015
Publication date: April 21, 2016
Applicant: Tobii AB
Inventor: Erland George-Svahn
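The dismissal rule above is a dwell test: the object disappears once the gaze has stayed continuously inside its area for the predetermined time. A hedged sketch of that logic (class name, area representation, and timing API are hypothetical, not from the patent):

```python
class DwellDismisser:
    """Hide an object once the user's gaze has remained inside its
    on-screen area for a minimum continuous dwell time."""

    def __init__(self, area, dwell_s):
        self.area = area            # (x, y, width, height) of the object
        self.dwell_s = dwell_s      # required continuous dwell time
        self.visible = True
        self._entered = None        # time the gaze entered the area

    def on_gaze(self, x, y, now):
        ax, ay, w, h = self.area
        inside = ax <= x <= ax + w and ay <= y <= ay + h
        if not inside:
            self._entered = None    # gaze left: restart the dwell timer
        elif self._entered is None:
            self._entered = now     # gaze just entered the area
        elif now - self._entered >= self.dwell_s:
            self.visible = False    # dwell satisfied: dismiss the object
        return self.visible
```

Resetting the timer whenever the gaze leaves the area keeps brief glances from dismissing the object; only a sustained look does.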
-
Publication number: 20150234459
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: August 20, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150210292
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: July 30, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150143293
Abstract: According to the invention, a method for selecting a program from a list of programs is disclosed. The method may include receiving an indication that a first non-gaze input has been received. The method may also include causing a list of programs to be shown on a display. The method may further include receiving information identifying a location of the gaze point of the user on the display. The method may additionally include receiving an indication that a second non-gaze input has been received. The method may moreover include, based at least in part on the location of the gaze point, causing a program from the list to be shown on the display upon receipt of the second non-gaze input.
Type: Application
Filed: November 18, 2014
Publication date: May 21, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
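The gaze-dependent step of that flow, choosing which list entry to open when the second non-gaze input arrives, can be sketched as a simple hit test. All names and the fixed-row-height layout are assumptions for illustration:

```python
def select_program(programs, item_height, gaze_y):
    """On the second non-gaze input, pick the program whose row in the
    displayed list contains the gaze point's vertical coordinate
    (rows are assumed to be stacked from y = 0 with uniform height)."""
    index = int(gaze_y // item_height)        # row under the gaze point
    if 0 <= index < len(programs):
        return programs[index]
    return None                               # gaze outside the list
```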
-
Publication number: 20150138244
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing the display to scroll content on the display.
Type: Application
Filed: November 18, 2014
Publication date: May 21, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
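One common way to realize gaze-driven scrolling of this kind is to scroll only when the gaze point sits in a margin near the top or bottom of the viewport, at a speed proportional to how deep into the margin it is. The function below is a hedged sketch of that idea; the margin model and all parameter names are assumptions, not taken from the patent filing:

```python
def scroll_velocity(gaze_y, viewport_height, margin=0.2, max_speed=600.0):
    """Return a scroll velocity (pixels/second; negative = scroll up)
    when the gaze point is in the top or bottom margin of the viewport,
    scaled by how far into the margin the gaze point sits."""
    top = viewport_height * margin
    bottom = viewport_height * (1.0 - margin)
    if gaze_y < top:
        return -max_speed * (top - gaze_y) / top                       # scroll up
    if gaze_y > bottom:
        return max_speed * (gaze_y - bottom) / (viewport_height - bottom)  # scroll down
    return 0.0                                                         # center: no scrolling
```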
-
Publication number: 20150130740
Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to: acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated gesture-based control command based on the user removing contact of a finger with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point, based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated gesture-based control command, wherein the user action is executed at said determined gaze point area.
Type: Application
Filed: January 20, 2015
Publication date: May 14, 2015
Applicant: Tobii Technology AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Ennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
-
Publication number: 20140247232
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
Type: Application
Filed: March 3, 2014
Publication date: September 4, 2014
Applicant: Tobii Technology AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
-
Publication number: 20140247215
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2014
Publication date: September 4, 2014
Applicant: Tobii Technology AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
-
Publication number: 20140013201
Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images comprising selectable annotation input functions that are actuatable by means of control commands displayed on the display is disclosed.
Type: Application
Filed: July 26, 2013
Publication date: January 9, 2014
Applicant: FLIR Systems AB
Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
-
Patent number: 8599262
Abstract: Embodiments of the invention relate to an IR camera for capturing thermal images of an imaged view, the IR camera comprising an IR camera display configured to display the captured thermal images to a user of the IR camera, and a display control unit configured to control the IR camera display to further display a current temperature range of the IR camera. The IR camera is characterized in that the display control unit is configured to determine an indication scale comprising the entire current temperature range of the IR camera, wherein equally sized temperature intervals have different geometric size in the indication scale based upon the actual image content of the captured thermal image, and to control the IR camera display to display the indication scale to a user of the IR camera. Embodiments of the invention further relate to a method for use in displaying captured thermal images of an IR camera and a computer-readable medium encoded with executable instructions for the same.
Type: Grant
Filed: April 20, 2009
Date of Patent: December 3, 2013
Assignee: FLIR Systems AB
Inventors: Erland George-Svahn, Rasmus Mattsson, Mikael Erlandsson, Torsten Sandbäck
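A content-weighted indication scale of the kind described above can be built by splitting the temperature range into equal bins and giving each bin on-screen length proportional to how many pixels fall in it, so heavily populated temperature intervals take up more of the scale. This is a hedged sketch of one plausible scheme, not the patented algorithm; the function name, parameters, and the floor-of-one weighting are all assumptions:

```python
def scale_segment_lengths(pixel_temps, t_min, t_max, n_bins, scale_len):
    """Divide [t_min, t_max] into n_bins equal temperature intervals and
    return each interval's on-screen length, proportional to the number
    of image pixels whose temperature falls in it. A floor of one count
    per bin keeps sparsely populated intervals visible."""
    width = (t_max - t_min) / n_bins
    counts = [1] * n_bins                      # floor so every bin appears
    for t in pixel_temps:
        i = min(int((t - t_min) / width), n_bins - 1)  # clamp t_max into last bin
        counts[i] += 1
    total = sum(counts)
    return [scale_len * c / total for c in counts]
```

The segment lengths always sum to `scale_len`, so the scale still spans the camera's entire current temperature range; only the geometric size of each equal temperature interval varies with image content.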
-
Publication number: 20130307992
Abstract: This disclosure relates in general to the field of visualizing, imaging and animating groups of images and annotations in IR-cameras. Various techniques are provided for a method of managing IR image data on a group level. For example, the method may comprise: capturing an IR image comprising temperature data representing the temperature variance of an object scene; storing the IR image as a first data item in a predetermined data structure; storing a second data item in said predetermined data structure; and associating in said data structure the first and the second data item such that an operation is enabled on the first and the second associated data items jointly as a group of data items.
Type: Application
Filed: July 26, 2013
Publication date: November 21, 2013
Applicant: FLIR Systems AB
Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck