Patents by Inventor Erland George-Svahn

Erland George-Svahn has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170109513
    Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Application
    Filed: December 30, 2016
    Publication date: April 20, 2017
    Applicant: Tobii AB
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
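The mechanism this abstract outlines (capture an image, derive eye information, compare it to an enrolled template) can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the `EyeInfo` type, the feature-extraction stand-in, and the match-fraction threshold are hypothetical, not Tobii's implementation.

```python
from dataclasses import dataclass

@dataclass
class EyeInfo:
    features: tuple  # simplified stand-in for an iris/eye feature vector

def determine_eye_info(image: list) -> EyeInfo:
    # Stand-in for real eye-feature extraction from a captured image.
    return EyeInfo(features=tuple(image))

def authenticate(eye_info: EyeInfo, enrolled: EyeInfo, threshold: float = 0.9) -> bool:
    # Fraction of matching feature components; a real system would use
    # a proper distance metric on iris codes.
    matches = sum(a == b for a, b in zip(eye_info.features, enrolled.features))
    return matches / max(len(enrolled.features), 1) >= threshold

enrolled = determine_eye_info([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])
probe_ok = determine_eye_info([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])
probe_bad = determine_eye_info([0, 1, 0, 0, 1, 0, 1, 0, 0, 0])
```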
  • Patent number: 9619020
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, it can be controlled with additional precision as the user moves their gaze while continuing the action (e.g., continued holding of the touchpad).
    Type: Grant
    Filed: March 3, 2014
    Date of Patent: April 11, 2017
    Assignee: Tobii AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
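The delayed warp with an abort window described above can be sketched as a single decision function. The `WARP_DELAY` value and the function signature are illustrative assumptions, not the patented design.

```python
WARP_DELAY = 0.3  # seconds; illustrative value, not from the patent

def indicator_position(cursor, gaze_target, action_held, hold_duration):
    """Return where the visual indicator should be drawn this frame."""
    if action_held and hold_duration >= WARP_DELAY:
        return gaze_target  # hold lasted long enough: warp to the gaze target
    return cursor           # hold too short (or released/aborted): stay put
```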
  • Publication number: 20170090566
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. This combined input can also complement, or substitute for, touch-screen interaction on a computer device that has a touch-screen. Gaze and gesture based interaction can thus provide a touch-screen like environment on computer systems without a traditional touch-screen, or on systems whose touch-screen is placed in an ergonomically unfavorable position, making it more comfortable for the user to interact through gesture and gaze than through the touch-screen.
    Type: Application
    Filed: December 14, 2016
    Publication date: March 30, 2017
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
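The gaze-plus-gesture idea (gaze supplies the location, a gesture supplies the command) can be sketched as follows; the event dictionary and the gesture vocabulary are hypothetical, chosen only to show how the two inputs combine.

```python
def touch_like_event(gaze_point, gesture):
    """Synthesize a touch-screen-like event from gaze + gesture input."""
    mapping = {
        "tap": "click",
        "swipe_up": "scroll_up",
        "swipe_down": "scroll_down",
    }
    if gesture not in mapping:
        return None  # unrecognized gesture: no event
    # The gaze point supplies the *where*; the gesture supplies the *what*.
    return {"action": mapping[gesture], "x": gaze_point[0], "y": gaze_point[1]}
```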
  • Patent number: 9580081
    Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
    Type: Grant
    Filed: January 26, 2015
    Date of Patent: February 28, 2017
    Assignee: Tobii AB
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
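One way to picture the claimed behavior (modifying a vehicle system based on gaze data) is an alert that escalates with time spent looking off the road. The gaze zones and thresholds below are invented for illustration.

```python
def lane_alert_level(gaze_zone, time_off_road):
    """Escalate a lane-departure alert the longer the driver looks away."""
    if gaze_zone == "road":
        return 0  # eyes on the road: no modification
    if time_off_road < 1.0:
        return 1  # brief glance away: gentle reminder
    return 2      # sustained distraction: strong alert
```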
  • Publication number: 20170046322
    Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images is disclosed, comprising selectable annotation input functions that are actuated by control commands shown on the display.
    Type: Application
    Filed: August 22, 2016
    Publication date: February 16, 2017
    Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
  • Publication number: 20160307038
    Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Application
    Filed: April 18, 2016
    Publication date: October 20, 2016
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
  • Patent number: 9423940
    Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images is disclosed, comprising selectable annotation input functions that are actuated by control commands shown on the display.
    Type: Grant
    Filed: July 26, 2013
    Date of Patent: August 23, 2016
    Assignee: FLIR Systems AB
    Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
  • Publication number: 20160116980
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
    Type: Application
    Filed: December 31, 2015
    Publication date: April 28, 2016
    Inventors: Erland George-Svahn, Mårten Skogö
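The two-step selection the abstract describes (a first action enlarges the region around the gaze target, then a second gaze and action refine the pick) can be sketched as follows, with an assumed zoom factor and region geometry.

```python
ZOOM = 4.0  # illustrative magnification for the enlarged portion

def enlarge_region(gaze, size=100):
    """Region (left, top, right, bottom) around the first gaze target."""
    x, y = gaze
    half = size / 2
    return (x - half, y - half, x + half, y + half)

def refine_selection(region, gaze_in_zoom):
    """Map a gaze point in the enlarged view back to display coordinates."""
    left, top, _, _ = region
    zx, zy = gaze_in_zoom
    return (left + zx / ZOOM, top + zy / ZOOM)
```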
  • Publication number: 20160109947
    Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. This combined input can also complement, or substitute for, touch-screen interaction on a computer device that has a touch-screen. Gaze and gesture based interaction can thus provide a touch-screen like environment on computer systems without a traditional touch-screen, or on systems whose touch-screen is placed in an ergonomically unfavorable position, making it more comfortable for the user to interact through gesture and gaze than through the touch-screen.
    Type: Application
    Filed: December 31, 2015
    Publication date: April 21, 2016
    Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
  • Publication number: 20160109946
    Abstract: According to the invention, a method for dismissing information from a display device based on a gaze input is disclosed. The method may include determining that an object has been displayed on a display device. The method may also include determining an area on the display device associated with the object. The method may further include determining a gaze location of a user on the display device. The method may additionally include causing the object to not be displayed on the display device, based at least in part on the gaze location being located within the area for at least a first predetermined length of time.
    Type: Application
    Filed: October 21, 2015
    Publication date: April 21, 2016
    Applicant: Tobii AB
    Inventor: Erland George-Svahn
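Dwell-based dismissal of this kind can be sketched with a simple accumulator: gaze inside the object's area builds up dwell time, gaze outside resets it. The threshold, sample rate, and rectangle hit-test are assumptions for illustration.

```python
DWELL_TIME = 0.8  # seconds; illustrative threshold

def in_area(gaze, area):
    left, top, right, bottom = area
    x, y = gaze
    return left <= x <= right and top <= y <= bottom

def should_dismiss(gaze_samples, area, sample_dt=0.1):
    """gaze_samples: consecutive gaze points taken sample_dt apart."""
    dwell = 0.0
    for gaze in gaze_samples:
        # Accumulate while the gaze stays in the area; reset when it leaves.
        dwell = dwell + sample_dt if in_area(gaze, area) else 0.0
        if dwell >= DWELL_TIME:
            return True  # dwelled long enough: dismiss the object
    return False
```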
  • Publication number: 20150234459
    Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
    Type: Application
    Filed: January 26, 2015
    Publication date: August 20, 2015
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
  • Publication number: 20150210292
    Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
    Type: Application
    Filed: January 26, 2015
    Publication date: July 30, 2015
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
  • Publication number: 20150143293
    Abstract: According to the invention, a method for selecting a program from a list of programs is disclosed. The method may include receiving an indication that a first non-gaze input has been received. The method may also include causing a list of programs to be shown on a display. The method may further include receiving information identifying a location of the gaze point of the user on the display. The method may additionally include receiving an indication that a second non-gaze input has been received. The method may moreover include, based at least in part on the location of the gaze point, causing a program from the list to be shown on the display upon receipt of the second non-gaze input.
    Type: Application
    Filed: November 18, 2014
    Publication date: May 21, 2015
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
  • Publication number: 20150138244
    Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing the display to scroll content on the display.
    Type: Application
    Filed: November 18, 2014
    Publication date: May 21, 2015
    Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
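Gaze-driven scrolling as described can be sketched with edge zones: looking near the top or bottom of the display scrolls the content in that direction. The margins and scroll step below are illustrative assumptions.

```python
EDGE = 0.15  # fraction of screen height treated as a scroll zone

def scroll_step(gaze_y, screen_height, step=20):
    """Return a scroll offset (+down, -up, 0 none) for one frame."""
    if gaze_y > screen_height * (1 - EDGE):
        return step   # looking near the bottom: scroll down
    if gaze_y < screen_height * EDGE:
        return -step  # looking near the top: scroll up
    return 0          # gaze in the middle: no scrolling
```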
  • Publication number: 20150130740
    Abstract: A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to: acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user generated gesture based control command based on the user removing a finger from contact with the touchpad; determine a gaze point area on the information presentation area that includes the user's gaze point, based at least on the gaze data signals; and execute at least one user action manipulating the view presented on the graphical information presentation area, based on the determined gaze point area and the at least one gesture based control command, wherein the user action is executed at the determined gaze point area.
    Type: Application
    Filed: January 20, 2015
    Publication date: May 14, 2015
    Applicant: Tobii Technology AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
  • Publication number: 20140247232
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.
    Type: Application
    Filed: March 3, 2014
    Publication date: September 4, 2014
    Applicant: Tobii Technology AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
  • Publication number: 20140247215
    Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, it can be controlled with additional precision as the user moves their gaze while continuing the action (e.g., continued holding of the touchpad).
    Type: Application
    Filed: March 3, 2014
    Publication date: September 4, 2014
    Applicant: Tobii Technology AB
    Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
  • Publication number: 20140013201
    Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images and to processing of IR images on a computer application program. A system for managing annotations to IR images is disclosed, comprising selectable annotation input functions that are actuated by control commands shown on the display.
    Type: Application
    Filed: July 26, 2013
    Publication date: January 9, 2014
    Applicant: FLIR Systems AB
    Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
  • Patent number: 8599262
    Abstract: Embodiments of the invention relate to an IR camera for capturing thermal images of an imaged view, the IR camera comprising an IR camera display configured to display the captured thermal images to a user of the IR camera, and a display control unit configured to control the IR camera display to further display a current temperature range of the IR camera. The IR camera is characterized in that the display control unit is configured to determine an indication scale comprising the entire current temperature range of the IR camera, wherein equally sized temperature intervals have different geometric sizes in the indication scale based upon the actual image content of the captured thermal image, and to control the IR camera display to display the indication scale to a user of the IR camera. Embodiments of the invention further relate to a method for use in displaying captured thermal images of an IR camera and a computer-readable medium encoded with executable instructions for the same.
    Type: Grant
    Filed: April 20, 2009
    Date of Patent: December 3, 2013
    Assignee: FLIR Systems AB
    Inventors: Erland George-Svahn, Rasmus Mattsson, Mikael Erlandsson, Torsten Sandbäck
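The non-uniform indication scale described here (equal temperature intervals getting unequal screen space depending on image content) can be approximated with a histogram-weighted layout. The binning and pixel-allocation scheme below are assumptions for illustration, not FLIR's method.

```python
def scale_segments(temps, t_min, t_max, n_bins, scale_px=100, min_px=2):
    """Return pixel widths for n_bins equally sized temperature intervals."""
    width = (t_max - t_min) / n_bins
    counts = [0] * n_bins
    for t in temps:
        # Histogram the image's temperature values into equal intervals.
        i = min(int((t - t_min) / width), n_bins - 1)
        counts[i] += 1
    # Give every interval a minimum width, then share the remaining pixels
    # in proportion to image content, stretching busy temperature ranges.
    free = scale_px - min_px * n_bins
    total = sum(counts) or 1
    return [min_px + free * c / total for c in counts]
```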
  • Publication number: 20130307992
    Abstract: This disclosure relates in general to the field of visualizing, imaging and animating groups of images, and annotations in IR-cameras. Various techniques are provided for a method of managing IR image data on a group level. For example, the method may comprise: capturing an IR image comprising temperature data representing the temperature variance of an object scene; storing the IR image as a first data item in a predetermined data structure; storing a second data item in said predetermined data structure; and associating in said data structure the first and the second data item such that an operation is enabled on the first and the second associated data items jointly as a group of data items.
    Type: Application
    Filed: July 26, 2013
    Publication date: November 21, 2013
    Applicant: FLIR Systems AB
    Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck