Patents by Inventor Anders Vennström

Anders Vennström has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180335838
    Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. Touchscreen-like interaction based on gaze input and gesture input can also complement, or serve as an alternative to, touchscreen interaction on a computer device that has a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can thus provide a touchscreen-like environment on computer systems without a traditional touchscreen, on systems whose touchscreen is positioned ergonomically unfavorably for the user, or on systems where it is more comfortable for the user to interact through gesture and gaze than through the touchscreen.
    Type: Application
    Filed: May 21, 2018
    Publication date: November 22, 2018
    Applicant: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
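    The interaction pattern this abstract describes can be pictured with a short sketch: a recognized gesture supplies the command, and the current gaze point supplies the location at which the command is applied, much as a touch would. The gaze source, gesture names, and actions below are illustrative stand-ins, not anything specified by the filing.

      # Dispatch gesture commands at the user's current gaze point,
      # emulating a touch at that screen location (illustrative only).
      from dataclasses import dataclass
      from typing import Callable, Dict, Tuple

      @dataclass
      class Gesture:
          name: str  # e.g. "tap", "swipe_left", "pinch"

      def make_dispatcher(get_gaze_point: Callable[[], Tuple[int, int]],
                          actions: Dict[str, Callable[[int, int], None]]):
          """Return a handler that runs a gesture's action at the gaze point."""
          def on_gesture(gesture: Gesture) -> None:
              x, y = get_gaze_point()           # where the user is looking
              action = actions.get(gesture.name)
              if action is not None:
                  action(x, y)                  # act as if the user had touched (x, y)
          return on_gesture

      # Stubbed wiring: a fixed gaze point and a single "tap" action.
      dispatch = make_dispatcher(lambda: (640, 360),
                                 {"tap": lambda x, y: print(f"click at ({x}, {y})")})
      dispatch(Gesture("tap"))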
  • Publication number: 20180321903
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: July 10, 2018
    Publication date: November 8, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
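    One way to picture the method in this abstract: map the gaze point to an on-screen region and base the audio content on the region being looked at. The region names and the simple rectangle test below are assumptions made for illustration.

      # Pick which on-screen region should drive the audio, based on the gaze point.
      from typing import Dict, Optional, Tuple

      Rect = Tuple[int, int, int, int]  # x, y, width, height

      def select_audio_source(gaze: Tuple[int, int],
                              regions: Dict[str, Rect]) -> Optional[str]:
          """Return the name of the region containing the gaze point, if any."""
          gx, gy = gaze
          for name, (x, y, w, h) in regions.items():
              if x <= gx < x + w and y <= gy < y + h:
                  return name
          return None

      regions = {"video_player": (0, 0, 800, 450), "chat_panel": (800, 0, 480, 450)}
      focused = select_audio_source((900, 200), regions)
      print(f"route full-volume audio from: {focused}")  # -> chat_panel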
  • Patent number: 10055191
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Grant
    Filed: December 29, 2015
    Date of Patent: August 21, 2018
    Assignee: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Patent number: 10025381
    Abstract: A control module for generating gesture-based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; determine at least one user-generated, gesture-based control command based on the user removing a finger from contact with the touchpad; determine a gaze point area on the information presentation area that includes the user's gaze point, based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and the at least one user-generated, gesture-based control command, wherein the user action is executed at the determined gaze point area.
    Type: Grant
    Filed: January 20, 2015
    Date of Patent: July 17, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
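    A minimal sketch of the control flow described above: lifting the finger from the touchpad generates the command, and the action is applied at an area around the current gaze point. The class name, radius, and return shape are illustrative assumptions.

      # Trigger an action at the gaze point area when the finger leaves the touchpad.
      from typing import Optional, Tuple

      class GazeTouchpadController:
          def __init__(self, radius_px: int = 50):
              self.radius_px = radius_px   # size of the gaze point area
              self.finger_down = False

          def on_touchpad(self, contact: bool,
                          gaze: Tuple[int, int]) -> Optional[Tuple[int, int, int]]:
              """Return (x, y, radius) of the area to act on when contact is released."""
              released = self.finger_down and not contact
              self.finger_down = contact
              if released:
                  x, y = gaze
                  return (x, y, self.radius_px)  # e.g. activate the element under the gaze
              return None

      ctrl = GazeTouchpadController()
      ctrl.on_touchpad(contact=True, gaze=(100, 100))            # finger down
      print(ctrl.on_touchpad(contact=False, gaze=(120, 110)))    # lifted -> (120, 110, 50)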
  • Patent number: 10013053
    Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. Touchscreen-like interaction based on gaze input and gesture input can also complement, or serve as an alternative to, touchscreen interaction on a computer device that has a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can thus provide a touchscreen-like environment on computer systems without a traditional touchscreen, on systems whose touchscreen is positioned ergonomically unfavorably for the user, or on systems where it is more comfortable for the user to interact through gesture and gaze than through the touchscreen.
    Type: Grant
    Filed: October 5, 2012
    Date of Patent: July 3, 2018
    Assignee: Tobii AB
    Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö
  • Publication number: 20180157323
    Abstract: Methods, systems, and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Application
    Filed: November 27, 2017
    Publication date: June 7, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Mårten Skogö, John Elvesjö
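    The likelihood-based selection in this abstract can be sketched as scoring each interaction element against the gaze point and choosing the highest-scoring one. The Gaussian distance score used here is an assumed stand-in for whatever likelihood model an implementation would actually use.

      # Rank interaction elements by how likely each is the target of the gaze.
      import math
      from typing import Dict, Tuple

      def identify_element(gaze: Tuple[float, float],
                           elements: Dict[str, Tuple[float, float]],
                           sigma: float = 80.0) -> str:
          """Return the element whose center maximizes a Gaussian likelihood of the gaze point."""
          gx, gy = gaze

          def likelihood(center: Tuple[float, float]) -> float:
              d2 = (center[0] - gx) ** 2 + (center[1] - gy) ** 2
              return math.exp(-d2 / (2 * sigma ** 2))

          return max(elements, key=lambda name: likelihood(elements[name]))

      buttons = {"ok": (200.0, 300.0), "cancel": (400.0, 300.0)}
      print(identify_element((230.0, 310.0), buttons))  # -> ok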
  • Publication number: 20180129469
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: August 21, 2017
    Publication date: May 10, 2018
    Applicant: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Patent number: 9829973
    Abstract: Methods, systems, and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Grant
    Filed: December 23, 2014
    Date of Patent: November 28, 2017
    Assignee: Tobii AB
    Inventors: Anders Vennström, Mårten Skogö, John Elvesjö
  • Patent number: 9740452
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Grant
    Filed: September 21, 2015
    Date of Patent: August 22, 2017
    Assignee: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Publication number: 20170109513
    Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Application
    Filed: December 30, 2016
    Publication date: April 20, 2017
    Applicant: Tobii AB
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
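    At its simplest, the flow in this abstract is: derive a feature representation of the user's eye from a captured image, then authenticate by comparing it against an enrolled template. The feature vectors, similarity measure, and threshold below are placeholders; the filing does not specify them.

      # Accept the user if eye-derived features match the enrolled template closely enough.
      from typing import Sequence

      def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
          dot = sum(x * y for x, y in zip(a, b))
          na = sum(x * x for x in a) ** 0.5
          nb = sum(y * y for y in b) ** 0.5
          return dot / (na * nb) if na and nb else 0.0

      def authenticate(eye_features: Sequence[float],
                       enrolled_template: Sequence[float],
                       threshold: float = 0.95) -> bool:
          return cosine_similarity(eye_features, enrolled_template) >= threshold

      print(authenticate([0.1, 0.9, 0.4], [0.12, 0.88, 0.41]))  # -> True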
  • Publication number: 20170090566
    Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. Touchscreen-like interaction based on gaze input and gesture input can also complement, or serve as an alternative to, touchscreen interaction on a computer device that has a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can thus provide a touchscreen-like environment on computer systems without a traditional touchscreen, on systems whose touchscreen is positioned ergonomically unfavorably for the user, or on systems where it is more comfortable for the user to interact through gesture and gaze than through the touchscreen.
    Type: Application
    Filed: December 14, 2016
    Publication date: March 30, 2017
    Applicant: Tobii AB
    Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
  • Publication number: 20160357255
    Abstract: A method for determining whether a user's gaze is directed toward a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, with each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; and using the bitmap to determine whether the detected gaze is directed toward the zone of interest.
    Type: Application
    Filed: June 2, 2016
    Publication date: December 8, 2016
    Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennström, Andreas Edling
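    The bitmap lookup this abstract describes amounts to rasterizing the projected zone of interest into a per-pixel property map and then reading the pixel under the gaze direction. The axis-aligned rectangle and string property below are simplifications for illustration.

      # Rasterize a zone of interest into a property bitmap, then look up the gaze pixel.
      from typing import List, Optional, Tuple

      def build_zone_bitmap(width: int, height: int,
                            zone_rect: Tuple[int, int, int, int],
                            zone_property: str) -> List[List[Optional[str]]]:
          """Each pixel covered by the projected zone stores the zone's property."""
          x0, y0, w, h = zone_rect
          bitmap: List[List[Optional[str]]] = [[None] * width for _ in range(height)]
          for y in range(y0, min(y0 + h, height)):
              for x in range(x0, min(x0 + w, width)):
                  bitmap[y][x] = zone_property
          return bitmap

      def gaze_hits_zone(bitmap: List[List[Optional[str]]],
                         gaze_pixel: Tuple[int, int]) -> Optional[str]:
          """Return the stored property if the gaze lands on the zone, else None."""
          x, y = gaze_pixel
          if 0 <= y < len(bitmap) and 0 <= x < len(bitmap[0]):
              return bitmap[y][x]
          return None

      bmp = build_zone_bitmap(320, 200, (40, 30, 100, 60), "treasure_chest")
      print(gaze_hits_zone(bmp, (75, 50)))  # -> treasure_chest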
  • Publication number: 20160307038
    Abstract: According to the invention, a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
    Type: Application
    Filed: April 18, 2016
    Publication date: October 20, 2016
    Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
  • Publication number: 20160132289
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: December 29, 2015
    Publication date: May 12, 2016
    Inventors: Anders Vennström, Fredrik Lindh
  • Publication number: 20160109947
    Abstract: A method and system for assisting a user interacting with a graphical user interface combines gaze-based input with gesture-based user commands. A user of a computer system without a traditional touchscreen can interact with graphical user interfaces in a touchscreen-like manner using a combination of gaze-based input and gesture-based user commands. Touchscreen-like interaction based on gaze input and gesture input can also complement, or serve as an alternative to, touchscreen interaction on a computer device that has a touchscreen. Combined gaze- and gesture-based interaction with graphical user interfaces can thus provide a touchscreen-like environment on computer systems without a traditional touchscreen, on systems whose touchscreen is positioned ergonomically unfavorably for the user, or on systems where it is more comfortable for the user to interact through gesture and gaze than through the touchscreen.
    Type: Application
    Filed: December 31, 2015
    Publication date: April 21, 2016
    Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
  • Publication number: 20160105757
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: September 21, 2015
    Publication date: April 14, 2016
    Inventors: Anders Vennström, Fredrik Lindh
  • Patent number: 9143880
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: September 22, 2015
    Assignee: Tobii AB
    Inventors: Anders Vennström, Fredrik Lindh
  • Publication number: 20150177833
    Abstract: Methods, systems, and computer program products are provided for identifying an interaction element from one or more interaction elements present in a user interface, comprising: receiving gaze information from an eye tracking system; determining the likelihood that an interaction element is the subject of the gaze information from the eye tracking system; and identifying an interaction element from the one or more interaction elements based on said likelihood determination.
    Type: Application
    Filed: December 23, 2014
    Publication date: June 25, 2015
    Inventors: Anders Vennström, Mårten Skogö, John Elvesjö
  • Publication number: 20150058812
    Abstract: According to the invention, a method for changing the behavior of computer program elements is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user. The method may also include causing, with a computer system, an interactive event controlled by the computer system to alter its behavior based at least in part on the gaze point of the user.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 26, 2015
    Inventors: Fredrik Lindh, Anders Vennström, Anders Olsson
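    The behavior change this abstract describes can be pictured as a simple conditional: an interactive element behaves differently when the gaze point is near it. The character, radius, and behaviors below are invented for illustration.

      # Alter an element's behavior depending on whether the user is looking at it.
      from typing import Tuple

      def npc_behavior(gaze: Tuple[float, float],
                       npc_pos: Tuple[float, float],
                       attention_radius: float = 100.0) -> str:
          """Return a different behavior when the gaze point is near the character."""
          dx, dy = gaze[0] - npc_pos[0], gaze[1] - npc_pos[1]
          looked_at = (dx * dx + dy * dy) ** 0.5 <= attention_radius
          return "wave_at_player" if looked_at else "idle"

      print(npc_behavior((510.0, 300.0), (500.0, 310.0)))  # -> wave_at_player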
  • Publication number: 20150055808
    Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
    Type: Application
    Filed: August 25, 2014
    Publication date: February 26, 2015
    Inventors: Anders Vennström, Fredrik Lindh