Patents by Inventor Erland George-Svahn
Erland George-Svahn has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10417323
Abstract: The present invention relates in general to the field of applications and functions of an IR-camera device operated by a user in connection with the recording of IR images, and to the processing of IR images in a computer application program. A system for managing annotations to IR images comprising selectable annotation input functions that are actuatable by means of control commands displayed on the display is disclosed.
Type: Grant
Filed: August 22, 2016
Date of Patent: September 17, 2019
Assignee: FLIR SYSTEMS AB
Inventors: Mikael Erlandsson, Erland George-Svahn, Torsten Sandbäck
-
Publication number: 20190272030
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: May 1, 2019
Publication date: September 5, 2019
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10394320
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged ergonomically unfavorably for the user, or where a touchscreen is arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Grant
Filed: December 14, 2016
Date of Patent: August 27, 2019
Assignee: Tobii AB
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10324527
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Grant
Filed: November 10, 2017
Date of Patent: June 18, 2019
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
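The core loop this abstract describes — receive gaze data, change the display accordingly — can be sketched as follows. The `GazeSample` type, the state names, and the dwell threshold are illustrative assumptions, not details taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    on_display: bool   # True while the driver's gaze falls on the display
    dwell_ms: int      # time the gaze has rested in its current region

def next_display_state(current: str, gaze: GazeSample,
                       dwell_threshold_ms: int = 300) -> str:
    """Pick the display state implied by the latest gaze sample."""
    if gaze.on_display and gaze.dwell_ms >= dwell_threshold_ms:
        return "detailed"   # sustained attention: show full information
    if gaze.on_display:
        return current      # brief glance: keep the current view
    return "minimal"        # eyes off the display: reduce shown information
```

A real system would run this per gaze sample and debounce transitions so the display does not flicker between states.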
-
Publication number: 20190138093
Abstract: Techniques for updating a user interface on a computer system are described. In an example, the computer system detects a gaze position on the user interface. The user interface includes a first open window and a second open window. The first open window is in a foreground of the user interface. The second open window is in a background of the user interface. The computer system determines a region of the user interface around the gaze position. The computer system also displays a viewer based on the region. The viewer is displayed in the foreground of the user interface and shows a screenshot of at least a portion of the second open window.
Type: Application
Filed: November 9, 2018
Publication date: May 9, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Regimantas Vegele, Andrew Ratcliff, Guido Hermans, Mattias Hanqvist, Simon Hugosson, Dmitrios Koufos, Morgan Viktorsson, Jonas Alexanderson, Siavash Moghaddam, Jimmy Carlsten, Martin Chrzan
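The "region of the user interface around the gaze position" could be computed as a simple rectangle, as in this sketch; the half-width and half-height defaults and the function name are assumptions for illustration only:

```python
def viewer_region(gaze_x: int, gaze_y: int,
                  half_width: int = 150, half_height: int = 100) -> tuple:
    """Rectangle (left, top, right, bottom) centered on the gaze position.

    The foreground viewer would crop the background window's screenshot
    to this region and show it without raising that window.
    """
    return (gaze_x - half_width, gaze_y - half_height,
            gaze_x + half_width, gaze_y + half_height)
```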
-
Publication number: 20190138090
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: July 30, 2018
Publication date: May 9, 2019
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20190138738
Abstract: Techniques for changing the presentation of information on a user interface based on presence are described. In an example, a computer system determines, based on an image sensor associated with the system, a first presence of a first user relative to a computing device. The computer system also determines an identifier of the first user. The identifier is associated with operating the computing device. The operating comprises a presentation of the user interface by the computing device. The computer system also determines, based on the image sensor, a second presence of a second person relative to the computing device. The computer system causes an update to the user interface based on the second presence.
Type: Application
Filed: November 9, 2018
Publication date: May 9, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Regimantas Vegele, Andrew Ratcliff, Guido Hermans, Mattias Hanqvist, Simon Hugosson, Dmitrios Koufos, Morgan Viktorsson, Jonas Alexanderson, Siavash Moghaddam, Jimmy Carlsten, Martin Chrzan
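One plausible "update to the user interface based on the second presence" is a privacy mask, sketched below. The state names and the decision rules are assumptions; the abstract does not specify what the update is:

```python
def ui_state(user_present: bool, user_identified: bool,
             onlooker_present: bool) -> str:
    """Decide how the interface presents itself given who is in front of it."""
    if not user_present:
        return "locked"     # the operating user has left
    if not user_identified:
        return "locked"     # presence without a recognized identifier
    if onlooker_present:
        return "masked"     # second person detected: hide sensitive content
    return "normal"
```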
-
Publication number: 20190138740
Abstract: Techniques for data sharing between two computing devices are described. In an example, a computer system determines a first presence of a first user relative to a first computing device. The computer system also determines a first identifier of the first user. The first identifier is associated with operating the first computing device. The operating comprises sharing data with a second computing device. The computer system also determines a second presence of a second user relative to the second computing device. The computer system also determines a second identifier of the second user. The second identifier is associated with operating the second computing device. The computer system causes the data to be shared with the second computing device based on the first presence, the first identifier, the second presence, and the second identifier.
Type: Application
Filed: November 9, 2018
Publication date: May 9, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Regimantas Vegele, Andrew Ratcliff, Guido Hermans, Mattias Hanqvist, Simon Hugosson, Dmitrios Koufos, Morgan Viktorsson, Jonas Alexanderson, Siavash Moghaddam, Jimmy Carlsten, Martin Chrzan
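The sharing decision — gated on both presences and both identifiers — can be sketched as a single predicate. The `allowed_pairs` permission set and the function name are illustrative assumptions, not part of the abstract:

```python
def may_share(first_present: bool, first_id: str,
              second_present: bool, second_id: str,
              allowed_pairs: set) -> bool:
    """Allow sharing only when both users are present at their devices
    and the (sender, receiver) identity pair is permitted."""
    return (first_present and second_present
            and (first_id, second_id) in allowed_pairs)
```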
-
Publication number: 20190087000
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding whether a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
Type: Application
Filed: September 19, 2018
Publication date: March 21, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
-
Publication number: 20190086999
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
Type: Application
Filed: September 19, 2018
Publication date: March 21, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
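The gaze-gated transmission step reduces to a small routing rule, sketched here; the outbox abstraction and the function name are assumptions for illustration:

```python
def route_message(gaze_on_representation: bool, message: str,
                  outbox: list) -> bool:
    """Queue communication data for transmission only while the sender's
    gaze is directed at the recipient's on-screen representation."""
    if gaze_on_representation:
        outbox.append(message)
        return True
    return False
```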
-
Patent number: 10192109
Abstract: According to the invention a system for authenticating a user of a device is disclosed. The system may include a first image sensor, a determination unit, and an authentication unit. The first image sensor may be for capturing at least one image of at least part of a user. The determination unit may be for determining information relating to the user's eye based at least in part on at least one image captured by the first image sensor. The authentication unit may be for authenticating the user using the information relating to the user's eye.
Type: Grant
Filed: April 18, 2016
Date of Patent: January 29, 2019
Assignee: Tobii AB
Inventors: Mårten Skogö, Richard Hainzl, Henrik Jönsson, Anders Vennström, Erland George-Svahn, John Elvesjö
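The abstract leaves the matching step open. One common technique for eye-based authentication (not necessarily the patented one) is comparing binarized iris codes by normalized Hamming distance; the bit-string encoding, the 0.32 threshold, and both function names below are illustrative assumptions:

```python
def hamming_distance(a: str, b: str) -> float:
    """Fraction of differing bits between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def authenticate(candidate_code: str, enrolled_code: str,
                 threshold: float = 0.32) -> bool:
    """Accept when the candidate's eye code is close enough to the enrolled one."""
    return hamming_distance(candidate_code, enrolled_code) < threshold
```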
-
Publication number: 20180364802
Abstract: A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; and determine at least one user generated gesture based control command based on a user removing contact of a finger of the user with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and at least one user generated gesture based control command, wherein the user action is executed at said determined gaze point area.
Type: Application
Filed: June 14, 2018
Publication date: December 20, 2018
Applicant: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
-
Publication number: 20180224933
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: November 10, 2017
Publication date: August 9, 2018
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10035518
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Grant
Filed: February 23, 2017
Date of Patent: July 31, 2018
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 10025381
Abstract: A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; and determine at least one user generated gesture based control command based on a user removing contact of a finger of the user with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and at least one user generated gesture based control command, wherein the user action is executed at said determined gaze point area.
Type: Grant
Filed: January 20, 2015
Date of Patent: July 17, 2018
Assignee: Tobii AB
Inventors: Markus Cederlund, Robert Gavelin, Anders Vennström, Anders Kaplan, Anders Olsson, Mårten Skogö, Erland George-Svahn
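The key interaction in this abstract — finger lift on the touchpad triggers the action, while the gaze point decides where it lands — can be sketched as a small controller. The class and method names are assumptions for illustration:

```python
class GazeTouchController:
    """Fire the user action at the gaze point when the finger leaves the touchpad."""

    def __init__(self):
        self.touching = False
        self.gaze_point = (0, 0)
        self.executed_at = None   # where the last action landed, if any

    def on_touch_down(self):
        self.touching = True

    def on_gaze(self, x: int, y: int):
        self.gaze_point = (x, y)  # latest gaze data signal

    def on_touch_up(self):
        # Removing the finger is the triggering gesture; the action is
        # executed at the gaze point area, not at the touchpad position.
        if self.touching:
            self.executed_at = self.gaze_point
            self.touching = False
```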
-
Patent number: 9817474
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Grant
Filed: January 26, 2015
Date of Patent: November 14, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Patent number: 9797777
Abstract: A method for marking and analyzing at least one object in an IR image, for an embodiment, comprises receiving an image of an object scene comprising at least one object, receiving a first input control signal indicating pixel coordinates of a first selected object scene portion, locking a first marker of a camera on a first object region corresponding to said pixel coordinates in said object scene in response to said first input control signal, wherein said input control signal is generated by a user activating an input means by a single action. The invention for various embodiments also relates to an IR camera, a computer program product and an image processing system comprising such an IR camera.
Type: Grant
Filed: March 30, 2012
Date of Patent: October 24, 2017
Assignee: FLIR Systems AB
Inventors: Erland George-Svahn, Torbjörn Johansson
-
Publication number: 20170291613
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: February 23, 2017
Publication date: October 12, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20170235360
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen, in computer systems having a touchscreen arranged ergonomically unfavorably for the user, or where a touchscreen is arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: February 27, 2017
Publication date: August 17, 2017
Applicant: Tobii AB
Inventor: Erland George-Svahn
-
Publication number: 20170177079
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to "abort" movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2017
Publication date: June 22, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
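The delayed, abortable indicator move described here reduces to a small decision rule, sketched below. The 200 ms delay and the function signature are assumptions; the patent only says "a predetermined period of time":

```python
def indicator_target(action_held_ms: int, action_released_early: bool,
                     current_pos: tuple, gaze_target: tuple,
                     delay_ms: int = 200) -> tuple:
    """Where the visual indicator should be after an input action.

    The indicator jumps to the gaze target only once the action (e.g. a
    touchpad touch) has been held past a short delay; releasing before the
    delay elapses aborts the move and leaves the indicator in place.
    """
    if action_released_early or action_held_ms < delay_ms:
        return current_pos
    return gaze_target
```

After the jump, the abstract's fine-positioning phase would keep updating the returned position from fresh gaze samples for as long as the action is held.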