Patents by Inventor Rebecka Lannsjö
Rebecka Lannsjö has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20190086999
Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines whether the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
Type: Application
Filed: September 19, 2018
Publication date: March 21, 2019
Applicant: Tobii AB
Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
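The abstract above gates message delivery on whether the sender's gaze falls on the recipient's on-screen representation. A minimal sketch of that idea (not taken from the patent itself; the `Rect` regions and `route_message` helper are hypothetical names):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Screen region occupied by a user's representation (e.g. a video tile)."""
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def route_message(gaze_xy, tiles, message):
    """Return (recipient, message) for the participant whose on-screen tile
    the sender is looking at, or None if the gaze is on no representation."""
    gx, gy = gaze_xy
    for user, tile in tiles.items():
        if tile.contains(gx, gy):
            return user, message
    return None
```

In the claimed method the communication data is transmitted only when this check succeeds; otherwise it is held back.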
-
Publication number: 20180224933
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: November 10, 2017
Publication date: August 9, 2018
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
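One way the display change described above could work is a dwell-based mode switch: the display stays decluttered while the driver watches the road and reveals detail only after a sustained glance. A sketch under that assumption (the thresholds and mode names are illustrative, not from the patent):

```python
def update_display(gaze_on_display: bool, dwell_s: float,
                   detail_threshold_s: float = 0.5) -> str:
    """Pick a display mode from gaze data: show full detail only after the
    driver has dwelt on the display long enough; otherwise keep it minimal."""
    if not gaze_on_display:
        return "minimal"      # driver watching the road: declutter
    if dwell_s >= detail_threshold_s:
        return "detailed"     # sustained glance: reveal full information
    return "minimal"          # brief glance: avoid flicker between modes
```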
-
Patent number: 10035518
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Grant
Filed: February 23, 2017
Date of Patent: July 31, 2018
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
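The abstract covers modifying any vehicle system from gaze data. As a concrete, hypothetical instance (these specific systems and thresholds are my illustration, not claims from the patent): steering adaptive headlights toward the gaze direction and raising collision-warning sensitivity when the eyes leave the road:

```python
def adjust_systems(gaze_angle_deg: float, eyes_off_road_s: float) -> dict:
    """Map driver gaze to example vehicle-system adjustments: aim adaptive
    headlights toward the gaze direction (clamped to the hardware range)
    and arm the collision warning sooner when the eyes have been off the
    road for too long."""
    return {
        "headlight_angle_deg": max(-15.0, min(15.0, gaze_angle_deg)),
        "collision_warning_sensitivity":
            "high" if eyes_off_road_s > 2.0 else "normal",
    }
```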
-
Patent number: 9817474
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Grant
Filed: January 26, 2015
Date of Patent: November 14, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20170291613
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: February 23, 2017
Publication date: October 12, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20170177079
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2017
Publication date: June 22, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
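The core of the abstract above is the abort window: the indicator jumps to the gaze target only after the action has been held for a predetermined time, so releasing earlier cancels the move. A minimal sketch of that timing rule (function name and the 0.2 s default are hypothetical):

```python
def indicator_position(cursor_xy, gaze_xy, touch_duration_s,
                       move_delay_s=0.2):
    """Return where the visual indicator should be drawn. The indicator
    jumps to the gaze target only after the touch has been held for
    move_delay_s; releasing the touch earlier 'aborts' the move and the
    indicator stays at its current cursor position."""
    if touch_duration_s >= move_delay_s:
        return gaze_xy
    return cursor_xy
```

After the jump, per the abstract, continued holding lets gaze fine-tune the indicator, which this sketch models by returning the latest gaze point.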
-
Patent number: 9619020
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. A delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves the gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Grant
Filed: March 3, 2014
Date of Patent: April 11, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
-
Publication number: 20170090566
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: December 14, 2016
Publication date: March 30, 2017
Applicant: Tobii AB
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
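The combination described above splits a touch-screen event into two channels: gaze supplies the *where* and a gesture supplies the *what*. A toy sketch of that pairing (the gesture-to-action table is an assumption for illustration, not from the patent):

```python
def touch_like_event(gaze_xy, gesture):
    """Combine a gaze point (target location) with a gesture (command) into
    the kind of event a touch-screen would produce at that location.
    Returns None for an unrecognized gesture."""
    actions = {"tap": "click", "pinch": "zoom", "swipe": "scroll"}
    if gesture not in actions:
        return None
    return {"action": actions[gesture], "at": gaze_xy}
```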
-
Publication number: 20170083088
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change.
Type: Application
Filed: December 2, 2016
Publication date: March 23, 2017
Applicant: Tobii AB
Inventors: Rebecka Lannsjö, Henrik Björk
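A common way to realize the virtual-camera change described above is edge panning: when the gaze point nears a screen border, the camera rotates toward it. This sketch assumes that variant (the edge fraction and step size are illustrative, not from the patent):

```python
def pan_camera(gaze_x, screen_w, yaw_deg, edge_frac=0.15, step_deg=2.0):
    """Rotate a virtual camera's yaw when the gaze point nears a horizontal
    screen edge, so looking toward a border pans the scene that way."""
    if gaze_x < screen_w * edge_frac:
        return yaw_deg - step_deg      # gaze near left edge: pan left
    if gaze_x > screen_w * (1 - edge_frac):
        return yaw_deg + step_deg      # gaze near right edge: pan right
    return yaw_deg                     # gaze in the centre: hold steady
```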
-
Patent number: 9580081
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Grant
Filed: January 26, 2015
Date of Patent: February 28, 2017
Assignee: Tobii AB
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20160109947
Abstract: A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands. A solution for touch-screen like interaction uses gaze input and gesture based input as a complement or an alternative to touch-screen interactions with a computer device having a touch-screen. Combined gaze and gesture based interaction with graphical user interfaces can be used to achieve a touchscreen like environment in computer systems without a traditional touchscreen or in computer systems having a touchscreen arranged ergonomically unfavorable for the user or a touchscreen arranged such that it is more comfortable for the user to use gesture and gaze for the interaction than the touchscreen.
Type: Application
Filed: December 31, 2015
Publication date: April 21, 2016
Inventors: Erland George-Svahn, Dzenan Dzemidzic, Anders Vennström, Ida Nilsson, Anders Olsson, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150234459
Abstract: According to the invention, a method for changing information on a display in a vehicle based on a gaze direction of a driver is disclosed. The method may include displaying information on the display in the vehicle. The method may also include receiving gaze data indicative of the gaze direction of a user. The method may further include changing the display based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: August 20, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150210292
Abstract: According to the invention, a method for modifying operation of at least one system of a vehicle based at least in part on a gaze direction of a driver is disclosed. The method may include receiving gaze data indicative of the gaze direction of a driver of a vehicle. The method may also include modifying operation of at least one system of the vehicle based at least in part on the gaze data.
Type: Application
Filed: January 26, 2015
Publication date: July 30, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Mårten Skogö
-
Publication number: 20150143293
Abstract: According to the invention, a method for selecting a program from a list of programs is disclosed. The method may include receiving an indication that a first non-gaze input has been received. The method may also include causing a list of programs to be shown on a display. The method may further include receiving information identifying a location of the gaze point of the user on the display. The method may additionally include receiving an indication that a second non-gaze input has been received. The method may moreover include, based at least in part on the location of the gaze point, causing a program from the list to be shown on the display upon receipt of the second non-gaze input.
Type: Application
Filed: November 18, 2014
Publication date: May 21, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
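The selection sequence in the abstract above (first non-gaze input shows the list, gaze identifies a row, second non-gaze input opens it) can be sketched as a small event loop. The event encoding and row layout here are hypothetical, assumed only for illustration:

```python
def select_program(programs, row_height, events):
    """Walk a stream of (kind, value) events: a first 'press' shows the
    list, 'gaze' samples track which row the gaze point (a y coordinate)
    is over, and a second 'press' selects that row's program."""
    visible, hovered = False, None
    for kind, value in events:
        if kind == "press" and not visible:
            visible = True                       # first input: show the list
        elif kind == "gaze" and visible:
            row = int(value // row_height)       # row under the gaze point
            hovered = programs[row] if 0 <= row < len(programs) else None
        elif kind == "press" and visible:
            return hovered                       # second input: open it
    return None
```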
-
Publication number: 20150138244
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing the display to scroll content on the display.
Type: Application
Filed: November 18, 2014
Publication date: May 21, 2015
Inventors: Erland George-Svahn, Rebecka Lannsjö, Aron Yu
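Gaze-driven scrolling of the kind described above is often implemented with edge zones: content scrolls while the gaze rests near the top or bottom of the display. A sketch under that assumption (zone size and scroll speed are illustrative values, not from the patent):

```python
def scroll_offset(gaze_y, screen_h, offset, zone_frac=0.2, speed=30):
    """Advance a scroll offset when the gaze rests in the top or bottom
    zone of the display: looking near the bottom scrolls down, near the
    top scrolls up (clamped at the start of the content)."""
    if gaze_y > screen_h * (1 - zone_frac):
        return offset + speed           # gaze near bottom: scroll down
    if gaze_y < screen_h * zone_frac:
        return max(0, offset - speed)   # gaze near top: scroll up
    return offset                       # gaze mid-screen: hold position
```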
-
Publication number: 20150138079
Abstract: According to the invention, a method for changing a display based at least in part on a gaze point of a user on the display is disclosed. The method may include receiving information identifying a location of the gaze point of the user on the display. The method may also include, based at least in part on the location of the gaze point, causing a virtual camera perspective to change, thereby causing content on the display associated with the virtual camera to change.
Type: Application
Filed: January 20, 2015
Publication date: May 21, 2015
Inventor: Rebecka Lannsjö
-
Publication number: 20140247232
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A computer can enlarge a portion of a display adjacent a first gaze target in response to detecting a first action (e.g., pressing a touchpad). The computer can then allow a user to position a second gaze target in the enlarged portion (e.g., by looking at the desired location) and perform a second action in order to perform a computer function at that location. The enlarging can allow a user to identify a desired location for a computer function (e.g., selecting an icon) with greater precision.Type: Application
Filed: March 3, 2014
Publication date: September 4, 2014
Applicant: TOBII TECHNOLOGY AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö
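The two-step targeting in the abstract above works because a second gaze point picked inside a magnified view maps back to a finer-grained coordinate on the original display. A geometric sketch of that mapping (function name and the zoom/radius defaults are hypothetical):

```python
def refine_target(first_gaze, second_gaze, zoom=4.0):
    """Two-step gaze selection: the region around the first gaze target is
    shown enlarged by `zoom`, centred on that target; the second gaze point,
    picked inside the enlarged view, is mapped back to original display
    coordinates, dividing the effective pointing error by `zoom`."""
    fx, fy = first_gaze
    sx, sy = second_gaze
    # Undo the magnification about the first target's centre.
    return (fx + (sx - fx) / zoom, fy + (sy - fy) / zoom)
```

A 40-pixel gaze offset in the enlarged view thus resolves to a 10-pixel offset on the underlying display, which is the precision gain the abstract describes.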
-
Publication number: 20140247215
Abstract: A computer system can be controlled with non-contact inputs, such as eye-tracking devices. A visual indicator can be presented on a display to indicate the location where a computer function will take place (e.g., a common cursor). The visual indicator can be moved to a gaze target in response to continued detection of an action (e.g., touchpad touch) by a user for a predetermined period of time. The delay between the action and the movement of the visual indicator can allow a user time to “abort” movement of the visual indicator. Additionally, once the visual indicator has moved, the visual indicator can be controlled with additional precision as the user moves gaze while continuing the action (e.g., continued holding of the touchpad).
Type: Application
Filed: March 3, 2014
Publication date: September 4, 2014
Applicant: Tobii Technology AB
Inventors: Erland George-Svahn, David Figgins Henderek, Rebecka Lannsjö, Mårten Skogö, John Elvesjö