Patents by Inventor Niklas Blomqvist

Niklas Blomqvist has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11917126
    Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
    Type: Grant
    Filed: February 24, 2023
    Date of Patent: February 27, 2024
    Assignee: Tobii AB
    Inventors: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
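    The calibration idea in the abstract above can be pictured with a short sketch: the distance from the image sensor to the lens marker is inferred from the marker's apparent size in the image, and a calibration parameter is rescaled accordingly. This is an illustration only, assuming a simple pinhole-camera model; the function and parameter names (gaze_gain, nominal_distance_mm) are invented and the patent does not prescribe this particular model or update rule.

        # Hypothetical sketch: pinhole-camera estimate of sensor-to-marker distance,
        # followed by a simple rescaling of one (invented) calibration parameter.

        def estimate_marker_distance(marker_pixel_width: float,
                                     marker_real_width_mm: float,
                                     focal_length_px: float) -> float:
            """Pinhole model: distance = focal_length * real_width / apparent_width."""
            return focal_length_px * marker_real_width_mm / marker_pixel_width

        def update_calibration(params: dict, distance_mm: float,
                               nominal_distance_mm: float = 12.0) -> dict:
            """Scale a gaze-mapping gain by the ratio of measured to nominal lens distance."""
            updated = dict(params)
            updated["gaze_gain"] = params["gaze_gain"] * (distance_mm / nominal_distance_mm)
            return updated

        params = {"gaze_gain": 1.0}
        distance = estimate_marker_distance(marker_pixel_width=42.0,
                                            marker_real_width_mm=2.0,
                                            focal_length_px=300.0)
        params = update_calibration(params, distance)   # distance ~14.3 mm -> gain ~1.19
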
  • Publication number: 20230217008
    Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
    Type: Application
    Filed: February 24, 2023
    Publication date: July 6, 2023
    Applicant: Tobii AB
    Inventors: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
  • Patent number: 11662595
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Grant
    Filed: May 23, 2022
    Date of Patent: May 30, 2023
    Assignee: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
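    The switching rule described in the abstract above reduces to a small decision: stay with the eye tracker unless the pointing ray of the second modality currently intersects the interactable region. A minimal sketch, assuming the region is an axis-aligned rectangle on the display plane and the ray has already been projected to a 2D hit point; the geometry and all names are illustrative, not Tobii's implementation.

        from dataclasses import dataclass
        from typing import Optional, Tuple

        @dataclass
        class Region:
            x: float
            y: float
            width: float
            height: float

            def contains(self, px: float, py: float) -> bool:
                return (self.x <= px <= self.x + self.width
                        and self.y <= py <= self.y + self.height)

        EYE_TRACKER, POINTER = "eye_tracker", "pointer"

        def select_modality(current: str, region: Region,
                            pointer_hit: Optional[Tuple[float, float]]) -> str:
            """Switch to the pointing modality only while its ray intersects the region."""
            if pointer_hit is not None and region.contains(*pointer_hit):
                return POINTER
            return current  # otherwise keep the currently selected modality

        region = Region(x=0, y=0, width=400, height=300)
        print(select_modality(EYE_TRACKER, region, pointer_hit=(120.0, 80.0)))  # -> pointer
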
  • Patent number: 11622103
    Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: April 4, 2023
    Assignee: Tobii AB
    Inventors: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
  • Publication number: 20220349982
    Abstract: A method performed by a system (120) for determining a position of a device (140) connected to a communication network (100) is disclosed. The method comprises obtaining a geographical area for the device based on a first position-estimation service, obtaining information on luminosity on the device over a time period, and determining a second position estimation for the device (140) based on the geographical area and on the information on luminosity, by comparing the information on luminosity over the time period to a 3D model of the geographical area, the 3D model comprising models of 3D objects of the geographical area and a model of sunlight shining onto the models of the 3D objects over the time period. Also disclosed are a corresponding system and a position-determining method performed by a communication device.
    Type: Application
    Filed: September 24, 2019
    Publication date: November 3, 2022
    Inventors: Alberto Gonzalez Escudero, Niklas Blomqvist, Ulf Händel
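    The matching step in this abstract can be illustrated with a toy sketch: given the coarse area as a set of candidate positions and a time series of measured luminosity, choose the candidate whose predicted sun/shadow pattern best fits the measurements. The predictor below is a stand-in; in the patent the prediction would come from the 3D model (e.g., sunlight evaluated against building geometry), and all names here are hypothetical.

        from typing import Callable, Sequence, Tuple

        Position = Tuple[float, float]  # (latitude, longitude) of a candidate point

        def best_position(candidates: Sequence[Position],
                          measured_lux: Sequence[float],
                          predict_lux: Callable[[Position, int], float]) -> Position:
            """Pick the candidate minimizing squared error between predicted and measured luminosity."""
            def fit_error(pos: Position) -> float:
                return sum((predict_lux(pos, t) - m) ** 2
                           for t, m in enumerate(measured_lux))
            return min(candidates, key=fit_error)

        # Toy usage: a fake predictor in which one candidate sits in shadow at t=1.
        def predict_lux(pos: Position, t: int) -> float:
            return 200.0 if (pos == (59.33, 18.06) and t == 1) else 10000.0

        candidates = [(59.33, 18.06), (59.34, 18.07)]
        measured = [10000.0, 200.0, 10000.0]                     # a dip in luminosity at t=1
        print(best_position(candidates, measured, predict_lux))  # -> (59.33, 18.06)
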
  • Publication number: 20220326536
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Application
    Filed: May 23, 2022
    Publication date: October 13, 2022
    Applicant: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
  • Patent number: 11366329
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Grant
    Filed: April 30, 2021
    Date of Patent: June 21, 2022
    Assignee: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
  • Publication number: 20220011584
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Application
    Filed: April 30, 2021
    Publication date: January 13, 2022
    Applicant: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
  • Publication number: 20210342000
    Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
    Type: Application
    Filed: June 1, 2021
    Publication date: November 4, 2021
    Applicant: Tobii AB
    Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
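    The core behaviour in this abstract is a gate on outgoing communication data: it is forwarded to the second user's device only while the first user's gaze falls on that user's on-screen representation. A minimal sketch with placeholder names; the rectangle test and the send_to callback are illustrative assumptions, not an actual Tobii API.

        from typing import Callable, Tuple

        def maybe_transmit(gaze_xy: Tuple[float, float],
                           representation_rect: Tuple[float, float, float, float],
                           data: bytes,
                           send_to: Callable[[bytes], None]) -> bool:
            """Forward data only while the gaze point lies inside the representation's rectangle."""
            x, y = gaze_xy
            rx, ry, rw, rh = representation_rect
            if rx <= x <= rx + rw and ry <= y <= ry + rh:
                send_to(data)      # gaze rests on the second user's representation: transmit
                return True
            return False           # otherwise the data is held back

        sent = []
        maybe_transmit((150.0, 90.0), (100.0, 50.0, 200.0, 120.0),
                       b"hello", send_to=sent.append)   # gaze inside the rectangle: appends b"hello"
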
  • Patent number: 11023040
    Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding whether a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: June 1, 2021
    Assignee: Tobii AB
    Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
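    The shared-attention rule in this abstract amounts to a simple condition: content on the first display keeps updating while both users look at their respective displays, and pauses when the second user's gaze leaves the second display. A sketch of that rule; the case where only the first user looks away is not specified in the abstract and is treated here as paused, which is an assumption.

        def content_state(first_user_on_first_display: bool,
                          second_user_on_second_display: bool) -> str:
            if not second_user_on_second_display:
                return "paused"    # remote gaze left its display: pause content on the first display
            if first_user_on_first_display:
                return "playing"   # both users attending: keep updating the content
            return "paused"        # case not covered by the abstract; assumed paused

        print(content_state(True, True))    # -> playing
        print(content_state(True, False))   # -> paused
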
  • Patent number: 10983359
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Grant
    Filed: December 11, 2019
    Date of Patent: April 20, 2021
    Assignee: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
  • Publication number: 20210109591
    Abstract: An augmented reality, virtual reality, or other wearable apparatus comprises an eye tracking device comprising an image sensor, a lens, and one or more processors. In some embodiments, the lens comprises a marker, and the one or more processors are configured to receive an image from the image sensor, wherein the image shows the marker, determine a distance from the image sensor to the marker based on the image, and change a calibration parameter of an eye tracking algorithm based on the distance. In some embodiments, the one or more processors are configured to receive image data from the image sensor, wherein the image data corresponds to an image as observed through the lens, determine a level or pattern of pincushion distortion in the image based on the image data, and change a calibration parameter of an eye tracking algorithm based on the level or the pattern of pincushion distortion.
    Type: Application
    Filed: February 15, 2019
    Publication date: April 15, 2021
    Applicant: Tobii AB
    Inventors: Jonas Andersson, Anders Clausen, Richard Hainzl, Anders Kingbäck, Simon Olin, Mark Ryan, Daniel Tornéus, Björn Nutti, Torbjörn Sundberg, Catarina Tidbeck, Ralf Biedert, Niklas Blomqvist, Dennis Rådell, Robin Thunström
  • Patent number: 10928895
    Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
    Type: Grant
    Filed: September 19, 2018
    Date of Patent: February 23, 2021
    Assignee: Tobii AB
    Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
  • Publication number: 20200225494
    Abstract: Disclosed is a method for switching user input modality of a displaying device displaying an interactable region. The displaying device is in communication with a first input modality and a second input modality. The first input modality is an eye tracker configured to determine gaze data of a user and the second input modality is an input modality other than an eye tracker configured to determine a pointing ray. The first input modality is selected as the input modality of the displaying device. The method comprises determining whether the pointing ray of the second input modality is intersecting with the interactable region. The method further comprises, based on the determining, switching the input modality of the displaying device to the second input modality when the pointing ray of the second input modality is intersecting with the interactable region.
    Type: Application
    Filed: December 11, 2019
    Publication date: July 16, 2020
    Applicant: Tobii AB
    Inventors: Niklas Blomqvist, Robin Thunström, Dennis Rådell, Ralf Biedert
  • Publication number: 20190086999
    Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device displays a representation of a second user on a display of the first computing device. Further, the first computing device receives, from the first user, communication data generated by an input device. The first computing device determines if the gaze direction of the first user is directed to the representation of the second user. If the gaze direction of the first user is directed to the representation of the second user, the first computing device transmits the communication data to a second computing device of the second user.
    Type: Application
    Filed: September 19, 2018
    Publication date: March 21, 2019
    Applicant: Tobii AB
    Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist
  • Publication number: 20190087000
    Abstract: Techniques for interacting with a first computing device based on gaze information are described. In an example, the first computing device captures a gaze direction of a first user of the first computing device by using an eye tracking device. The first computing device determines if the gaze direction of the first user is directed to a first display. Further, the first computing device receives information regarding whether a gaze direction of a second user is directed to a second display. If the gaze direction of the first user is directed to the first display and the gaze direction of the second user is directed to the second display, the first computing device continuously updates content on the first display. If the gaze direction of the second user is not directed to the second display, the first computing device pauses the content on the first display.
    Type: Application
    Filed: September 19, 2018
    Publication date: March 21, 2019
    Applicant: Tobii AB
    Inventors: Daniel Ricknäs, Erland George-Svahn, Rebecka Lannsjö, Andrew Ratcliff, Regimantas Vegele, Geoffrey Cooper, Niklas Blomqvist