Patents by Inventor Jonas Högström

Jonas Högström has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11943420
    Abstract: A user monitoring system receives a first data stream from a first recording device and a second data stream from a second recording device. Each of the first data stream and the second data stream includes data relating to an eye of the user. The first data stream and the second data stream overlap temporally. The system processes the first data stream to determine a first blink sequence of the user, processes the second data stream to determine a second blink sequence of the user, and compares the first blink sequence and the second blink sequence to detect a blink pattern present in both the first blink sequence and the second blink sequence. The system determines a temporal offset of the first data stream and the second data stream by comparing respective positions of the blink pattern in the first data stream and the second data stream.
    Type: Grant
    Filed: June 30, 2022
    Date of Patent: March 26, 2024
    Assignee: Tobii AB
    Inventors: Jonas Högström, Erik Alsmyr
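    The temporal-alignment idea in the abstract above can be illustrated with a short cross-correlation sketch. This is a minimal, hypothetical Python illustration rather than the patented method: it assumes each stream has already been reduced to a per-sample 0/1 blink sequence at a known sample rate, and the function name and test data are invented for the example.

      import numpy as np

      def estimate_offset_seconds(blinks_a, blinks_b, sample_rate_hz):
          """Estimate how many seconds stream B lags behind stream A.

          blinks_a, blinks_b: 1-D arrays of 0/1 samples, where 1 marks a detected blink.
          A positive result means B's blink pattern occurs later than A's.
          """
          # Remove the mean so long stretches of zeros do not dominate the correlation.
          a = blinks_a - blinks_a.mean()
          b = blinks_b - blinks_b.mean()
          # Cross-correlate B against A over all lags; the peak marks the best alignment.
          corr = np.correlate(b, a, mode="full")
          lag_samples = int(np.argmax(corr)) - (len(a) - 1)
          return lag_samples / sample_rate_hz

      # Illustration: B repeats A's blink pattern 5 samples (0.05 s at 100 Hz) later.
      a = np.zeros(200); a[[20, 60, 62, 120]] = 1.0
      b = np.zeros(200); b[[25, 65, 67, 125]] = 1.0
      print(estimate_offset_seconds(a, b, 100.0))  # -> 0.05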
  • Patent number: 11941170
    Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
    Type: Grant
    Filed: March 17, 2022
    Date of Patent: March 26, 2024
    Assignee: Tobii AB
    Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
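    The calibration flow described above (shrink the video, move it to a calibration position, record gaze data, then enlarge it again) can be sketched as a simple loop. Everything below is an invented stand-in for illustration; the VideoView and FakeTracker classes are placeholders, not part of Tobii's or any other real eye-tracking API.

      from dataclasses import dataclass
      from typing import Tuple

      # Minimal stand-ins so the sketch runs; a real system would drive an actual
      # video widget and an eye-tracker SDK instead of these placeholders.
      @dataclass
      class VideoView:
          size: Tuple[int, int]
          position: Tuple[int, int]
          def resize(self, size): self.size = size
          def move_to(self, position): self.position = position

      class FakeTracker:
          def record_calibration_data(self, target):
              # Pretend the recorded gaze lands exactly on the shown target.
              return {"target": target, "gaze": target}

      def calibrate(video, tracker, calibration_points, start_size, calib_size, enlarged_size):
          """Sequentially shrink, move, record, and enlarge the video for each calibration point."""
          video.resize(start_size)          # present the video at its start size and position
          samples = []
          for point in calibration_points:
              video.resize(calib_size)      # calibration size is smaller than the start size
              video.move_to(point)          # translate the video to the calibration position
              samples.append(tracker.record_calibration_data(point))
              video.resize(enlarged_size)   # then grow the video beyond the start size
          return samples

      print(calibrate(VideoView((640, 360), (0, 0)), FakeTracker(),
                      [(100, 100), (960, 540), (1820, 980)],
                      start_size=(640, 360), calib_size=(160, 90), enlarged_size=(800, 450)))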
  • Publication number: 20220317768
    Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
    Type: Application
    Filed: March 17, 2022
    Publication date: October 6, 2022
    Applicant: Tobii AB
    Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
  • Patent number: 11269405
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: March 8, 2022
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
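    As a rough sketch of the two determinations in the abstract above, the snippet below uses OpenCV template matching as a stand-in for detecting that a scene image contains at least a portion of a predefined image, then maps a gaze direction given as normalized scene-camera coordinates onto that scene image. The threshold, the normalized-coordinate convention, and all names here are assumptions made for the example, not details taken from the patent.

      import cv2
      import numpy as np

      def scene_contains_template(scene_gray, template_gray, threshold=0.8):
          """Return True if the predefined template appears in the scene frame."""
          result = cv2.matchTemplate(scene_gray, template_gray, cv2.TM_CCOEFF_NORMED)
          _, max_val, _, _ = cv2.minMaxLoc(result)
          return max_val >= threshold

      def gaze_point_on_scene(gaze_normalized, scene_shape):
          """Map a gaze direction expressed as normalized (x, y) in [0, 1] to scene pixels."""
          h, w = scene_shape[:2]
          return int(gaze_normalized[0] * w), int(gaze_normalized[1] * h)

      # Synthetic example: paste a random patch into an empty scene and look for it.
      rng = np.random.default_rng(0)
      template = rng.integers(0, 256, size=(60, 60)).astype(np.uint8)
      scene = np.zeros((480, 640), dtype=np.uint8)
      scene[100:160, 200:260] = template
      if scene_contains_template(scene, template):
          print(gaze_point_on_scene((0.5, 0.5), scene.shape))  # -> (320, 240)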
  • Publication number: 20200192472
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: August 31, 2017
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20200019239
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: June 25, 2019
    Publication date: January 16, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10331209
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: December 11, 2018
    Date of Patent: June 25, 2019
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190121429
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2018
    Publication date: April 25, 2019
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10228763
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: September 12, 2018
    Date of Patent: March 12, 2019
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10185394
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: December 11, 2017
    Date of Patent: January 22, 2019
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190011986
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: September 12, 2018
    Publication date: January 10, 2019
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10114459
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 10, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180101228
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2017
    Publication date: April 12, 2018
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180088668
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: November 10, 2017
    Publication date: March 29, 2018
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9870051
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: January 16, 2018
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20170364150
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: August 31, 2017
    Publication date: December 21, 2017
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9829976
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 30, 2015
    Date of Patent: November 28, 2017
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20170038835
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: November 30, 2015
    Publication date: February 9, 2017
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20140011267
    Abstract: A disintegrating system for treatment of organic material may include multiple disintegrating units, each having an inlet for receiving material, and an outlet for outputting treated material. A first inlet of a first disintegrating unit may be configured to receive organic material, and a first feedback pipe may be connected between the outlet of the first disintegrating unit and the inlet of a second disintegrating unit. An outflow of the disintegrating system may be connected to the outlet of at least the first disintegrating unit, wherein the sum of the introduced material is available at the outflow.
    Type: Application
    Filed: September 9, 2013
    Publication date: January 9, 2014
    Applicant: SCANDINAVIAN BIOGAS FUELS AB
    Inventors: Jörgen Ejlertsson, Jonas Högström, Mark Easingwood
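    To make the "sum of the introduced material is available at the outflow" statement above concrete, here is a toy steady-state flow calculation. The two-unit recycle topology, the recycle fraction, and the feed rates are one possible reading invented for illustration; they are not taken from the application.

      def simulate_outflow(feed_a, feed_b, recycle_fraction, steps=1000):
          """Iterate a simple two-unit recycle loop and return the outflow rate it settles at."""
          unit2_out = 0.0
          outflow = 0.0
          for _ in range(steps):
              unit1_out = feed_a + unit2_out           # unit 1 treats fresh feed A plus unit 2's output
              recycled = recycle_fraction * unit1_out  # part of unit 1's outlet returns via the feedback pipe
              outflow = unit1_out - recycled           # the remainder is available at the system outflow
              unit2_out = feed_b + recycled            # unit 2 treats fresh feed B plus the recycled stream
          return outflow

      # At steady state the outflow equals the sum of the introduced material (2.0 + 3.0).
      print(round(simulate_outflow(feed_a=2.0, feed_b=3.0, recycle_fraction=0.4), 6))  # -> 5.0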
  • Publication number: 20100311155
    Abstract: The present invention relates to a disintegrating system for treatment of organic material comprising: multiple disintegrating units (31, 32; 41, 42; 51, 52, 53) each having an inlet (33, 34; 43, 44; 55, 56, 57) for receiving material A, B, C, and an outlet (37, 38; 47, 48; 61, 62, 63) for outputting treated material. A first inlet (33; 43; 55) of a first disintegrating unit (31; 41; 51) is configured to receive organic material A, and a first feedback pipe (35; 45; 58, 59) is connected between the outlet (37; 47; 61) of the first disintegrating unit and the inlet (34; 44; 56) of a second disintegrating unit. An outflow (39; 49; 64) of the disintegrating system (30; 40; 50) is connected to the outlet (37; 47; 61) of at least the first disintegrating unit (31; 41; 51), wherein the sum of the introduced material A, B, C is available at the outflow (39; 49; 64).
    Type: Application
    Filed: December 4, 2008
    Publication date: December 9, 2010
    Inventors: Jörgen Ejlertsson, Jonas Högström, Mark Easingwood