Patents by Inventor Jonas Högström

Jonas Högström has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240167708
    Abstract: The present invention relates to a computer-implemented method for controlling the temperature and humidity of an airstream. The method comprises receiving parameters indicative of the temperature and water content of an airstream in a downstream section and of the temperature and water content of a first medium in the system; determining, in processing circuitry, a desired temperature change and a desired water content change of the first medium as a first function f1 based on the received parameters and on a second function that defines a relationship between the air temperature and the air water content as co-dependent variables; and generating first and second control signals configured to apply the desired temperature change and the desired water content change to the first medium. The invention also relates to a corresponding system.
    Type: Application
    Filed: March 21, 2022
    Publication date: May 23, 2024
    Inventors: Fredrik EDSTRÖM, Per DAHLBÄCK, Jonas HÖGSTRÖM
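The control scheme in the abstract above can be sketched as follows: a second function relates air temperature and water content as co-dependent variables, and a first function f1 derives the desired temperature and water content changes from the measured parameters. This is a minimal illustrative sketch; the linear relation, the proportional control law, and all names (`AirState`, `control_changes`) are assumptions, not the patent's actual formulas.

```python
from dataclasses import dataclass

@dataclass
class AirState:
    temperature_c: float   # temperature in degrees Celsius
    water_content: float   # water content, e.g. g water per kg dry air

def target_water_content(temperature_c: float) -> float:
    # Stand-in for the second function: an illustrative linear relationship
    # between air temperature and air water content as co-dependent variables.
    return 4.0 + 0.3 * temperature_c

def control_changes(air: AirState, setpoint_c: float) -> tuple[float, float]:
    # Stand-in for the first function f1 (simple proportional control on the
    # downstream air error); a real controller would also use the measured
    # state of the medium that the changes are applied to.
    d_temp = setpoint_c - air.temperature_c
    d_water = target_water_content(setpoint_c) - air.water_content
    return d_temp, d_water

# The two returned values would drive the first and second control signals.
d_t, d_w = control_changes(AirState(22.0, 9.0), setpoint_c=21.0)
```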
  • Patent number: 11941170
    Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
    Type: Grant
    Filed: March 17, 2022
    Date of Patent: March 26, 2024
    Assignee: Tobii AB
    Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
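The calibration loop described above (shrink the video, move it to a calibration position, record gaze data, then enlarge it again) can be sketched as pseudocode-style Python. The `Video` and `Tracker` classes are toy stand-ins invented for illustration, not Tobii's API.

```python
class Video:
    """Toy stand-in for the video shown on the display (assumption)."""
    def __init__(self, size: int, position: tuple = (0, 0)):
        self.size, self.position = size, position
    def resize(self, size: int):
        self.size = size
    def move_to(self, position: tuple):
        self.position = position

class Tracker:
    """Toy stand-in for the eye tracking device's image sensor (assumption)."""
    def record(self):
        return "gaze-sample"   # placeholder for recorded calibration data

def calibrate(video, tracker, positions, calibration_size, enlarged_size):
    # For each calibration position: shrink the video below its start size,
    # translate it there, record gaze data while the user watches it, then
    # grow the video to a size greater than the start size.
    samples = {}
    for position in positions:
        video.resize(calibration_size)
        video.move_to(position)
        samples[position] = tracker.record()
        video.resize(enlarged_size)
    return samples
```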
  • Patent number: 11943420
    Abstract: A user monitoring system receives a first data stream from a first recording device and a second data stream from a second recording device. Each of the first data stream and the second data stream include data relating to an eye of the user. The first data stream and the second data stream overlap temporally. The system processes the first data stream to determine a first blink sequence of the user, processes the second data stream to determine a second blink sequence of the user, and compares the first blink sequence and the second blink sequence to detect a blink pattern present in both the first blink sequence and the second blink sequence. The system determines a temporal offset of the first data stream and the second data stream by comparing respective positions of the blink pattern in the first data stream and the second data stream.
    Type: Grant
    Filed: June 30, 2022
    Date of Patent: March 26, 2024
    Assignee: Tobii AB
    Inventors: Jonas Högström, Erik Alsmyr
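The offset-estimation step above can be illustrated by representing each blink sequence as a binary sample series and sliding one against the other until the blink patterns line up. This is a sketch of the comparison idea only, assuming sequences sampled at the same rate; it is not Tobii's actual implementation.

```python
def blink_offset(seq_a, seq_b, max_shift):
    # Estimate the temporal offset (in samples) between two binary blink
    # sequences by finding the shift that maximises the number of blinks
    # in seq_a that coincide with blinks in seq_b.
    best_shift, best_score = 0, -1
    for shift in range(-max_shift, max_shift + 1):
        score = sum(
            1
            for i, a in enumerate(seq_a)
            if a and 0 <= i + shift < len(seq_b) and seq_b[i + shift]
        )
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

# Toy data: the blink pattern in b lags the one in a by two samples.
a = [0, 1, 0, 0, 1, 1, 0, 0, 1, 0]
b = [0, 0, 0, 1, 0, 0, 1, 1, 0, 0]
offset = blink_offset(a, b, max_shift=3)   # → 2
```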
  • Publication number: 20230007224
    Abstract: A user monitoring system receives a first data stream from a first recording device and a second data stream from a second recording device. Each of the first data stream and the second data stream include data relating to an eye of the user. The first data stream and the second data stream overlap temporally. The system processes the first data stream to determine a first blink sequence of the user, processes the second data stream to determine a second blink sequence of the user, and compares the first blink sequence and the second blink sequence to detect a blink pattern present in both the first blink sequence and the second blink sequence. The system determines a temporal offset of the first data stream and the second data stream by comparing respective positions of the blink pattern in the first data stream and the second data stream.
    Type: Application
    Filed: June 30, 2022
    Publication date: January 5, 2023
    Inventors: Jonas HÖGSTRÖM, Erik ALSMYR
  • Publication number: 20220317768
    Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
    Type: Application
    Filed: March 17, 2022
    Publication date: October 6, 2022
    Applicant: Tobii AB
    Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
  • Patent number: 11269405
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: March 8, 2022
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
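The pipeline this abstract describes (pair each scene image with the gaze direction captured at the same time, check whether the image contains a predefined image, and map the gaze direction to a gaze point on the image) can be sketched as below. The substring "image matching" and all function names are illustrative assumptions standing in for real computer-vision steps.

```python
def contains_predefined(scene_image: str, predefined: str) -> bool:
    # Stand-in for detecting a predefined image inside a scene image;
    # here a substring search on a toy string "image" for illustration.
    return predefined in scene_image

def gaze_point(direction: tuple, width: int, height: int) -> tuple:
    # Map a normalised gaze direction (x, y in [0, 1]) to pixel coordinates
    # on the scene image.
    gx, gy = direction
    return round(gx * (width - 1)), round(gy * (height - 1))

def process(scene_images, gaze_directions, predefined, width, height):
    # For each scene image and its time-matched gaze direction: keep only
    # images containing the predefined image, and compute the gaze point.
    hits = []
    for image, direction in zip(scene_images, gaze_directions):
        if contains_predefined(image, predefined):
            hits.append(gaze_point(direction, width, height))
    return hits
```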
  • Publication number: 20200192472
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: August 31, 2017
    Publication date: June 18, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20200019239
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: June 25, 2019
    Publication date: January 16, 2020
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10331209
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: December 11, 2018
    Date of Patent: June 25, 2019
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190121429
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2018
    Publication date: April 25, 2019
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10228763
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: September 12, 2018
    Date of Patent: March 12, 2019
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10185394
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: December 11, 2017
    Date of Patent: January 22, 2019
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20190011986
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: September 12, 2018
    Publication date: January 10, 2019
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 10114459
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 10, 2017
    Date of Patent: October 30, 2018
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180101228
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: December 11, 2017
    Publication date: April 12, 2018
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20180088668
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: November 10, 2017
    Publication date: March 29, 2018
    Applicant: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9870051
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: January 16, 2018
    Assignee: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20170364150
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: August 31, 2017
    Publication date: December 21, 2017
    Applicant: Tobii AB
    Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Patent number: 9829976
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Grant
    Filed: November 30, 2015
    Date of Patent: November 28, 2017
    Assignee: Tobii AB
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
  • Publication number: 20170038835
    Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
    Type: Application
    Filed: November 30, 2015
    Publication date: February 9, 2017
    Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong