Patents by Inventor Jonas Högström
Jonas Högström has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240167708
Abstract: The present invention relates to a computer-implemented method for controlling temperature and humidity of an airstream, the method comprising receiving parameters indicative of a temperature and water content of an airstream in a downstream section and of a temperature and water content of a medium in the system, and further determining in processing circuitry a desired temperature change and desired water content change of the first medium as a first function f1 based on the received parameters and also based on a second function that defines a relationship between the air temperature and the air water content as co-dependent variables; and also generating a first and second control signal configured to apply the desired temperature change and the desired water content change to the first medium. The invention also relates to a corresponding system.
Type: Application
Filed: March 21, 2022
Publication date: May 23, 2024
Inventors: Fredrik EDSTRÖM, Per DAHLBÄCK, Jonas HÖGSTRÖM
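The abstract above describes deriving a desired temperature change and water content change from received parameters (a first function f1), constrained by a second function f2 that ties air temperature and air water content together as co-dependent variables. The sketch below is only a minimal illustration of that structure: the function names, the saturation-curve stand-in for f2, and the simple difference-based control law are assumptions for illustration, not details taken from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class AirState:
    temperature_c: float       # air temperature in degrees Celsius
    water_content_g_kg: float  # water content in grams of water per kg of dry air

def saturation_water_content(temperature_c: float) -> float:
    """Stand-in for the second function f2: relates air temperature and air water
    content as co-dependent variables (here, an approximate saturation curve)."""
    # Magnus-style approximation of saturation vapour pressure, in hPa
    svp = 6.112 * math.exp(17.62 * temperature_c / (243.12 + temperature_c))
    # Approximate saturation mixing ratio at ~1013 hPa total pressure, in g/kg
    return 622.0 * svp / (1013.25 - svp)

def desired_changes(downstream: AirState, setpoint: AirState) -> tuple[float, float]:
    """Stand-in for the first function f1: from the measured downstream air state
    and a setpoint, derive a desired temperature change and water content change,
    keeping the water target at or below saturation for the target temperature."""
    bounded_water = min(setpoint.water_content_g_kg,
                        saturation_water_content(setpoint.temperature_c))
    return (setpoint.temperature_c - downstream.temperature_c,
            bounded_water - downstream.water_content_g_kg)

# The two returned values would drive the first and second control signals.
d_temp, d_water = desired_changes(AirState(18.0, 7.5), AirState(21.0, 8.0))
print(f"apply a temperature change of {d_temp:.1f} C and {d_water:.2f} g/kg")
```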
-
Patent number: 11941170
Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
Type: Grant
Filed: March 17, 2022
Date of Patent: March 26, 2024
Assignee: Tobii AB
Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
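The calibration sequence in the abstract above (shrink the video below its start size, move it to a calibration position, record gaze data while the user watches, then grow it past the start size) can be outlined as a simple loop. The Video and EyeTracker interfaces and the 1.2x grow factor below are hypothetical stand-ins, not Tobii APIs; only the ordering of steps follows the abstract.

```python
from typing import Protocol

class Video(Protocol):
    def resize(self, width: int, height: int) -> None: ...
    def move_to(self, x: int, y: int) -> None: ...

class EyeTracker(Protocol):
    def record_calibration_sample(self, x: int, y: int) -> None: ...

def run_video_calibration(video: Video, tracker: EyeTracker,
                          start_size: tuple[int, int],
                          calibration_size: tuple[int, int],
                          calibration_points: list[tuple[int, int]]) -> None:
    """Sequentially visit each calibration position: shrink the video (smaller
    than its start size), translate it to the position, record gaze data while
    the user views it, then resize it to a size greater than the start size."""
    grow_size = (int(start_size[0] * 1.2), int(start_size[1] * 1.2))  # assumed factor
    for (x, y) in calibration_points:
        video.resize(*calibration_size)            # calibration size < start size
        video.move_to(x, y)                        # translate to calibration position
        tracker.record_calibration_sample(x, y)    # record data while the user watches
        video.resize(*grow_size)                   # second size > start size
```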
-
Patent number: 11943420
Abstract: A user monitoring system receives a first data stream from a first recording device and a second data stream from a second recording device. Each of the first data stream and the second data stream includes data relating to an eye of the user. The first data stream and the second data stream overlap temporally. The system processes the first data stream to determine a first blink sequence of the user, processes the second data stream to determine a second blink sequence of the user, and compares the first blink sequence and the second blink sequence to detect a blink pattern present in both the first blink sequence and the second blink sequence. The system determines a temporal offset of the first data stream and the second data stream by comparing respective positions of the blink pattern in the first data stream and the second data stream.
Type: Grant
Filed: June 30, 2022
Date of Patent: March 26, 2024
Assignee: Tobii AB
Inventors: Jonas Högström, Erik Alsmyr
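The core idea above is to synchronise two recordings of the same user by finding the temporal offset at which a blink pattern appears in both data streams. The sketch below assumes each stream has already been reduced to a per-frame binary blink sequence and scores candidate shifts by counting coinciding blinks; that representation and scoring are illustrative assumptions, not the patented method.

```python
def best_offset(blinks_a: list[int], blinks_b: list[int], max_shift: int) -> int:
    """Return the shift (in frames) of stream B relative to stream A that
    maximises the number of coinciding blinks (1 = eye closed in that frame)."""
    def score(shift: int) -> int:
        total = 0
        for i, a in enumerate(blinks_a):
            j = i + shift
            if 0 <= j < len(blinks_b) and a == blinks_b[j] == 1:
                total += 1
        return total
    return max(range(-max_shift, max_shift + 1), key=score)

# Example: the same blink pattern appears in stream B two frames later than in A.
a = [0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0]
b = [0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1]
print(best_offset(a, b, max_shift=5))  # -> 2 (temporal offset in frames)
```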
-
Publication number: 20230007224
Abstract: A user monitoring system receives a first data stream from a first recording device and a second data stream from a second recording device. Each of the first data stream and the second data stream includes data relating to an eye of the user. The first data stream and the second data stream overlap temporally. The system processes the first data stream to determine a first blink sequence of the user, processes the second data stream to determine a second blink sequence of the user, and compares the first blink sequence and the second blink sequence to detect a blink pattern present in both the first blink sequence and the second blink sequence. The system determines a temporal offset of the first data stream and the second data stream by comparing respective positions of the blink pattern in the first data stream and the second data stream.
Type: Application
Filed: June 30, 2022
Publication date: January 5, 2023
Inventors: Jonas HÖGSTRÖM, Erik ALSMYR
-
Publication number: 20220317768
Abstract: The invention is related to a method and system for calibrating an eye tracking device configured to track a gaze point of a user on a display. The method comprises: presenting a video on the display to a user, the video having a start size and a start position; tracking the gaze of the user, using an image sensor of the eye tracking device; and sequentially completing, for at least one calibration position, the steps of: resizing the video to a calibration size, wherein the calibration size is smaller than the start size, and translating the video to a calibration position; recording calibration data, using the eye tracking device, for the user viewing the video in the calibration position; and resizing the video to a second size that is greater than the start size.
Type: Application
Filed: March 17, 2022
Publication date: October 6, 2022
Applicant: Tobii AB
Inventors: Sergey Slobodenyuk, Mikkel Rasmussen, Andreas Jansson, Thomas Gaudy, Evgeniia Farkhutdinova, Jonas Högström, Richard Andersson
-
Patent number: 11269405
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: August 31, 2017
Date of Patent: March 8, 2022
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
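The abstract above (shared by the related patents and publications listed below) involves two determinations: whether a scene image from the outward-facing image sensor contains at least a portion of a predefined image, and where the wearer's gaze direction lands on that scene image. The sketch below shows one plausible way to do each step, using OpenCV template matching and a pinhole projection; both choices are assumptions for illustration and are not prescribed by the patent.

```python
import cv2
import numpy as np

def scene_contains_predefined(scene_bgr: np.ndarray, predefined_bgr: np.ndarray,
                              threshold: float = 0.8) -> bool:
    """Return True if the predefined image appears in the scene image
    (normalised cross-correlation score above the given threshold)."""
    result = cv2.matchTemplate(scene_bgr, predefined_bgr, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, _ = cv2.minMaxLoc(result)
    return max_score >= threshold

def gaze_point_on_scene(gaze_direction: np.ndarray,
                        camera_matrix: np.ndarray) -> tuple[float, float]:
    """Project a gaze direction vector, expressed in the scene-camera frame and
    assumed to point forward (positive z), onto the scene image plane using a
    pinhole camera model with intrinsics fx, fy, cx, cy."""
    x, y, z = gaze_direction / np.linalg.norm(gaze_direction)
    fx, fy = camera_matrix[0, 0], camera_matrix[1, 1]
    cx, cy = camera_matrix[0, 2], camera_matrix[1, 2]
    return (fx * x / z + cx, fy * y / z + cy)
```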
-
Publication number: 20200192472
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: August 31, 2017
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20200019239
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: June 25, 2019
Publication date: January 16, 2020
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10331209
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: December 11, 2018
Date of Patent: June 25, 2019
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20190121429
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: December 11, 2018
Publication date: April 25, 2019
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10228763
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: September 12, 2018
Date of Patent: March 12, 2019
Assignee: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10185394
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: December 11, 2017
Date of Patent: January 22, 2019
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20190011986
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: September 12, 2018
Publication date: January 10, 2019
Applicant: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 10114459
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: November 10, 2017
Date of Patent: October 30, 2018
Assignee: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20180101228
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: December 11, 2017
Publication date: April 12, 2018
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20180088668
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: November 10, 2017
Publication date: March 29, 2018
Applicant: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 9870051
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: August 31, 2017
Date of Patent: January 16, 2018
Assignee: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20170364150
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: August 31, 2017
Publication date: December 21, 2017
Applicant: Tobii AB
Inventors: André Lovtjärn, Jesper Högström, Jonas Högström, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Patent number: 9829976
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Grant
Filed: November 30, 2015
Date of Patent: November 28, 2017
Assignee: Tobii AB
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong
-
Publication number: 20170038835
Abstract: A method for determining correspondence between a gaze direction and an environment around a wearable device is disclosed. The wearable device may include an eye tracking device and an outward facing image sensor. The method may include receiving an input parameter and at least one scene image from the outward facing image sensor. The method may further include determining, with at least the eye tracking device, at least one gaze direction of a wearer of the wearable device at a point in time corresponding to when the scene image was captured by the outward facing image sensor. The method may additionally include determining, based at least in part on the input parameter, that a particular scene image includes at least a portion of a predefined image. The method may moreover include determining, based on the at least one gaze direction, at least one gaze point on the particular scene image.
Type: Application
Filed: November 30, 2015
Publication date: February 9, 2017
Inventors: André Algotsson, Anders Clausen, Jesper Högström, Jonas Högström, Tobias Lindgren, Rasmus Petersson, Mårten Skogö, Wilkey Wong