Patents by Inventor Fredrik Lindh
Fredrik Lindh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11709545
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Grant
Filed: January 27, 2020
Date of Patent: July 25, 2023
Assignee: Tobii AB
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennström, Andreas Edling
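The bitmap technique in this abstract can be sketched roughly as follows: rasterize each zone's projected footprint into a per-pixel ID map, then answer gaze queries with a single lookup. This is a minimal illustration, not the patented implementation; the `Zone` type, axis-aligned bounding boxes, and integer IDs are all assumptions made for brevity.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    zone_id: int    # the "property" stored per pixel (0 is reserved for "no zone")
    bbox: tuple     # projected screen-space extent (x0, y0, x1, y1), a simplification

def build_zone_bitmap(zones, width, height):
    """Rasterize each zone's projection into a bitmap of zone ids."""
    bitmap = [[0] * width for _ in range(height)]
    for zone in zones:
        x0, y0, x1, y1 = zone.bbox
        for y in range(max(0, y0), min(height, y1)):
            for x in range(max(0, x0), min(width, x1)):
                bitmap[y][x] = zone.zone_id
    return bitmap

def gazed_zone(bitmap, gaze_x, gaze_y):
    """Return the id of the zone under the detected gaze point, or 0 if none."""
    if 0 <= gaze_y < len(bitmap) and 0 <= gaze_x < len(bitmap[0]):
        return bitmap[gaze_y][gaze_x]
    return 0
```

The lookup is O(1) per gaze sample regardless of scene complexity, which is presumably the appeal of the bitmap representation over per-query geometric intersection tests.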
-
Patent number: 11688128
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Grant
Filed: January 11, 2022
Date of Patent: June 27, 2023
Assignee: Tobii AB
Inventor: Fredrik Lindh
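The confidence-scoring step described above can be sketched as a simple accumulation: each trace that hits a component contributes a weight, and the component with the highest total wins. The falloff function, the `(component, angular_offset)` hit representation, and all names below are illustrative assumptions; the patent does not specify this particular weighting.

```python
from collections import defaultdict

def focus_target(trace_hits):
    """Pick the component with the highest accumulated confidence.

    trace_hits: list of (component_id, angular_offset) pairs, one per line
    trace that intersected a component. Traces closer to the central gaze
    ray (smaller offset) contribute more confidence.
    """
    confidence = defaultdict(float)
    for component, offset in trace_hits:
        confidence[component] += 1.0 / (1.0 + offset)  # assumed falloff weight
    if not confidence:
        return None  # no trace hit anything
    return max(confidence, key=confidence.get)
```

Spreading multiple traces around the gaze ray makes the result robust to eye-tracker noise: a component need not lie exactly under the measured gaze direction to be identified as the focus target.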
-
Patent number: 11579687
Abstract: A method for determining a current gaze direction of a user in relation to a three-dimensional ("3D") scene, the 3D scene sampled by a rendering function to produce a two-dimensional ("2D") projection image of the 3D scene, the sampling performed based on a virtual camera in turn being associated with a camera position and camera direction in the 3D scene. The method includes determining, by a gaze direction detection means, a first gaze direction of the user related to the 3D scene at a first gaze time point. The method includes determining a time-dependent virtual camera 3D transformation representing a change of a virtual camera position and/or virtual camera direction between the first gaze time point and a second sampling time point. The method includes determining the current gaze direction as a modified gaze direction calculated based on the first gaze direction and an inverse of the time-dependent virtual camera 3D transformation.
Type: Grant
Filed: February 4, 2020
Date of Patent: February 14, 2023
Assignee: Tobii AB
Inventor: Fredrik Lindh
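The core idea of the abstract, compensating a slightly stale gaze sample for camera motion by applying the inverse of the camera's transformation, can be sketched in one dimension of rotation. This is a minimal illustration assuming the camera delta is a pure yaw rotation; the patent covers a general position and/or direction transformation.

```python
import math

def rotate_yaw(v, angle):
    """Rotate a 3D direction vector (x, y, z) about the vertical y axis."""
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def corrected_gaze(first_gaze_dir, camera_yaw_delta):
    """Apply the inverse of the camera's yaw change to the stale gaze direction,
    so the corrected gaze keeps pointing at the same scene content even though
    the camera moved between the gaze sample and the current frame."""
    return rotate_yaw(first_gaze_dir, -camera_yaw_delta)
```

Without this correction, any camera movement between the gaze time point and the render time point would make the gaze appear to land on whatever content has rotated into its old screen position.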
-
Publication number: 20220237849
Abstract: Method for reducing processor load in a system rendering a virtual scene to produce a rendered presentation of said virtual scene, which scene comprises at least one animated object, wherein the system performs said rendering based on said virtual scene which in turn is animated by the system based on a set of predefined animation rules, wherein the method comprises the steps: determining, based on information from a gaze direction detection means, a first zone or point of the virtual scene as a zone to which a gaze of the user is currently directed; determining a relative location or distance of a first object as a location in relation to said first zone or point; and modifying the value of an animation updating frequency of said first object performed by the system as a function of said determined relative location or distance. The invention also relates to a system and to a computer software function.
Type: Application
Filed: June 29, 2020
Publication date: July 28, 2022
Applicant: Tobii AB
Inventor: Fredrik Lindh
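The frequency-modification step can be sketched as a distance-to-rate mapping: objects near the gazed zone animate at full rate, peripheral ones are throttled. The distance bands and frequencies below are made-up tuning values chosen only to illustrate the shape of such a function.

```python
def animation_update_hz(object_pos, gaze_point, full_hz=60.0):
    """Map an object's 2D distance from the gazed point to an animation
    update frequency. Bands and rates are illustrative, not from the filing."""
    dx = object_pos[0] - gaze_point[0]
    dy = object_pos[1] - gaze_point[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance < 100:        # foveal region: animate at full rate
        return full_hz
    if distance < 300:        # near periphery: half rate
        return full_hz / 2
    return full_hz / 6        # far periphery: minimal rate
```

Since peripheral vision is poor at resolving detail, skipping animation updates for objects far from the gaze point reduces processor load with little perceptible difference to the user.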
-
Publication number: 20220130107
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Application
Filed: January 11, 2022
Publication date: April 28, 2022
Applicant: Tobii AB
Inventor: Fredrik Lindh
-
Patent number: 11270499
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Grant
Filed: February 20, 2020
Date of Patent: March 8, 2022
Assignee: Tobii AB
Inventor: Fredrik Lindh
-
Publication number: 20210011682
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Application
Filed: April 28, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
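The selection step this abstract describes, choosing audio content from the gaze point on the display, can be sketched as a region lookup. The rectangular regions and clip names are hypothetical; the filing covers the general idea of gaze-dependent audio content.

```python
def audio_for_gaze(gaze_point, regions):
    """Return the audio clip for the screen region containing the gaze point.

    regions: list of ((x0, y0, x1, y1), clip_name) pairs.
    Returns None when the gaze falls outside every region, which a caller
    might treat as "play default/ambient audio".
    """
    x, y = gaze_point
    for (x0, y0, x1, y1), clip in regions:
        if x0 <= x < x1 and y0 <= y < y1:
            return clip
    return None
```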
-
Publication number: 20210012559
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Application
Filed: February 20, 2020
Publication date: January 14, 2021
Applicant: Tobii AB
Inventor: Fredrik Lindh
-
Patent number: 10885882
Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
Type: Grant
Filed: December 6, 2018
Date of Patent: January 5, 2021
Assignee: Tobii AB
Inventors: Daan Pieter Nijs, Fredrik Lindh
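The per-pixel rule described above can be sketched for a single grayscale value: when a high-resolution pixel differs from its corresponding low-resolution sample by at least a threshold, pull it toward that sample so the updated difference is strictly smaller. The threshold, blend factor, and value range are illustrative assumptions.

```python
def reduce_aliasing(high_px, low_sample, threshold=0.2, blend=0.5):
    """Return the (possibly modified) high-resolution pixel value.

    Grayscale values in [0, 1]. If the pixel's difference from the
    corresponding low-res sample is >= threshold, blend it toward the
    sample; the new difference becomes (1 - blend) * old difference,
    which is smaller for any 0 < blend <= 1.
    """
    diff = abs(high_px - low_sample)
    if diff >= threshold:
        return high_px + blend * (low_sample - high_px)
    return high_px  # below threshold: leave the pixel untouched
```

The effect is to suppress high-frequency detail that disagrees strongly with the low-resolution version, which is where shimmering aliasing artifacts at the foveation boundary tend to appear.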
-
Publication number: 20200285311
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Application
Filed: January 27, 2020
Publication date: September 10, 2020
Applicant: Tobii AB
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennström, Andreas Edling
-
Publication number: 20200278746
Abstract: Method for determining a current gaze direction of a user in relation to a three-dimensional ("3D") scene, which 3D scene is sampled by a rendering function to produce a two-dimensional ("2D") projection image of the 3D scene, which sampling is performed based on a virtual camera in turn being associated with a camera position and camera direction in the 3D scene, wherein the method comprises the steps: determining, by a gaze direction detection means, a first gaze direction of the user at a first gaze time point, which first gaze direction is related to said 3D scene; determining a virtual camera 3D transformation, which 3D transformation represents a change of a virtual camera position and/or virtual camera direction between the first gaze time point and a second sampling time point, where the second sampling time point is later than the first gaze time point; and determining the said current gaze direction as a modified gaze direction, in turn calculated based on the first gaze direction, and further calculated based on an inverse of said virtual camera 3D transformation.
Type: Application
Filed: February 4, 2020
Publication date: September 3, 2020
Applicant: Tobii AB
Inventor: Fredrik Lindh
-
Publication number: 20200192625
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Application
Filed: October 8, 2019
Publication date: June 18, 2020
Applicant: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
-
Publication number: 20200184933
Abstract: According to the invention, a method for reducing aliasing artifacts in foveated rendering is disclosed. The method may include accessing a high resolution image and a low resolution image corresponding to the high resolution image, and calculating a difference between a pixel of the high resolution image and a sample associated with the low resolution image. The sample of the low resolution image corresponds to the pixel of the high resolution image. The method may further include modifying the pixel to generate a modified pixel of the high resolution image based on determining that the difference is higher than or equal to a threshold value. The modification may be made such that an updated difference between the modified pixel and the sample is smaller than the original difference.
Type: Application
Filed: December 6, 2018
Publication date: June 11, 2020
Applicant: Tobii AB
Inventors: Daan Pieter Nijs, Fredrik Lindh
-
Patent number: 10635386
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Grant
Filed: August 21, 2017
Date of Patent: April 28, 2020
Assignee: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
-
Patent number: 10607401
Abstract: A method for determining a focus target of a user's gaze in a three-dimensional ("3D") scene is disclosed. The method may include determining a first gaze direction of a user into a 3D scene, where the 3D scene includes a plurality of components. The method may also include executing a first plurality of line traces in the 3D scene, where each of the first plurality of line traces is in proximity to the first gaze direction. The method may further include determining a confidence value for each component intersected by at least one of the first plurality of line traces. The method may additionally include identifying as a focus target of the user the component having the highest confidence value of all components intersected by at least one of the first plurality of line traces.
Type: Grant
Filed: March 30, 2018
Date of Patent: March 31, 2020
Assignee: Tobii AB
Inventor: Fredrik Lindh
-
Patent number: 10579142
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Grant
Filed: January 24, 2019
Date of Patent: March 3, 2020
Assignee: Tobii AB
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennström, Andreas Edling
-
Patent number: 10430150
Abstract: According to the invention, a method for changing the behavior of computer program elements is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user. The method may also include causing, with a computer system, an interactive event controlled by the computer system to alter its behavior based at least in part on the gaze point of the user.
Type: Grant
Filed: August 25, 2014
Date of Patent: October 1, 2019
Assignee: Tobii AB
Inventors: Fredrik Lindh, Anders Vennström, Anders Olsson
-
Patent number: 10380419
Abstract: A method for panning content on a display of a wearable device is disclosed. The method may include determining, via an eye tracking device, a gaze direction of a user. The method may also include determining, via a movement detection system, a head direction of the user. The method may further include, based at least in part on the gaze direction and the head direction both being consistent with a particular direction, causing content displayed on a display of the wearable device to be panned in the particular direction. The method may additionally include determining during panning of the content, via the eye tracking device, that the gaze direction of the user has returned to a neutral position. The method may moreover include, based at least in part on the gaze direction of the user returning to the neutral position, causing content displayed on the display to stop panning.
Type: Grant
Filed: October 16, 2018
Date of Patent: August 13, 2019
Assignee: Tobii AB
Inventors: Simon Gustafsson, Henrik Björk, Fredrik Lindh, Anders Olsson
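The start/stop logic in this abstract amounts to a small state machine: panning begins only when gaze and head agree on a non-neutral direction, and stops when the gaze returns to neutral. The string-valued direction tokens below are an assumed simplification of what would be continuous angle measurements in practice.

```python
def update_panning(panning, pan_dir, gaze_dir, head_dir):
    """Advance the panning state machine by one input sample.

    panning:  whether content is currently being panned
    pan_dir:  current panning direction, or None
    gaze_dir, head_dir: 'left' / 'right' / 'neutral' (simplified tokens)
    Returns the new (panning, pan_dir) pair.
    """
    if not panning:
        if gaze_dir == head_dir and gaze_dir != "neutral":
            return True, gaze_dir      # gaze and head consistent: start panning
        return False, None
    if gaze_dir == "neutral":
        return False, None             # gaze back to center: stop panning
    return True, pan_dir               # otherwise keep panning as before
```

Requiring both signals to agree before panning starts avoids spurious scrolling from a glance alone, while stopping on gaze-neutral lets the user end the pan without a head movement.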
-
Patent number: 10346128
Abstract: According to the invention, a method for providing audio to a user is disclosed. The method may include determining, with an eye tracking device, a gaze point of a user on a display. The method may also include causing, with a computer system, an audio device to produce audio to the user, where content of the audio may be based at least in part on the gaze point of the user on the display.
Type: Grant
Filed: July 10, 2018
Date of Patent: July 9, 2019
Assignee: Tobii AB
Inventors: Anders Vennström, Fredrik Lindh
-
Publication number: 20190155383
Abstract: A method for determining if a user's gaze is directed in the direction of a zone of interest in a 3D scene comprises: providing a 3D scene containing a zone of interest; associating a property with the zone of interest; creating a bitmap representing the location of the zone of interest in a projected view of the 3D scene, each pixel of the bitmap to which the zone of interest is projected storing the property of the zone of interest; detecting the direction of the user's gaze; using the bitmap to determine if the detected user's gaze is directed in the direction of the zone of interest.
Type: Application
Filed: January 24, 2019
Publication date: May 23, 2019
Inventors: Fredrik Lindh, Mattias Gustavsson, Anders Vennström, Andreas Edling