Patents by Inventor Martin Henrik Tall
Martin Henrik Tall has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20180046248
Abstract: Methods and systems to facilitate eye tracking control are provided. A user input is received at a computing device. Point of regard information associated with a user of the computing device is determined while the user input is being received. The point of regard information indicates a location on a display of the computing device at which the user is looking. An operation associated with a display object identified based on the point of regard information is performed when receipt of the user input is determined to have terminated.
Type: Application
Filed: October 24, 2017
Publication date: February 15, 2018
Inventors: Javier San Agustin Lopez, Sebastian Sztuk, Martin Henrik Tall
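The interaction this abstract describes can be sketched as a few lines of code: gaze samples are collected while an input (say, a touch) is held, and on release the display object under the final point of regard is activated. The object names and layout below are invented for illustration, not taken from the patent.

```python
def object_at(point, objects):
    """Return the first object whose bounding box contains the point."""
    x, y = point
    for name, (left, top, right, bottom) in objects.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def activate_on_release(gaze_samples, objects):
    """Pick the object to act on once the input terminates, using the
    last point of regard recorded while the input was being received."""
    if not gaze_samples:
        return None
    return object_at(gaze_samples[-1], objects)

ui = {"send": (0, 0, 100, 40), "delete": (120, 0, 220, 40)}
print(activate_on_release([(30, 10), (60, 20)], ui))  # send
```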
-
Publication number: 20180033405
Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by the fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by the parafovea of the eye, and a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set yields intermediate quality and the third set yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
Type: Application
Filed: July 28, 2017
Publication date: February 1, 2018
Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
-
Patent number: 9851791
Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
Type: Grant
Filed: November 13, 2015
Date of Patent: December 26, 2017
Assignee: Facebook, Inc.
Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard
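The recalibration step can be illustrated in its simplest form: average the offsets between the gaze locations the tracker detected and the locations of the targets the user actually activated, then apply that average as a correction to later estimates. This is a hedged sketch of drift correction in general, not the patent's specific method.

```python
def estimate_offset(detected, expected):
    """Average (expected - detected) displacement over a set of actions."""
    n = len(detected)
    dx = sum(e[0] - d[0] for d, e in zip(detected, expected)) / n
    dy = sum(e[1] - d[1] for d, e in zip(detected, expected)) / n
    return (dx, dy)

def recalibrate(point, offset):
    """Apply the estimated correction to a raw gaze estimate."""
    return (point[0] + offset[0], point[1] + offset[1])

detected = [(98, 203), (305, 98)]   # where the tracker thought the user looked
expected = [(100, 200), (307, 95)]  # where the activated targets actually were
off = estimate_offset(detected, expected)
print(recalibrate((50, 50), off))  # (52.0, 47.0)
```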
-
Patent number: 9836639
Abstract: An image of a user's eyes and face may be analyzed using computer-vision algorithms. A computing device may use the image to determine the location of the user's eyes and estimate the direction in which the user is looking. The eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. When a user is near a light source, an automatic exposure feature in the camera may result in the user's face and eyes appearing too dark in the image, possibly reducing the likelihood of face and eye detection. Adjusting attributes such as the camera exposure time and the intensity and illumination interval of the light sources based on motion and light levels may improve detection of a user's features.
Type: Grant
Filed: January 9, 2015
Date of Patent: December 5, 2017
Assignee: Facebook, Inc.
Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
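One way to picture the exposure adjustment described here is a simple feedback loop: when the face region of the frame meters too dark (for example because auto-exposure locked onto a bright background), lengthen the exposure time; when it meters too bright, shorten it. All thresholds and step sizes below are invented for illustration.

```python
def adjust_exposure(exposure_ms, face_brightness,
                    target=120, tolerance=20, step=1.25,
                    min_ms=1.0, max_ms=33.0):
    """Nudge camera exposure toward a target mean brightness of the face
    region (0-255 scale), clamped to the camera's supported range."""
    if face_brightness < target - tolerance:
        exposure_ms *= step      # face too dark: expose longer
    elif face_brightness > target + tolerance:
        exposure_ms /= step      # face too bright: expose less
    return min(max(exposure_ms, min_ms), max_ms)

print(adjust_exposure(10.0, 60))   # 12.5 -> brighten a too-dark face
print(adjust_exposure(10.0, 125))  # 10.0 -> within tolerance, unchanged
```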
-
Patent number: 9829971
Abstract: Methods and systems to facilitate eye tracking control are provided. A user input is received at a computing device. Point of regard information associated with a user of the computing device is determined while the user input is being received. The point of regard information indicates a location on a display of the computing device at which the user is looking. An operation associated with a display object identified based on the point of regard information is performed when receipt of the user input is determined to have terminated.
Type: Grant
Filed: January 20, 2014
Date of Patent: November 28, 2017
Assignee: Facebook, Inc.
Inventors: Javier San Agustin Lopez, Sebastian Sztuk, Martin Henrik Tall
-
Patent number: 9798382
Abstract: Methods and systems to facilitate eye tracking data analysis are provided. Point of regard information from a first client device of a first user is received, where the point of regard information is determined by the first client device by detecting one or more eye features associated with an eye of the first user. The point of regard information is stored. A request to access the point of regard information is received, and the point of regard information is sent in response to the request, where the point of regard information is used in a subsequent operation.
Type: Grant
Filed: April 16, 2014
Date of Patent: October 24, 2017
Assignee: Facebook, Inc.
Inventors: Sune Alstrup Johansen, Henrik Hegner Tomra Skovsgaard, Martin Henrik Tall, Javier San Agustin Lopez, Sebastian Sztuk
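The receive/store/request/send flow above reduces to a small store-and-serve service. This in-memory class is only a stand-in for whatever storage the actual system uses; the class and method names are invented.

```python
class GazeDataStore:
    """Accumulate point-of-regard data per user and serve it on request."""

    def __init__(self):
        self._by_user = {}

    def receive(self, user_id, point_of_regard):
        """Store a (x, y) point of regard reported by a client device."""
        self._by_user.setdefault(user_id, []).append(point_of_regard)

    def request(self, user_id):
        """Return all stored points for a user, for subsequent analysis."""
        return list(self._by_user.get(user_id, []))

store = GazeDataStore()
store.receive("user-1", (120, 240))
print(store.request("user-1"))  # [(120, 240)]
```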
-
Patent number: 9791927
Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
Type: Grant
Filed: April 10, 2017
Date of Patent: October 17, 2017
Assignee: Facebook, Inc.
Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
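One common shape such calibration parameters take is a per-axis linear mapping (gain and offset) from raw eye-feature coordinates to screen coordinates, fitted from the on-screen locations of objects the user looked at in the course of normal use. This least-squares sketch is illustrative only; the actual parameterization in the patent may differ.

```python
def fit_axis(eye_vals, screen_vals):
    """Least-squares fit of screen = gain * eye + offset for one axis."""
    n = len(eye_vals)
    mean_e = sum(eye_vals) / n
    mean_s = sum(screen_vals) / n
    var = sum((e - mean_e) ** 2 for e in eye_vals)
    cov = sum((e - mean_e) * (s - mean_s)
              for e, s in zip(eye_vals, screen_vals))
    gain = cov / var
    return gain, mean_s - gain * mean_e

# Normalized horizontal eye-feature positions vs. the object x-coordinates
# the user was looking at (invented sample data).
eye_x = [0.2, 0.5, 0.8]
scr_x = [200.0, 500.0, 800.0]
gain, offset = fit_axis(eye_x, scr_x)
print(round(gain * 0.35 + offset))  # 350
```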
-
Publication number: 20170272768
Abstract: Systems and methods for reducing, with minimal loss, optical sensor data to be conveyed to another system for processing. An eye tracking device, such as a head-mounted display (HMD), includes a sensor and circuitry. The sensor generates image data of an eye. The circuitry receives the image data and assigns pixels of the image data to a feature region of the eye by comparing pixel values of the pixels to a threshold value. A feature region refers to an eye region of interest for eye tracking, such as a pupil or glint. The circuitry generates encoded image data by applying an encoding algorithm, such as a run-length encoding algorithm or contour encoding algorithm, to the image data for the pixels of the feature region. The circuitry transmits the encoded image data, having a smaller data size than the image data received from the sensor, for gaze contingent content rendering.
Type: Application
Filed: March 16, 2017
Publication date: September 21, 2017
Inventors: Martin Henrik Tall, Sune Loje, Javier San Agustin Lopez
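The threshold-then-run-length-encode pipeline described above can be sketched directly: pixels darker than a threshold are assigned to the pupil region, and each image row is reduced to (start, length) runs before transmission. The threshold value and sample row are invented for the example.

```python
def threshold_row(row, threshold=40):
    """Mark pixels belonging to the dark pupil feature region."""
    return [value < threshold for value in row]

def run_length_encode(mask):
    """Compress a boolean row into (start_index, run_length) pairs."""
    runs, start = [], None
    for i, inside in enumerate(mask):
        if inside and start is None:
            start = i
        elif not inside and start is not None:
            runs.append((start, i - start))
            start = None
    if start is not None:
        runs.append((start, len(mask) - start))
    return runs

row = [200, 35, 30, 28, 180, 25, 190]  # one row of 8-bit pixel values
print(run_length_encode(threshold_row(row)))  # [(1, 3), (5, 1)]
```

Two integers per run replace a full row of pixel values, which is where the data-size reduction comes from.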
-
Publication number: 20170262051
Abstract: The invention is a method for combining eye tracking and voice-recognition control technologies to increase the speed and/or accuracy of locating and selecting objects displayed on a display screen for subsequent control and operations.
Type: Application
Filed: March 10, 2016
Publication date: September 14, 2017
Applicant: The Eye Tribe
Inventors: Martin Henrik Tall, Jonas Priesum, Javier San Agustin
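A plausible way to combine the two modalities, sketched under invented names and layouts: the spoken word narrows the candidates to objects with a matching label, and gaze distance breaks ties among them.

```python
import math

def select(gaze, spoken_word, objects):
    """Pick the labeled object matching the spoken word that lies
    closest to the current gaze point."""
    candidates = [(name, pos) for name, pos in objects.items()
                  if spoken_word.lower() in name.lower()]
    if not candidates:
        return None
    return min(candidates,
               key=lambda item: math.hypot(item[1][0] - gaze[0],
                                           item[1][1] - gaze[1]))[0]

ui = {"Save": (100, 50), "Save As": (100, 90), "Close": (300, 50)}
print(select((105, 52), "save", ui))  # Save
```

Either modality alone is ambiguous here ("save" matches two buttons; gaze alone is imprecise), which is the speed/accuracy gain the abstract refers to.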
-
Publication number: 20170212586
Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
Type: Application
Filed: April 10, 2017
Publication date: July 27, 2017
Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
-
Patent number: 9693684
Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
Type: Grant
Filed: February 14, 2014
Date of Patent: July 4, 2017
Assignee: Facebook, Inc.
Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
-
Publication number: 20170177081
Abstract: Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user are detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
Type: Application
Filed: March 8, 2017
Publication date: June 22, 2017
Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
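A classical way to turn such reflections into a point of regard is the pupil-glint vector: the displacement between the pupil center and the corneal reflection of the light source, mapped to screen coordinates through per-user calibrated coefficients. The linear mapping and all coefficient values below are invented for illustration; this is not necessarily the mapping used in the patent.

```python
def point_of_regard(pupil, glint, coeffs):
    """Map a pupil-glint displacement vector to screen coordinates
    using per-axis (gain, offset) coefficients from calibration."""
    vx, vy = pupil[0] - glint[0], pupil[1] - glint[1]
    ax, bx, ay, by = coeffs
    return (ax * vx + bx, ay * vy + by)

# Pupil center and glint location in camera-image pixels (invented),
# with coefficients that happen to land on the center of a 1920x1080 screen.
print(point_of_regard((310, 225), (300, 220), (40, 560, 60, 240)))  # (960, 540)
```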
-
Patent number: 9612656
Abstract: Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user are detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
Type: Grant
Filed: November 25, 2013
Date of Patent: April 4, 2017
Assignee: Facebook, Inc.
Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
-
Publication number: 20160274762
Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall, and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
Type: Application
Filed: March 16, 2016
Publication date: September 22, 2016
Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
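The selection step can be sketched as a nearest-device lookup: image recognition yields device names with image locations, the gaze point (projected into the world-camera image) picks the closest one, and an invented database record maps that device to its UI.

```python
import math

# Invented database pairing recognized devices with their control UIs.
UI_FOR_DEVICE = {"thermostat": "ThermostatUI", "lamp": "LampUI"}

def ui_to_present(gaze_in_image, recognized):
    """recognized: list of (device_name, (x, y)) pairs from image
    recognition on the world-facing camera. Returns the UI for the
    device nearest the user's projected gaze point."""
    if not recognized:
        return None
    name, _ = min(recognized,
                  key=lambda d: math.hypot(d[1][0] - gaze_in_image[0],
                                           d[1][1] - gaze_in_image[1]))
    return UI_FOR_DEVICE.get(name)

seen = [("thermostat", (320, 240)), ("lamp", (600, 100))]
print(ui_to_present((310, 250), seen))  # ThermostatUI
```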
-
Publication number: 20160248971
Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
Type: Application
Filed: February 23, 2016
Publication date: August 25, 2016
Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
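One possible shape for such an algorithm, with all step sizes and limits invented: shorten the LED on-time while the tracked eye image remains bright enough, and lengthen it again when the image gets too dark to detect features reliably.

```python
def adjust_led_duration(duration_ms, eye_brightness,
                        minimum=80, step=0.1,
                        floor_ms=0.5, ceil_ms=4.0):
    """Step the per-frame LED illumination interval toward the shortest
    duration that still keeps the eye region above a brightness minimum."""
    if eye_brightness > minimum:
        duration_ms -= step      # bright enough: shorten pulse, save power
    else:
        duration_ms += step      # too dark: illuminate longer
    return round(min(max(duration_ms, floor_ms), ceil_ms), 3)

print(adjust_led_duration(2.0, 120))  # 1.9 -> trimming excess illumination
print(adjust_led_duration(0.5, 40))   # 0.6 -> backing off from too dark
```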
-
Publication number: 20160139665
Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
Type: Application
Filed: November 13, 2015
Publication date: May 19, 2016
Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard
-
Publication number: 20150199559
Abstract: An image of a user's eyes and face may be analyzed using computer-vision algorithms. A computing device may use the image to determine the location of the user's eyes and estimate the direction in which the user is looking. The eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. When a user is near a light source, an automatic exposure feature in the camera may result in the user's face and eyes appearing too dark in the image, possibly reducing the likelihood of face and eye detection. Adjusting attributes such as the camera exposure time and the intensity and illumination interval of the light sources based on motion and light levels may improve detection of a user's features.
Type: Application
Filed: January 9, 2015
Publication date: July 16, 2015
Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
-
Publication number: 20140306882
Abstract: Methods and systems to facilitate eye tracking data analysis are provided. Point of regard information from a first client device of a first user is received, where the point of regard information is determined by the first client device by detecting one or more eye features associated with an eye of the first user. The point of regard information is stored. A request to access the point of regard information is received, and the point of regard information is sent in response to the request, where the point of regard information is used in a subsequent operation.
Type: Application
Filed: April 16, 2014
Publication date: October 16, 2014
Applicant: The Eye Tribe Aps
Inventors: Sune Alstrup Johansen, Henrik Hegner Tomra Skovsgaard, Martin Henrik Tall, Javier San Agustin Lopez, Sebastian Sztuk
-
Publication number: 20140226131
Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
Type: Application
Filed: February 14, 2014
Publication date: August 14, 2014
Applicant: The Eye Tribe Aps
Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
-
Publication number: 20140204029
Abstract: Methods and systems to facilitate eye tracking control are provided. A user input is received at a computing device. Point of regard information associated with a user of the computing device is determined while the user input is being received. The point of regard information indicates a location on a display of the computing device at which the user is looking. An operation associated with a display object identified based on the point of regard information is performed when receipt of the user input is determined to have terminated.
Type: Application
Filed: January 20, 2014
Publication date: July 24, 2014
Applicant: The Eye Tribe Aps
Inventors: Javier San Agustin Lopez, Sebastian Sztuk, Martin Henrik Tall