Patents by Inventor Javier San Agustin

Javier San Agustin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9851791
    Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
    Type: Grant
    Filed: November 13, 2015
    Date of Patent: December 26, 2017
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard
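The recalibration idea in the abstract above — comparing detected gaze locations against the known locations of actions the user performed, and correcting by the difference — can be sketched in a few lines. This is an illustrative reading of the abstract, not the patented method; all function names and the simple mean-offset model are invented for the example.

```python
# Hypothetical sketch: detected gaze points are paired with the expected
# locations of the on-screen targets the user actually activated; the mean
# offset between them becomes a correction applied to future gaze estimates.

def compute_offset(detected, expected):
    """Mean (dx, dy) between detected gaze points and expected target points."""
    n = len(detected)
    dx = sum(e[0] - d[0] for d, e in zip(detected, expected)) / n
    dy = sum(e[1] - d[1] for d, e in zip(detected, expected)) / n
    return dx, dy

def recalibrate(gaze, offset):
    """Apply the correction to a raw gaze estimate."""
    return gaze[0] + offset[0], gaze[1] + offset[1]

# Example: gaze consistently lands 10 px left of and 5 px above the targets.
detected = [(90, 95), (190, 195), (290, 295)]
expected = [(100, 100), (200, 200), (300, 300)]
offset = compute_offset(detected, expected)   # (10.0, 5.0)
corrected = recalibrate((150, 150), offset)   # (160.0, 155.0)
```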
  • Patent number: 9836639
    Abstract: An image of a user's eyes and face may be analyzed using computer-vision algorithms. A computing device may use the image to determine the location of the user's eyes and estimate the direction in which the user is looking. The eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. When a user is near a light source, an automatic exposure feature in the camera may result in the user's face and eyes appearing too dark in the image, possibly reducing the likelihood of face and eye detection. Adjusting attributes such as the camera exposure time and the intensity and illumination interval of the light sources based on motion and light levels may improve detection of a user's features.
    Type: Grant
    Filed: January 9, 2015
    Date of Patent: December 5, 2017
    Assignee: Facebook, Inc.
    Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
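The abstract above describes raising camera exposure time and light-source intensity when the user's face appears too dark. A minimal control-loop sketch of that idea follows; the brightness thresholds, multiplicative step, and clamps are all invented for illustration and are not taken from the patent.

```python
# Illustrative sketch: if the mean brightness of the face region is below a
# target band (e.g. because a bright background drove auto-exposure down),
# scale exposure time and LED intensity up; if too bright, scale them down.

def adjust_exposure(mean_face_brightness, exposure_ms, led_intensity,
                    target=(80, 170), step=1.25, max_exposure=33.0):
    """Return new (exposure_ms, led_intensity) given a 0-255 face brightness."""
    low, high = target
    if mean_face_brightness < low:          # face too dark: brighten
        exposure_ms = min(exposure_ms * step, max_exposure)
        led_intensity = min(led_intensity * step, 1.0)
    elif mean_face_brightness > high:       # face too bright: darken
        exposure_ms = exposure_ms / step
        led_intensity = led_intensity / step
    return exposure_ms, led_intensity

exp, led = adjust_exposure(40, exposure_ms=8.0, led_intensity=0.5)
# face too dark -> exposure rises to 10.0 ms, LED intensity to 0.625
```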
  • Patent number: 9829971
    Abstract: Methods and systems to facilitate eye tracking control are provided. A user input is received at a computing device. Point of regard information associated with a user of the computing device is determined while the user input is being received. The point of regard information indicates a location on a display of the computing device at which the user is looking. An operation associated with a display object identified based on the point of regard information is performed when receipt of the user input is determined to have terminated.
    Type: Grant
    Filed: January 20, 2014
    Date of Patent: November 28, 2017
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, Sebastian Sztuk, Martin Henrik Tall
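The interaction in the abstract above — act on the object under the user's gaze once the user input ends — reduces to a hit test at input release. The sketch below is one plausible reading, with an invented object layout and function names.

```python
# Minimal sketch: gaze samples are collected while an input (e.g. a held key)
# is active; when the input terminates, the display object whose bounds
# contain the last point of regard is selected.

OBJECTS = {
    "ok_button":     (0, 0, 100, 50),      # (x, y, width, height)
    "cancel_button": (120, 0, 100, 50),
}

def hit_test(point, objects=OBJECTS):
    """Return the name of the object containing the gaze point, if any."""
    px, py = point
    for name, (x, y, w, h) in objects.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None

def on_input_released(gaze_samples):
    """Use the final point of regard to pick the object to operate on."""
    return hit_test(gaze_samples[-1])

# The user holds a key, looks at the cancel button, then releases the key.
target = on_input_released([(30, 20), (150, 25)])  # "cancel_button"
```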
  • Patent number: 9798382
    Abstract: Methods and systems to facilitate eye tracking data analysis are provided. Point of regard information from a first client device of a first user is received, where the point of regard information is determined by the first client device by detecting one or more eye features associated with an eye of the first user. The point of regard information is stored. A request to access the point of regard information is received, and the point of regard information is sent in response to the request, where the point of regard information is used in a subsequent operation.
    Type: Grant
    Filed: April 16, 2014
    Date of Patent: October 24, 2017
    Assignee: Facebook, Inc.
    Inventors: Sune Alstrup Johansen, Henrik Hegner Tomra Skovsgaard, Martin Henrik Tall, Javier San Agustin Lopez, Sebastian Sztuk
  • Patent number: 9791927
    Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
    Type: Grant
    Filed: April 10, 2017
    Date of Patent: October 17, 2017
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
  • Patent number: 9785233
    Abstract: An image of a user's eyes and/or face, captured by a camera on the computing device or on a device coupled to the computing device, may be analyzed using computer-vision algorithms, such as eye tracking and gaze detection algorithms, to determine the location of the user's eyes and estimate the gaze information associated with the user. A user calibration process may be conducted to calculate calibration parameters associated with the user. These calibration parameters may be taken into account to accurately determine the location of the user's eyes and estimate the location on the display at which the user is looking. The calibration process may include determining a plane on which the user's eyes converge and relating that plane to a plane of a screen on which calibration targets are displayed.
    Type: Grant
    Filed: April 10, 2015
    Date of Patent: October 10, 2017
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, Benjamin Antoine Georges Lefaudeux
  • Publication number: 20170272768
    Abstract: Systems and methods for reducing, with minimal loss, optical sensor data to be conveyed to another system for processing. An eye tracking device, such as a head-mounted display (HMD), includes a sensor and circuitry. The sensor generates image data of an eye. The circuitry receives the image data and assigns pixels of the image data to a feature region of the eye by comparing pixel values of the pixels to a threshold value. A feature region refers to an eye region of interest for eye tracking, such as a pupil or glint. The circuitry generates encoded image data by applying an encoding algorithm, such as a run-length encoding algorithm or contour encoding algorithm, to the image data for the pixels of the feature region. The circuitry transmits the encoded image data, which has a smaller data size than the image data received from the sensor, for gaze-contingent content rendering.
    Type: Application
    Filed: March 16, 2017
    Publication date: September 21, 2017
    Inventors: Martin Henrik Tall, Sune Loje, Javier San Agustin Lopez
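The compression scheme in the abstract above — threshold pixels into a feature region, then run-length encode the result — is concrete enough to sketch. The threshold value and sample row below are invented for the example; this is the general run-length-encoding technique the abstract names, not the patent's exact pipeline.

```python
# Hedged sketch: classify each pixel as feature (e.g. dark pupil) or not by
# comparing to a threshold, then encode each row as (value, run_length) pairs
# so far fewer values need to be transmitted off-device.

def threshold_row(row, thresh):
    """1 where the pixel belongs to the dark feature region, else 0."""
    return [1 if p < thresh else 0 for p in row]

def run_length_encode(bits):
    """Encode a bit row as (value, count) runs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1
        else:
            runs.append([b, 1])
    return [tuple(r) for r in runs]

row = [200, 210, 40, 35, 30, 220, 230, 225]   # one image row, 0-255
encoded = run_length_encode(threshold_row(row, thresh=100))
# [(0, 2), (1, 3), (0, 3)] -- three runs instead of eight raw pixel values
```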
  • Publication number: 20170262051
    Abstract: The invention is a method for combining eye tracking and voice-recognition control technologies to increase the speed and/or accuracy of locating and selecting objects displayed on a display screen for subsequent control and operations.
    Type: Application
    Filed: March 10, 2016
    Publication date: September 14, 2017
    Applicant: The Eye Tribe
    Inventors: Martin Henrik Tall, Jonas Priesum, Javier San Agustin
  • Publication number: 20170212586
    Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
    Type: Application
    Filed: April 10, 2017
    Publication date: July 27, 2017
    Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
  • Patent number: 9693684
    Abstract: Methods and systems to facilitate eye tracking control calibration are provided. One or more objects are displayed on a display of a device, where the one or more objects are associated with a function unrelated to a calculation of one or more calibration parameters. The one or more calibration parameters relate to a calibration of a calculation of gaze information of a user of the device, where the gaze information indicates where the user is looking. While the one or more objects are displayed, eye movement information associated with the user is determined, which indicates eye movement of one or more eye features associated with at least one eye of the user. The eye movement information is associated with a first object location of the one or more objects. The one or more calibration parameters are calculated based on the first object location being associated with the eye movement information.
    Type: Grant
    Filed: February 14, 2014
    Date of Patent: July 4, 2017
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, John Paulin Hansen, Sebastian Sztuk, Martin Henrik Tall
  • Publication number: 20170177081
    Abstract: Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user are detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
    Type: Application
    Filed: March 8, 2017
    Publication date: June 22, 2017
    Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
  • Patent number: 9612656
    Abstract: Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user are detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
    Type: Grant
    Filed: November 25, 2013
    Date of Patent: April 4, 2017
    Assignee: Facebook, Inc.
    Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
  • Publication number: 20170083695
    Abstract: The invention is a method for authenticating a system user based on eye tracking or eye parameters.
    Type: Application
    Filed: October 6, 2016
    Publication date: March 23, 2017
    Applicant: The Eye Tribe
    Inventors: Javier San Agustin, Jonas Philip Priesum
  • Publication number: 20160274762
    Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
    Type: Application
    Filed: March 16, 2016
    Publication date: September 22, 2016
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
  • Publication number: 20160248971
    Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
    Type: Application
    Filed: February 23, 2016
    Publication date: August 25, 2016
    Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
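The abstract above describes shortening the active-illumination interval when less light is needed, to extend battery life. One way to read that is as a feedback loop on tracking quality; the sketch below uses invented quality thresholds, step sizes, and bounds, so it illustrates the idea rather than the patented algorithm.

```python
# Illustrative sketch: shrink the LED on-time per frame while eye detection
# quality stays comfortably high (saving power, e.g. in a dark environment),
# and lengthen it again when quality drops.

def adjust_led_duration(detection_quality, on_time_ms,
                        min_quality=0.8, min_on=0.5, max_on=4.0, step=0.25):
    """Return a new LED on-time (ms) given a 0-1 detection-quality score."""
    if detection_quality > min_quality:
        return max(on_time_ms - step, min_on)   # quality high: save power
    return min(on_time_ms + step, max_on)       # quality low: restore margin

on = 2.0
on = adjust_led_duration(0.95, on)  # quality high -> on-time drops to 1.75 ms
on = adjust_led_duration(0.60, on)  # quality dropped -> back up to 2.0 ms
```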
  • Publication number: 20160231812
    Abstract: A mobile gaze-tracking system is provided. The user operates the system by looking at the gaze tracking unit and at pre-defined regions at the fringe of the tracking unit. The gaze tracking unit may be placed on a smartwatch, a wristband, or woven into a sleeve of a garment. The unit provides feedback to the user in response to the received command input, as well as feedback on how to position the mobile unit in front of the user's eyes. The gaze tracking unit interacts with one or more controlled devices via wireless or wired communications. Example devices include a lock, a thermostat, a light, or a TV. The connection between the gaze tracking unit and a controlled device may be temporary or longer-lasting. The gaze tracking unit may detect features of the eye that provide information about the identity of the user.
    Type: Application
    Filed: February 8, 2016
    Publication date: August 11, 2016
    Inventors: John Paulin Hansen, Sebastian Sztuk, Javier San Agustin Lopez
  • Publication number: 20160195927
    Abstract: A user of a computing device may interact with the computing device through the user's eye movement. An image of the user's eyes or face, captured by a camera on the computing device, may be analyzed using computer-vision algorithms, such as eye tracking and gaze detection algorithms. During use of the computing device, one or more lights illuminating the user, or cameras viewing the user, may become blocked. The device may be equipped with more lights or cameras than are necessary to perform gaze detection by the device. In an over-equipped device, the additional lights or cameras can remain dormant until a blockage is detected. In response to a camera or light becoming blocked, a dormant light or camera can be activated.
    Type: Application
    Filed: January 7, 2016
    Publication date: July 7, 2016
    Inventor: Javier San Agustin Lopez
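The redundancy scheme in the abstract above — keep spare lights or cameras dormant and activate one when an active unit is blocked — can be sketched as a simple swap. Device names and the list-based bookkeeping are hypothetical.

```python
# Illustrative sketch: the device has more lights than gaze detection needs;
# when an active light is reported blocked, the first dormant spare is
# activated in its place.

def handle_blocked(active, dormant, blocked_id):
    """Swap a blocked light for a spare; returns updated (active, dormant)."""
    if blocked_id in active and dormant:
        active = [l for l in active if l != blocked_id]
        spare = dormant[0]
        return active + [spare], dormant[1:]
    return active, dormant          # nothing to do, or no spare left

active, dormant = ["led_a", "led_b"], ["led_c"]
active, dormant = handle_blocked(active, dormant, "led_b")
# active -> ["led_a", "led_c"], dormant -> []
```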
  • Publication number: 20160139665
    Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
    Type: Application
    Filed: November 13, 2015
    Publication date: May 19, 2016
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard
  • Publication number: 20160085301
    Abstract: Gaze information of a user can be determined by a computing device that analyzes images of the user. Gaze information includes the user's line of sight, point of regard, the direction of the user's gaze, the depth of convergence of the user's gaze, and the like. The computing device is able to estimate the distance at which the user is focusing (for example, at a screen near the user or at an object farther away). The visibility and display characteristics of objects displayed on a heads-up display (HUD) may be based on the gaze information. For example, content on a HUD on a windshield may be more transparent while the user is looking through the windshield and more opaque (or otherwise enhanced) while the user is focusing on the HUD.
    Type: Application
    Filed: September 21, 2015
    Publication date: March 24, 2016
    Inventor: Javier San Agustin Lopez
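The windshield-HUD behaviour in the abstract above — opaque when the user focuses at the HUD's depth, transparent when looking through it — can be sketched as a function of the gaze convergence distance. The HUD distance, tolerance, and alpha values below are invented for illustration.

```python
# Illustrative sketch: if the user's vergence distance is near the plane of
# the HUD, render HUD content fully opaque; if the user is focused on the
# road beyond it, fade the HUD so it does not obstruct the view.

def hud_opacity(convergence_m, hud_distance_m=2.0, tolerance_m=0.5):
    """Return HUD alpha based on the gaze convergence depth in metres."""
    error = abs(convergence_m - hud_distance_m)
    if error <= tolerance_m:
        return 1.0      # focused on the HUD: fully opaque
    return 0.2          # looking through it: mostly transparent

alpha_near = hud_opacity(2.1)    # 1.0 -- user reading the HUD
alpha_far = hud_opacity(30.0)    # 0.2 -- user watching the road
```

A real system would likely ramp the alpha smoothly rather than step it, but the step function keeps the depth-gating idea visible.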
  • Publication number: 20160011658
    Abstract: An image of a user's eyes and/or face, captured by a camera on the computing device or on a device coupled to the computing device, may be analyzed using computer-vision algorithms, such as eye tracking and gaze detection algorithms, to determine the location of the user's eyes and estimate the gaze information associated with the user. A user calibration process may be conducted to calculate calibration parameters associated with the user. These calibration parameters may be taken into account to accurately determine the location of the user's eyes and estimate the location on the display at which the user is looking. The calibration process may include determining a plane on which the user's eyes converge and relating that plane to a plane of a screen on which calibration targets are displayed.
    Type: Application
    Filed: April 10, 2015
    Publication date: January 14, 2016
    Inventors: Javier San Agustin Lopez, Benjamin Antoine Georges Lefaudeux