Patents by Inventor Martin Henrik Tall

Martin Henrik Tall has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11321838
    Abstract: In one embodiment, a method for eye-tracking comprises capturing images of a user using one or more cameras, the captured images of the user depicting at least an eye of the user, storing the captured images of the user in a storage device, reading, from the storage device, a down-sampled version of the captured images of the user, detecting one or more first segments in the down-sampled version of the captured images by processing the down-sampled version of the captured images using a machine-learning model, the one or more first segments comprising features of the eye of the user, reading, from the storage device, one or more second segments in the captured images corresponding to the one or more first segments in the down-sampled version of the captured images, and computing a gaze of the user based on the one or more second segments in the captured images.
    Type: Grant
    Filed: August 31, 2020
    Date of Patent: May 3, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Jeffrey Hung Wong, Martin Henrik Tall, Jixu Chen, Kapil Krishnakumar
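
A minimal sketch of the two-stage pipeline summarized in the abstract of patent 11321838 above: detect eye-feature segments on a down-sampled frame, then read only the corresponding regions of the stored full-resolution frame before estimating gaze. The down-sampling factor, the detect_eye_segments stub, and the trivial compute_gaze placeholder are illustrative assumptions, not the patented implementation.

```python
import numpy as np

DOWNSAMPLE = 4  # hypothetical down-sampling factor


def downsample(frame: np.ndarray, factor: int) -> np.ndarray:
    """Cheap strided down-sampling; stands in for reading a scaler-produced copy."""
    return frame[::factor, ::factor]


def detect_eye_segments(small_frame: np.ndarray) -> list:
    """Placeholder for the machine-learning model: returns (y, x, h, w) boxes of
    eye features (e.g., pupil and glints) in down-sampled coordinates."""
    h, w = small_frame.shape[:2]
    return [(h // 3, w // 3, h // 3, w // 3)]  # dummy box


def compute_gaze(crops: list) -> tuple:
    """Placeholder gaze estimate: average location of the darkest pixel per crop."""
    ys, xs = zip(*[np.unravel_index(np.argmin(c), c.shape) for c in crops])
    return float(np.mean(xs)), float(np.mean(ys))


def track(frame: np.ndarray) -> tuple:
    small = downsample(frame, DOWNSAMPLE)
    crops = []
    for y, x, h, w in detect_eye_segments(small):
        # Scale each detected box back to full-resolution coordinates and read
        # only that region of the stored full-resolution frame.
        Y, X, H, W = (v * DOWNSAMPLE for v in (y, x, h, w))
        crops.append(frame[Y:Y + H, X:X + W])
    return compute_gaze(crops)


if __name__ == "__main__":
    frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)  # grayscale eye image
    print(track(frame))
```
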
  • Publication number: 20220067924
    Abstract: In one embodiment, a method for eye-tracking comprises capturing images of a user using one or more cameras, the captured images of the user depicting at least an eye of the user, storing the captured images of the user in a storage device, reading, from the storage device, a down-sampled version of the captured images of the user, detecting one or more first segments in the down-sampled version of the captured images by processing the down-sampled version of the captured images using a machine-learning model, the one or more first segments comprising features of the eye of the user, reading, from the storage device, one or more second segments in the captured images corresponding to the one or more first segments in the down-sampled version of the captured images, and computing a gaze of the user based on the one or more second segments in the captured images.
    Type: Application
    Filed: August 31, 2020
    Publication date: March 3, 2022
    Inventors: Jeffrey Hung Wong, Martin Henrik Tall, Jixu Chen, Kapil Krishnakumar
  • Patent number: 11188148
    Abstract: A virtual reality (VR) system includes a console, an imaging device, a head-mounted display (HMD), and a user input device. The console includes a virtual reality (VR) engine, a tracking module, and an application store. The HMD includes a display element configured to display content to a user wearing the HMD. The virtual reality engine is configured to determine a vector for a virtual object thrown at a target in response to physical input from the user, a gaze vector for the eyes of the user, and a virtual object position vector for the target, and to modify the vector for the virtual object based on at least one of the gaze vector and the virtual object position vector.
    Type: Grant
    Filed: May 1, 2019
    Date of Patent: November 30, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Sebastian Sztuk, Javier San Agustin Lopez, Martin Henrik Tall
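
As a rough illustration of the gaze-assisted throw described in the abstract of patent 11188148 above, the sketch below nudges the physical throw vector toward the target the user is gazing at. The linear blend, the alignment threshold, and ASSIST_WEIGHT are assumptions for illustration; the patent does not prescribe this particular formula.

```python
import numpy as np

ASSIST_WEIGHT = 0.3  # 0 = raw physics, 1 = fully snapped onto the line to the target


def assist_throw(throw_vec, gaze_vec, origin, target_pos, weight=ASSIST_WEIGHT):
    """Blend the physical throw vector with the direction toward the target,
    but only when the gaze vector confirms the user is looking at the target."""
    throw_vec = np.asarray(throw_vec, dtype=float)
    gaze_vec = np.asarray(gaze_vec, dtype=float)
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(origin, dtype=float)
    to_target /= np.linalg.norm(to_target)
    # Require the gaze to be roughly aligned with the target before assisting.
    if np.dot(gaze_vec / np.linalg.norm(gaze_vec), to_target) < 0.95:
        return throw_vec
    speed = np.linalg.norm(throw_vec)
    blended = (1 - weight) * throw_vec + weight * speed * to_target
    return blended / np.linalg.norm(blended) * speed  # preserve the throw speed


# A slightly off-target throw gets pulled toward the gazed-at target.
print(assist_throw([1.0, 0.1, 0.0], [1.0, 0.0, 0.0], [0, 0, 0], [5, 0, 0]))
```
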
  • Patent number: 10984756
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality and the third set yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Grant
    Filed: June 26, 2019
    Date of Patent: April 20, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
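
A minimal sketch of the three-region scheme described in the abstract of patent 10984756 above: each pixel is assigned foveal, parafoveal, or peripheral parameters based on its distance from the gaze point. The radii, resolution scales, and quantization values are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class RegionParams:
    resolution_scale: float  # fraction of native resolution to render at
    quantization: int        # encoder quantization parameter (higher = lower quality)


FOVEA = RegionParams(resolution_scale=1.0, quantization=18)
PARAFOVEA = RegionParams(resolution_scale=0.5, quantization=28)
PERIPHERY = RegionParams(resolution_scale=0.25, quantization=38)


def params_for_pixel(px, py, gaze_x, gaze_y,
                     fovea_radius=100.0, parafovea_radius=250.0) -> RegionParams:
    """Pick rendering/encoding parameters based on distance from the gaze point."""
    dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
    if dist <= fovea_radius:
        return FOVEA
    if dist <= parafovea_radius:
        return PARAFOVEA
    return PERIPHERY


print(params_for_pixel(960, 540, 950, 530))  # near the gaze point -> full quality
print(params_for_pixel(100, 100, 950, 530))  # periphery -> lowest quality
```
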
  • Patent number: 10983591
    Abstract: The disclosed computer-implemented method may include identifying a region within a user's eye gaze and calculating a ranking for the identified region within the user's eye gaze. The ranking may indicate the user's level of interest in the identified region. The method may then determine how the identified region is to be presented according to the calculated ranking and present the identified region in the determined manner according to the calculated ranking. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: February 25, 2019
    Date of Patent: April 20, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Andrew John Ouderkirk, Neeraj Choubey, Andre Golard, Bo Asp Andersen, Immo Andreas Schuetz, Karol Constantine Hatzilias, Kelly Ingham, Martin Henrik Tall, Sharvil Shailesh Talati, Robert Dale Cavin, Thomas Scott Murdison
  • Patent number: 10921896
    Abstract: An augmented reality (AR) device can access a library of applications or user interfaces (UIs) designed to control a set of devices. The AR device can determine which UI to present based on detection of a device to be controlled near the AR device. For example, a user wearing an AR device may look at a thermostat placed on a wall and a UI to control the thermostat may be presented to the user. The determination that the user is looking at the thermostat may be made by correlating the gaze tracking information of the user-facing camera with the location of the thermostat in an image captured by a world-facing camera. Determination of the location of the thermostat in the image can be performed using image recognition technology. The UI can be selected based on a database record pairing the UI with the thermostat.
    Type: Grant
    Filed: March 16, 2016
    Date of Patent: February 16, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Rasmus Dahl, Jonas Priesum
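
The device-to-UI lookup described in the abstract of patent 10921896 above can be sketched as a hit test between the gaze point, projected into the world-facing camera frame, and the bounding boxes returned by image recognition, followed by a registry lookup. UI_REGISTRY, the detection format, and the labels are hypothetical.

```python
# Hypothetical pairing of recognized device labels with UIs.
UI_REGISTRY = {
    "thermostat": "thermostat_control_ui",
    "light_switch": "lighting_ui",
}


def select_ui(gaze_xy, detections):
    """detections: list of (label, (x, y, w, h)) boxes from image recognition run
    on the world-facing camera frame; gaze_xy is in the same image coordinates."""
    gx, gy = gaze_xy
    for label, (x, y, w, h) in detections:
        if x <= gx <= x + w and y <= gy <= y + h:
            return UI_REGISTRY.get(label)  # None if no UI is paired with the device
    return None


detections = [("thermostat", (300, 200, 80, 120)), ("light_switch", (600, 210, 40, 60))]
print(select_ui((330, 250), detections))  # -> "thermostat_control_ui"
```
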
  • Patent number: 10909405
    Abstract: Systems and methods for virtual interest segmentation may include (1) performing a semantic segmentation of an image of a user's environment, captured by an artificial reality (AR) device being worn by the user, to identify objects within the user's environment, (2) in addition to performing the semantic segmentation, performing an interest segmentation of the image to determine a personal interest that the user may have in a particular object identified via the semantic segmentation, (3) creating virtual content relating to the particular object based on the user's personal interest in the particular object, and (4) displaying the virtual content within a display element of the AR device. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: March 4, 2019
    Date of Patent: February 2, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Andre Golard, Bo Asp Andersen, Immo Andreas Schuetz, Karol Constantine Hatzilias, Kelly Ingham, Martin Henrik Tall, Neeraj Choubey, Sharvil Shailesh Talati, Robert Dale Cavin, Thomas Scott Murdison
  • Patent number: 10852820
    Abstract: Systems and methods for enabling gaze-based virtual content control may include (1) displaying an artificial scene with one or more virtual elements to a user wearing a head-mounted display system, (2) identifying the user's eye gaze based on gazing data collected by one or more sensors in the head-mounted display system, (3) determining that the user's eye gaze is focused on a specific virtual element, and (4) in response to determining that the user's eye gaze is focused on the specific virtual element, increasing the specific virtual element's visibility to the user. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: February 22, 2019
    Date of Patent: December 1, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Andre Golard, Bo Asp Andersen, Immo Andreas Schuetz, Karol Constantine Hatzilias, Kelly Ingham, Martin Henrik Tall, Neeraj Choubey, Sharvil Shailesh Talati, Robert Dale Cavin, Thomas Scott Murdison
  • Patent number: 10831267
    Abstract: The disclosed computer-implemented method may include (i) determining, using an eye-tracking system, an orientation of at least one eye of a user, (ii) identifying, based at least in part on the orientation of the user's eye, a point of interest within a field of view of the user, (iii) determining that the point of interest is a candidate for tagging, and (iv) performing, in response to determining that the point of interest is the candidate for tagging, a tagging action that facilitates tagging of the point of interest. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: March 7, 2019
    Date of Patent: November 10, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Andre Golard, Bo Asp Andersen, Immo Andreas Schuetz, Karol Constantine Hatzilias, Kelly Ingham, Martin Henrik Tall, Neeraj Choubey, Sharvil Shailesh Talati, Robert Dale Cavin, Thomas Scott Murdison
  • Patent number: 10831268
    Abstract: The disclosed computer-implemented method may include: identifying, using an eye-tracking system, an object within a scene viewed by a user; identifying, within a database of object interaction commands, a subset of commands that apply to the object viewed by the user; and presenting, to the user, the subset of commands that apply to the object. Various other methods, systems, devices, and computer-readable media are also disclosed.
    Type: Grant
    Filed: March 15, 2019
    Date of Patent: November 10, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Andre Golard, Bo Asp Andersen, Immo Andreas Schuetz, Karol Constantine Hatzilias, Kelly Ingham, Martin Henrik Tall, Neeraj Choubey, Thomas Scott Murdison, Sharvil Shailesh Talati, Robert Dale Cavin
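
A small sketch of the command lookup described in the abstract of patent 10831268 above, assuming a hypothetical COMMAND_DB keyed by object label and filtered by the capabilities currently available for the recognized device.

```python
COMMAND_DB = {
    "tv": ["power", "volume_up", "volume_down", "change_channel"],
    "door": ["lock", "unlock"],
    "speaker": ["play", "pause", "volume_up", "volume_down"],
}


def commands_for_object(label: str, available_capabilities: set) -> list:
    """Return only the commands that both apply to the recognized object and are
    currently available (e.g., the device is reachable and supports them)."""
    return [c for c in COMMAND_DB.get(label, []) if c in available_capabilities]


print(commands_for_object("tv", {"power", "volume_up", "mute"}))  # -> ['power', 'volume_up']
```
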
  • Patent number: 10715824
    Abstract: Systems and methods are disclosed for reducing, with minimal loss, optical sensor data to be conveyed to another system for processing. An eye tracking device, such as a head-mounted display (HMD), includes a sensor and circuitry. The sensor generates image data of an eye. The circuitry receives the image data and assigns pixels of the image data to a feature region of the eye by comparing pixel values of the pixels to a threshold value. A feature region refers to an eye region of interest for eye tracking, such as a pupil or glint. The circuitry generates encoded image data by applying an encoding algorithm, such as a run-length encoding algorithm or contour encoding algorithm, to the image data for the pixels of the feature region. The circuitry transmits the encoded image data, having a smaller data size than the image data received from the sensor, for gaze-contingent content rendering.
    Type: Grant
    Filed: March 16, 2017
    Date of Patent: July 14, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Martin Henrik Tall, Sune Loje, Javier San Agustin Lopez
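
The thresholding and run-length-encoding step described in the abstract of patent 10715824 above can be sketched in a few lines of numpy. The threshold value and the (row, start_column, run_length) run format are illustrative assumptions rather than the patented encoding.

```python
import numpy as np


def encode_feature_region(image: np.ndarray, threshold: int) -> list:
    """Binarize the eye image against `threshold` (e.g., dark pupil pixels) and
    run-length encode each row as (row, start_column, run_length)."""
    mask = image < threshold
    runs = []
    for row in range(mask.shape[0]):
        cols = mask[row]
        col = 0
        while col < cols.size:
            if cols[col]:
                start = col
                while col < cols.size and cols[col]:
                    col += 1
                runs.append((row, start, col - start))
            else:
                col += 1
    return runs


eye = np.full((4, 8), 200, dtype=np.uint8)
eye[1:3, 2:6] = 30                     # a dark "pupil" block
print(encode_feature_region(eye, 80))  # -> [(1, 2, 4), (2, 2, 4)]
```
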
  • Publication number: 20200209958
    Abstract: A virtual reality (VR) system includes a console, an imaging device, a head-mounted display (HMD), and a user input device. The console includes a virtual reality (VR) engine, a tracking module, and an application store. The HMD includes a display element configured to display content to a user wearing the HMD. The virtual reality engine is configured to determine a vector for a virtual object thrown at a target in response to physical input from the user, a gaze vector for the eyes of the user, and a virtual object position vector for the target, and to modify the vector for the virtual object based on at least one of the gaze vector and the virtual object position vector.
    Type: Application
    Filed: May 1, 2019
    Publication date: July 2, 2020
    Inventors: Sebastian Sztuk, Javier San Agustin Lopez, Martin Henrik Tall
  • Publication number: 20200211512
    Abstract: A wearable display system includes a headset and a display module in the headset. The display module includes an electronic display for displaying images to the user. A camera in the headset is configured to obtain an image of an eye area of the user. A processing module of the wearable display system is configured to use the camera to determine an offset of a current eye position of the user wearing the headset relative to an optimal eye position in an eyebox of the headset. The processing module is configured to determine a direction of adjustment of the headset to lessen the offset and to provide an instruction to perform the adjustment of the headset in the determined direction.
    Type: Application
    Filed: May 1, 2019
    Publication date: July 2, 2020
    Inventors: Sebastian Sztuk, Javier San Agustin Lopez, Anders Bo Pedersen, Martin Henrik Tall
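
A hedged sketch of the fit-guidance loop described in the abstract above: compute the offset of the detected eye position from the optimal eyebox position and turn it into an adjustment instruction. The eyebox center, tolerance, direction mapping, and message wording are made-up values.

```python
OPTIMAL_EYE_POS = (320.0, 240.0)  # hypothetical eyebox center in eye-camera pixels
TOLERANCE = 15.0                  # offset (in pixels) considered good enough


def adjustment_instruction(eye_x: float, eye_y: float) -> str:
    """Suggest which way to adjust the headset to move the eye toward the eyebox center."""
    dx = eye_x - OPTIMAL_EYE_POS[0]
    dy = eye_y - OPTIMAL_EYE_POS[1]
    if (dx * dx + dy * dy) ** 0.5 <= TOLERANCE:
        return "Headset position looks good."
    horizontal = "left" if dx > 0 else "right"
    vertical = "down" if dy > 0 else "up"
    # Report only the dominant direction of the offset.
    direction = horizontal if abs(dx) >= abs(dy) else vertical
    return f"Adjust the headset slightly {direction}."


print(adjustment_instruction(350.0, 238.0))  # -> suggests a horizontal adjustment
```
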
  • Patent number: 10459520
    Abstract: Methods and systems to facilitate eye tracking control are provided. A user input is received at a computing device. Point of regard information associated with a user of the computing device is determined while the user input is being received. The point of regard information indicates a location on a display of the computing device at which the user is looking. An operation associated with a display object identified based on the point of regard information is performed when receipt of the user input is determined to have terminated.
    Type: Grant
    Filed: October 24, 2017
    Date of Patent: October 29, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Javier San Agustin Lopez, Sebastian Sztuk, Martin Henrik Tall
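
The interaction flow in the abstract of patent 10459520 above, tracking the point of regard while a user input is held and acting on the gazed-at object when the input ends, can be sketched as follows. The rectangle hit test and all names are illustrative assumptions.

```python
class GazeActivatedUI:
    def __init__(self, objects):
        self.objects = objects           # {name: (x, y, w, h)} display objects
        self.point_of_regard = None

    def on_gaze(self, x, y, input_held):
        if input_held:                   # track only while the user input persists
            self.point_of_regard = (x, y)

    def on_input_released(self):
        """Return the display object under the last point of regard, if any."""
        if self.point_of_regard is None:
            return None
        gx, gy = self.point_of_regard
        self.point_of_regard = None
        for name, (x, y, w, h) in self.objects.items():
            if x <= gx <= x + w and y <= gy <= y + h:
                return name
        return None


ui = GazeActivatedUI({"play_button": (100, 100, 60, 40)})
ui.on_gaze(120, 115, input_held=True)
print(ui.on_input_released())            # -> "play_button"
```
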
  • Publication number: 20190318708
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality and the third set yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Application
    Filed: June 26, 2019
    Publication date: October 17, 2019
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
  • Patent number: 10373592
    Abstract: A display system divides a screen into regions and applies a different set of rendering/encoding parameters to each region. The system applies a first set of parameters to a first region that is being viewed by a fovea of an eye of a user. The system may also apply a second set of parameters to a second region that is being viewed by a parafovea of the eye, and apply a third set of parameters to a third region that is being viewed by the area of the eye outside of the parafovea. The first set of parameters is selected to yield relatively high image quality, while the second set of parameters yields intermediate quality and the third set yields lower quality. As a result, the second region and the third region can be rendered, encoded, and transmitted with less computing power and less bandwidth.
    Type: Grant
    Filed: July 28, 2017
    Date of Patent: August 6, 2019
    Assignee: Facebook Technologies, LLC
    Inventors: Martin Henrik Tall, Javier San Agustin Lopez, Rasmus Dahl
  • Patent number: 10013056
    Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
    Type: Grant
    Filed: October 31, 2017
    Date of Patent: July 3, 2018
    Assignee: Facebook, Inc.
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard
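
One way to picture the recalibration described in the abstract of patent 10013056 above is to accumulate offsets between detected gaze points and the locations the user actually activated, then apply the running average as a correction. This constant-offset model is an assumption for illustration; the patent does not specify this particular scheme.

```python
class GazeRecalibrator:
    def __init__(self):
        self.errors = []  # (dx, dy) differences between activated and detected points

    def record_interaction(self, detected_gaze, activated_location):
        dx = activated_location[0] - detected_gaze[0]
        dy = activated_location[1] - detected_gaze[1]
        self.errors.append((dx, dy))

    def correct(self, gaze):
        """Shift a detected gaze point by the average observed offset."""
        if not self.errors:
            return gaze
        mx = sum(e[0] for e in self.errors) / len(self.errors)
        my = sum(e[1] for e in self.errors) / len(self.errors)
        return (gaze[0] + mx, gaze[1] + my)


cal = GazeRecalibrator()
cal.record_interaction(detected_gaze=(400, 300), activated_location=(412, 296))
print(cal.correct((500, 500)))  # gaze shifted by the learned offset -> (512.0, 496.0)
```
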
  • Patent number: 9961258
    Abstract: Eye tracking technology may be used in a wide range of lighting conditions and with many different and varying light levels. In some embodiments, an eye tracking device may employ active illumination (e.g., in the form of infrared light-emitting diodes (LEDs)). However, employing active illumination may reduce the battery life of the device. Under some circumstances (e.g., in a dark environment), the light intensity may be excessive and could be reduced, thereby reducing energy consumption and extending the battery life of the device. An algorithm may be used to adjust the duration of light in eye tracking systems that employ active illumination.
    Type: Grant
    Filed: February 23, 2016
    Date of Patent: May 1, 2018
    Assignee: Facebook, Inc.
    Inventors: Martin Henrik Tall, Sebastian Sztuk, Javier San Agustin Lopez, Rasmus Dahl
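
A minimal sketch of the illumination-duration adjustment described in the abstract of patent 9961258 above: shorten the IR LED pulse when the captured eye image is brighter than a target level and lengthen it when the image is too dark. The target brightness, step size, and bounds are invented parameters.

```python
TARGET_BRIGHTNESS = 120            # desired mean pixel value of the eye image
MIN_PULSE_US, MAX_PULSE_US = 50, 2000
STEP_US = 50


def adjust_pulse(current_pulse_us: int, mean_brightness: float) -> int:
    """Step the LED pulse duration toward the target brightness, within bounds."""
    if mean_brightness > TARGET_BRIGHTNESS + 10:
        current_pulse_us -= STEP_US    # image already bright enough: save power
    elif mean_brightness < TARGET_BRIGHTNESS - 10:
        current_pulse_us += STEP_US    # image too dark: add light
    return max(MIN_PULSE_US, min(MAX_PULSE_US, current_pulse_us))


pulse = 1000
for brightness in (180, 170, 160, 130):  # simulated frames in a dark room
    pulse = adjust_pulse(pulse, brightness)
print(pulse)                             # pulse shortened over successive frames -> 850
```
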
  • Patent number: 9952666
    Abstract: Methods and systems to facilitate eye tracking control on mobile devices are provided. An image of a portion of a user is received at an eye tracking device, where the image includes reflections caused by light emitted on the user from one or more light sources located within the eye tracking device. One or more eye features associated with an eye of the user is detected using the reflections. Point of regard information is determined using the one or more eye features, where the point of regard information indicates a location on a display of a computing device coupled to the eye tracking device at which the user was looking when the image of the portion of the user was taken. The point of regard information is sent to an application capable of performing a subsequent operation using the point of regard information.
    Type: Grant
    Filed: March 8, 2017
    Date of Patent: April 24, 2018
    Assignee: Facebook, Inc.
    Inventors: Sebastian Sztuk, Martin Henrik Tall, Javier San Agustin Lopez
  • Publication number: 20180059782
    Abstract: A user of a computing device may interact with and control objects and applications displayed on the computing device through the user's eye movement. Detected gaze locations are correlated with actions performed by the user and compared with typical gaze locations for those actions. Based on differences between the detected and expected gaze locations, the eye tracking system can be recalibrated. An area around a gaze location encompassing a set of likely active locations can be enlarged, effectively prompting the user to interact with the desired active location again. The enlarging of the area serves to separate the active locations on the screen, reducing the probability of interpreting the user's gaze incorrectly.
    Type: Application
    Filed: October 31, 2017
    Publication date: March 1, 2018
    Inventors: Javier San Agustin Lopez, Martin Henrik Tall, Benjamin Antoine Georges Lefaudeux, Henrik Hegner Tomra Skovsgaard