Patents by Inventor Soumil Chugh

Soumil Chugh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250182300
    Abstract: Methods and systems are described for estimating an individual's gaze direction using the position of a single corneal reflection, the position of a pupil center and positional information for the individual's head as inputs. An image is obtained from an infrared (IR) camera and the position of a corneal reflection is estimated in the image. Positional information for the individual's head is determined from the image, and a 3D position of a cornea center for the individual's eye is estimated based on the position of the corneal reflection, the position of an IR light source and the positional information for the individual's head. The position of a 3D pupil center for the individual's eye is then estimated, and a 3D gaze vector representing the gaze direction is estimated based on the position of the cornea center and the position of the 3D pupil center. (An illustrative code sketch of this pipeline appears after this listing.)
    Type: Application
    Filed: February 10, 2025
    Publication date: June 5, 2025
    Inventors: Soumil CHUGH, Juntao YE, Moshe EIZENMAN
  • Publication number: 20250182328
    Abstract: The present disclosure allows for accurate and computationally efficient gaze tracking without requiring a corneal reflection. Eye tracking systems and methods are provided. They include obtaining images of the user's face, detecting a head pose, and locating the centre of eye rotation, the pupil centre, the optical axis, the cornea centre and finally the visual axis corresponding to the gaze direction, using user-specific parameters. Calibration systems and methods are also provided for acquiring the user-specific parameters. They include displaying, at least twice, a target point for the user to gaze at, directing light to obtain a corneal reflection, obtaining an image of the user's face, detecting the head pose, inferring a transformation to convert coordinates between the world coordinate system and a head coordinate system, and locating the corneal reflection, the cornea centre, the pupil centres and the optical axes. The data acquired thereby is used to compute the parameters. (An illustrative code sketch of this approach appears after this listing.)
    Type: Application
    Filed: November 30, 2023
    Publication date: June 5, 2025
    Inventors: Soumil CHUGH, Juntao YE, Moshe EIZENMAN
  • Patent number: 12175014
    Abstract: Methods and systems for estimating a gaze direction of an individual using a trained neural network. Inputs to the neural network include a face image and an image of a visually significant eye in the face image. Feature representations are extracted for the face image and the significant eye image, and feature fusion is performed on the feature representations to generate a fused feature representation. The fused feature representation is input into a trained gaze estimator to output a gaze vector including gaze angles, the gaze vector representing a gaze direction. The disclosed network may enable gaze estimation on user devices, such as mobile devices, that typically have limited hardware and computational resources. (An illustrative code sketch of this network appears after this listing.)
    Type: Grant
    Filed: November 29, 2021
    Date of Patent: December 24, 2024
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Soumil Chugh, Juntao Ye
  • Publication number: 20240211034
    Abstract: Methods and systems for gaze-assisted interaction with a pointing device on a display screen. In response to receiving an activation input, a user's point of gaze (POG) on a display is received, and a gaze region of the display corresponding to the POG is extracted, enlarged and transposed on the display according to a first cursor location, generating an interaction region on the display. A user interaction with the pointing device at a second cursor location on the display, associated with the interaction region, is intercepted in a system hook, mapped to a location on the display corresponding to the gaze region, and passed to an application. The disclosed method and system may enable improved GUI interaction with pointing devices on displays while overcoming challenges associated with the precision of eye-gaze assisted interaction, including the impact of eye jitter on gaze estimation. (An illustrative code sketch of this mapping appears after this listing.)
    Type: Application
    Filed: December 23, 2022
    Publication date: June 27, 2024
    Inventors: Juntao YE, Manpreet Singh TAKKAR, Soumil CHUGH
  • Publication number: 20230168735
    Abstract: Methods and systems for estimating a gaze direction of an individual using a trained neural network. Inputs to the neural network include a face image and an image of a visually significant eye in the face image. Feature representations are extracted for the face image and the significant eye image, and feature fusion is performed on the feature representations to generate a fused feature representation. The fused feature representation is input into a trained gaze estimator to output a gaze vector including gaze angles, the gaze vector representing a gaze direction. The disclosed network may enable gaze estimation on user devices, such as mobile devices, that typically have limited hardware and computational resources.
    Type: Application
    Filed: November 29, 2021
    Publication date: June 1, 2023
    Inventors: Soumil Chugh, Juntao Ye
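
The sketches below are editorial illustrations of the techniques summarized in the abstracts above; they are simplified readings of the abstracts, not the patented implementations. This first sketch corresponds to publication 20250182300: a 3D cornea centre is estimated from a single corneal reflection, the IR light source position and head-pose information, a 3D pupil centre is then placed, and the cornea-to-pupil direction is taken as the gaze vector. The pinhole camera model, the population-average eye constants and every function name are assumptions made for illustration.

```python
# Illustrative sketch (not the patented method) of a single-glint gaze pipeline.
# Assumptions: a pinhole camera at the world origin, a spherical cornea with a
# population-average radius, millimetre units, and an approximate eye position
# derived from head pose. All names and constants are hypothetical.
import numpy as np

CORNEA_RADIUS_MM = 7.8        # assumed average corneal radius of curvature
CORNEA_TO_PUPIL_MM = 4.2      # assumed cornea-centre-to-pupil-centre distance

def backproject(pixel, camera_matrix):
    """Unit ray through an image pixel for a pinhole camera at the origin."""
    ray = np.linalg.inv(camera_matrix) @ np.array([pixel[0], pixel[1], 1.0])
    return ray / np.linalg.norm(ray)

def estimate_cornea_centre(glint_px, ir_light_pos, eye_pos_from_head, camera_matrix):
    """Rough 3D cornea-centre estimate from one corneal reflection (glint).

    Shortcut: the glint depth along the camera ray is fixed by the head-pose
    eye position; the surface normal at the glint bisects the directions to
    the camera and the IR light, and the cornea centre lies one corneal
    radius behind the glint along that normal.
    """
    ray = backproject(glint_px, camera_matrix)
    glint_3d = np.dot(eye_pos_from_head, ray) * ray
    to_light = ir_light_pos - glint_3d
    to_camera = -glint_3d
    normal = to_light / np.linalg.norm(to_light) + to_camera / np.linalg.norm(to_camera)
    normal /= np.linalg.norm(normal)
    return glint_3d - CORNEA_RADIUS_MM * normal

def estimate_gaze(glint_px, pupil_px, ir_light_pos, eye_pos_from_head, camera_matrix):
    """3D gaze vector (optical axis) from glint, pupil centre and head pose."""
    cornea_c = estimate_cornea_centre(glint_px, ir_light_pos, eye_pos_from_head, camera_matrix)
    pupil_ray = backproject(pupil_px, camera_matrix)
    # Place the 3D pupil centre on its ray, slightly closer to the camera than
    # the cornea centre (a crude stand-in for refraction-aware ray tracing).
    pupil_depth = np.dot(cornea_c, pupil_ray) - CORNEA_TO_PUPIL_MM
    pupil_3d = pupil_depth * pupil_ray
    gaze = pupil_3d - cornea_c
    return gaze / np.linalg.norm(gaze)
```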
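
A minimal sketch of the glint-free tracking idea in publication 20250182328, assuming the centre of eye rotation is a fixed offset in a head coordinate system and that the user-specific parameters reduce to two kappa angles relating the optical and visual axes. The head pose (`head_R`, `head_t`), the angle conventions and all function names are illustrative assumptions, and the calibration step is reduced to a crude small-angle fit over the calibration points.

```python
# Illustrative sketch (not the patented method) of glint-free gaze tracking.
# Assumptions: the centre of eye rotation is a fixed offset in the head frame,
# the optical axis runs from that centre through the observed 3D pupil centre,
# and two user-specific kappa angles turn the optical axis into the visual axis.
import numpy as np

def optical_axis(head_R, head_t, eye_centre_in_head, pupil_centre_world):
    """Optical axis from the eye-rotation centre (head frame) through the pupil (world frame)."""
    eye_centre_world = head_R @ eye_centre_in_head + head_t
    axis = pupil_centre_world - eye_centre_world
    return eye_centre_world, axis / np.linalg.norm(axis)

def visual_axis(optical, kappa_h, kappa_v):
    """Apply the user-specific horizontal/vertical kappa rotations to the optical axis."""
    ch, sh = np.cos(kappa_h), np.sin(kappa_h)
    cv, sv = np.cos(kappa_v), np.sin(kappa_v)
    R_h = np.array([[ch, 0.0, sh], [0.0, 1.0, 0.0], [-sh, 0.0, ch]])  # about the vertical axis
    R_v = np.array([[1.0, 0.0, 0.0], [0.0, cv, sv], [0.0, -sv, cv]])  # about the horizontal axis
    return R_h @ R_v @ optical

def calibrate_kappa(optical_axes, target_directions):
    """Crude small-angle kappa fit from two or more calibration points:
    average the yaw/pitch gap between each optical axis and the known
    direction towards the displayed target."""
    def yaw_pitch(v):
        v = v / np.linalg.norm(v)
        return np.array([np.arctan2(v[0], v[2]), np.arcsin(np.clip(v[1], -1.0, 1.0))])
    gaps = [yaw_pitch(t) - yaw_pitch(o) for o, t in zip(optical_axes, target_directions)]
    kappa_h, kappa_v = np.mean(gaps, axis=0)
    return kappa_h, kappa_v
```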
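
A minimal PyTorch sketch of the two-stream idea in patent 12175014 and publication 20230168735: one branch extracts features from the face image, another from the visually significant eye image, the two feature representations are fused, and a small head regresses two gaze angles. The layer sizes, the choice of concatenation as the fusion step and the angle convention are assumptions, not the patented architecture.

```python
# Illustrative PyTorch sketch (not the patented architecture) of a two-stream
# gaze network: a face branch, an eye branch, feature fusion by concatenation,
# and a regressor producing two gaze angles (yaw, pitch). Layer sizes are
# assumptions chosen to keep the model small.
import torch
import torch.nn as nn

class TwoStreamGazeNet(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        def branch():
            # Small CNN producing a fixed-length feature representation.
            return nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, feat_dim), nn.ReLU(),
            )
        self.face_branch = branch()  # features of the full face image
        self.eye_branch = branch()   # features of the visually significant eye crop
        self.gaze_head = nn.Sequential(  # fused features -> (yaw, pitch) gaze angles
            nn.Linear(2 * feat_dim, 32), nn.ReLU(),
            nn.Linear(32, 2),
        )

    def forward(self, face_img, eye_img):
        fused = torch.cat([self.face_branch(face_img), self.eye_branch(eye_img)], dim=1)
        return self.gaze_head(fused)

# Example with hypothetical crop sizes: one 96x96 face crop and one 48x48 eye crop.
model = TwoStreamGazeNet()
gaze_angles = model(torch.rand(1, 3, 96, 96), torch.rand(1, 3, 48, 48))  # shape (1, 2)
```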
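
A minimal sketch of the coordinate mapping behind publication 20240211034, assuming a square gaze region and a uniform zoom factor. The `Rect` dataclass, the function names and the worked example are illustrative only; intercepting the pointer event in an OS-level system hook and forwarding the mapped location to the application are out of scope here.

```python
# Illustrative sketch (not the patented method) of mapping a click made inside
# an enlarged copy of the gaze region back to the original screen location.
# The Rect fields, the square region and the uniform zoom are assumptions; a
# real system would intercept pointer events in an OS-level hook.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # top-left corner, screen pixels
    y: float
    w: float
    h: float

def gaze_region(pog, size):
    """Square region of the display centred on the point of gaze (POG)."""
    return Rect(pog[0] - size / 2, pog[1] - size / 2, size, size)

def interaction_region(cursor, size, zoom):
    """Enlarged copy of the gaze region, placed at the first cursor location."""
    return Rect(cursor[0], cursor[1], size * zoom, size * zoom)

def map_click(click, interaction, gaze):
    """Map a click inside the enlarged interaction region back onto the gaze region."""
    u = (click[0] - interaction.x) / interaction.w  # normalised position in the copy
    v = (click[1] - interaction.y) / interaction.h
    return gaze.x + u * gaze.w, gaze.y + v * gaze.h

# Example: POG at (900, 300), an 80 px region enlarged 4x at cursor (400, 500);
# a click at the centre of the enlarged copy maps back to (900.0, 300.0).
g = gaze_region((900, 300), 80)
i = interaction_region((400, 500), 80, 4)
print(map_click((560, 660), i, g))
```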