Patents by Inventor Mayank Bhargava

Mayank Bhargava has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220405500
    Abstract: A computer-implemented method includes receiving a two-dimensional (2-D) side view face image of a person, identifying a bounded portion or area of the 2-D side view face image of the person as an ear region-of-interest (ROI) area showing at least a portion of an ear of the person, and processing the identified ear ROI area of the 2-D side view face image, pixel-by-pixel, through a trained fully convolutional neural network model (FCNN model) to predict a 2-D ear saddle point (ESP) location for the ear shown in the ear ROI area. The FCNN model has an image segmentation architecture.
    Type: Application
    Filed: June 21, 2021
    Publication date: December 22, 2022
    Inventors: Mayank Bhargava, Idris Syed Aleem, Yinda Zhang, Sushant Umesh Kulkarni, Rees Anwyl Simmons, Ahmed Gawish
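The per-pixel prediction step above can be sketched as follows, assuming the FCNN's segmentation head outputs a single-channel probability heatmap over the ear ROI; the heatmap shape, ROI offset, and function names are illustrative, not from the patent:

```python
import numpy as np

def esp_from_heatmap(heatmap: np.ndarray, roi_origin: tuple[int, int]) -> tuple[int, int]:
    """Pick the most probable 2-D ear saddle point from a per-pixel heatmap.

    heatmap    -- (H, W) array of per-pixel ESP probabilities from the FCNN
    roi_origin -- (x, y) of the ROI's top-left corner in the full side-view image
    Returns the ESP location in full-image coordinates.
    """
    # argmax over the flattened heatmap, then unravel to (row, col)
    row, col = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    x0, y0 = roi_origin
    return (x0 + int(col), y0 + int(row))

# Illustrative heatmap with a single peak at row 2, column 3
hm = np.zeros((4, 5))
hm[2, 3] = 0.9
print(esp_from_heatmap(hm, (10, 20)))  # (13, 22)
```

The model itself does the per-pixel work; this sketch only covers the readout from the model's output back into full-image coordinates.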
  • Publication number: 20220343534
    Abstract: A system and method of detecting display fit measurements and/or ophthalmic measurements for a head mounted wearable computing device including a display device is provided. An image of a fitting frame worn by a user of the computing device is captured by the user, through an application running on the computing device. One or more visual markers each including a distinct pattern are detected in the image including the fitting frame. A model of the fitting frame, and configuration information associated with the fitting frame, are determined based on the detection of the pattern. A three-dimensional pose of the fitting frame is determined based on the detected visual marker(s) and patterns, and the configuration information associated with the fitting frame. The display device of the head mounted wearable computing device can then be configured based on the three-dimensional pose of the fitting frame as captured in the image.
    Type: Application
    Filed: April 23, 2021
    Publication date: October 27, 2022
    Inventors: Idris Syed Aleem, Rees Anwyl Samuel Simmons, Sushant Umesh Kulkarni, Ahmed Gawish, Mayank Bhargava
  • Patent number: 11475592
Abstract: An ear saddle point of a subject's ear is determined to produce specifications to fit a wearable apparatus to a subject's head. A boundary of the subject's ear in a profile image is determined using a first model. A probability map of the subject's ear is generated indicating probable locations of a two-dimensional ear saddle point. A most probable location of the two-dimensional ear saddle point is determined based on the probability map. The two-dimensional ear saddle point is projected onto a three-dimensional mesh surface representing the subject's head. A maximum depth of the three-dimensional mesh surface is determined in a defined region around the projected two-dimensional ear saddle point. The ear saddle point is computed based on the projected two-dimensional ear saddle point and the determined maximum depth. Specifications of a wearable apparatus are generated to fit the apparatus to the subject's head based on the computed ear saddle point.
    Type: Grant
    Filed: February 11, 2020
    Date of Patent: October 18, 2022
    Assignee: Google LLC
    Inventors: Idris Aleem, Mayank Bhargava, Rees Simmons
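The projection-and-depth step described above can be sketched as follows; a per-pixel depth map stands in for the three-dimensional mesh surface, and the window size and function name are illustrative assumptions:

```python
import numpy as np

def esp_3d(depth: np.ndarray, esp_2d: tuple[int, int], half_win: int = 2) -> tuple[int, int, float]:
    """Lift a projected 2-D ear saddle point to 3-D using the maximum depth near it.

    depth    -- (H, W) per-pixel depth of the head mesh as seen in the profile image
    esp_2d   -- (x, y) projected 2-D ear saddle point
    half_win -- half-width of the defined search region around the point
    """
    x, y = esp_2d
    h, w = depth.shape
    # Clamp the search window to the image bounds, then take the maximum depth
    region = depth[max(0, y - half_win):min(h, y + half_win + 1),
                   max(0, x - half_win):min(w, x + half_win + 1)]
    return (x, y, float(region.max()))

depth = np.full((6, 6), 1.0)
depth[3, 3] = 2.5  # deepest mesh point near the ESP
print(esp_3d(depth, (2, 2), half_win=1))  # (2, 2, 2.5)
```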
  • Patent number: 11157077
Abstract: A method of tracking an eye of a user on a wearable heads-up display (WHUD) worn on the head of the user includes generating infrared light over an eye tracking period, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye. A motion parameter that is sensitive to motion of the WHUD is measured. Eye tracking is performed in a first mode that is based on glint for values of the motion parameter that fall within a first range of motion parameter values for which an error in measurement of glint position is below an error threshold. Eye tracking is performed in a second mode that is based on glint-pupil vector for values of the motion parameter that fall within a second range of motion parameter values for which an error in measurement of glint position exceeds the error threshold. A head-mounted apparatus with eye tracking is disclosed.
    Type: Grant
    Filed: April 5, 2019
    Date of Patent: October 26, 2021
Assignee: Google LLC
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
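The two-mode switch can be sketched as a threshold test on the motion parameter; the threshold value and function name here are illustrative assumptions, not values from the patent:

```python
def select_tracking_mode(motion: float, motion_threshold: float = 0.5) -> str:
    """Choose the eye-tracking mode from a WHUD motion parameter.

    Below the threshold the glint-position measurement error stays within
    bounds, so glint-based tracking is used; above it, the more robust
    glint-pupil vector mode is used instead.
    """
    return "glint" if motion < motion_threshold else "glint_pupil_vector"

print(select_tracking_mode(0.1))  # glint
print(select_tracking_mode(0.9))  # glint_pupil_vector
```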
  • Patent number: 11093034
    Abstract: A method of tracking a gaze position of an eye in a target space in a field of view of the eye over an eye tracking period includes performing a plurality of scans of the eye with infrared light within the eye tracking period. Each scan includes generating infrared light signals over a scan period and projecting the infrared light signals from a plurality of virtual light projectors to the eye to form a plurality of illumination areas on the eye. Reflections of the infrared light signals from the eye are detected for each scan. The gaze position of the eye in the target space is determined from the detected reflections of the infrared light signals for each scan.
    Type: Grant
    Filed: October 30, 2019
    Date of Patent: August 17, 2021
    Assignee: Google LLC
    Inventors: Idris S. Aleem, Mayank Bhargava, Andrew S. Logan
  • Publication number: 20210223863
    Abstract: A method of tracking an eye of a user includes generating an infrared light, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye over an eye tracking period. A plurality of glints is identified from the reflections of the infrared light detected. A glint center position of each glint in a glint space is determined and transformed to a gaze position in a display space. At least once during the eye tracking period, an image of the eye is reconstructed from a portion of the reflections of the infrared light detected. A pupil is detected from the image, and a pupil center position is determined. A glint-pupil vector is determined from the pupil center position and the glint center position of at least one glint corresponding in space to the pupil. The glint space is recalibrated based on the glint-pupil vector.
    Type: Application
    Filed: April 7, 2021
    Publication date: July 22, 2021
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
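The glint-space-to-display-space transform and the recalibrating glint-pupil vector can be sketched as below; modeling the transform as affine, and the specific coefficients, are illustrative assumptions:

```python
import numpy as np

def glint_to_gaze(glint_xy: np.ndarray, A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Map a glint center from glint space to a gaze position in display space
    via an affine transform (A, b)."""
    return A @ glint_xy + b

def glint_pupil_vector(pupil_xy: np.ndarray, glint_xy: np.ndarray) -> np.ndarray:
    """Vector from a glint center to the pupil center, used to recalibrate the
    glint space when the reconstructed pupil and the glints drift apart."""
    return pupil_xy - glint_xy

A = np.eye(2) * 2.0          # illustrative scaling from glint to display space
b = np.array([5.0, -3.0])
g = np.array([1.0, 2.0])     # glint center in glint space
p = np.array([1.5, 2.5])     # pupil center from the reconstructed eye image
print(glint_to_gaze(g, A, b).tolist())    # [7.0, 1.0]
print(glint_pupil_vector(p, g).tolist())  # [0.5, 0.5]
```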
  • Patent number: 10976814
    Abstract: A method of tracking an eye of a user includes generating an infrared light, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye over an eye tracking period. A plurality of glints is identified from the reflections of the infrared light detected. A glint center position of each glint in a glint space is determined and transformed to a gaze position in a display space. At least once during the eye tracking period, an image of the eye is reconstructed from a portion of the reflections of the infrared light detected. A pupil is detected from the image, and a pupil center position is determined. A glint-pupil vector is determined from the pupil center position and the glint center position of at least one glint corresponding in space to the pupil. The glint space is recalibrated based on the glint-pupil vector.
    Type: Grant
    Filed: April 5, 2019
    Date of Patent: April 13, 2021
Assignee: Google LLC
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
  • Patent number: 10936056
    Abstract: A method of tracking an eye of a user includes generating infrared light over an eye tracking period, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye. Shifts in a position of a wearable heads-up display (WHUD) worn on the head of the user are detected during at least a portion of the eye tracking period. Glints are identified from the detected reflections of the infrared light. A drift in a glint center position of an identified glint relative to a glint space is determined based on a detected shift in position of the WHUD corresponding in space to the identified glint. The glint center position is adjusted to compensate for the drift. The adjusted glint center position is transformed from the glint space to a gaze position in a display space in a field of view of the eye.
    Type: Grant
    Filed: April 5, 2019
    Date of Patent: March 2, 2021
    Assignee: Google LLC
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
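The drift-compensation step can be sketched as subtracting the glint-center drift attributed to a detected WHUD shift before the glint-space-to-display-space transform; the linear shift-to-drift scaling and function name are illustrative assumptions:

```python
import numpy as np

def compensate_drift(glint_xy: np.ndarray, whud_shift: np.ndarray,
                     shift_to_drift: float = 1.0) -> np.ndarray:
    """Remove the glint-center drift caused by a detected WHUD shift.

    glint_xy       -- measured glint center in glint space
    whud_shift     -- detected (dx, dy) shift of the display on the head
    shift_to_drift -- illustrative scale relating WHUD shift to glint drift
    """
    return glint_xy - shift_to_drift * whud_shift

g = np.array([4.0, 6.0])
shift = np.array([0.5, -0.25])
print(compensate_drift(g, shift).tolist())  # [3.5, 6.25]
```

The adjusted glint center would then be transformed to display space as usual.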
  • Publication number: 20200258255
Abstract: An ear saddle point of a subject's ear is determined to produce specifications to fit a wearable apparatus to a subject's head. A boundary of the subject's ear in a profile image is determined using a first model. A probability map of the subject's ear is generated indicating probable locations of a two-dimensional ear saddle point. A most probable location of the two-dimensional ear saddle point is determined based on the probability map. The two-dimensional ear saddle point is projected onto a three-dimensional mesh surface representing the subject's head. A maximum depth of the three-dimensional mesh surface is determined in a defined region around the projected two-dimensional ear saddle point. The ear saddle point is computed based on the projected two-dimensional ear saddle point and the determined maximum depth. Specifications of a wearable apparatus are generated to fit the apparatus to the subject's head based on the computed ear saddle point.
    Type: Application
    Filed: February 11, 2020
    Publication date: August 13, 2020
    Inventors: Idris Aleem, Mayank Bhargava, Rees Simmons
  • Publication number: 20200142479
    Abstract: A method of tracking a gaze position of an eye in a target space in a field of view of the eye over an eye tracking period includes performing a plurality of scans of the eye with infrared light within the eye tracking period. Each scan includes generating infrared light signals over a scan period and projecting the infrared light signals from a plurality of virtual light projectors to the eye to form a plurality of illumination areas on the eye. Reflections of the infrared light signals from the eye are detected for each scan. The gaze position of the eye in the target space is determined from the detected reflections of the infrared light signals for each scan.
    Type: Application
    Filed: October 30, 2019
    Publication date: May 7, 2020
    Inventors: Idris S. Aleem, Mayank Bhargava, Andrew S. Logan
  • Patent number: 10579141
    Abstract: Systems, methods and articles that provide dynamic calibration of eye tracking systems for wearable heads-up displays (WHUDs). The eye tracking system may determine a user's gaze location on a display of the WHUD utilizing a calibration point model that includes a plurality of calibration points. During regular use of the WHUD by the user, the calibration point model may be dynamically updated based on the user's interaction with user interface (UI) elements presented on the display. The UI elements may be specifically designed (e.g., shaped, positioned, displaced) to provide in-use and on-going dynamic calibration of the eye tracking system, which in at least some implementations may be unnoticeable to the user.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: March 3, 2020
    Assignee: North Inc.
    Inventors: Idris S. Aleem, Mayank Bhargava, Dylan Jacobs
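The in-use update of the calibration point model can be sketched as a least-squares refit: each time the user interacts with a UI element, the measured gaze sample is paired with the element's known display position, and the gaze mapping is refit over the accumulated pairs. The affine model and pairing scheme here are illustrative assumptions, not the patent's specific method:

```python
import numpy as np

def fit_calibration(raw: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Fit an affine map from raw gaze samples to known UI-element positions.

    raw    -- (N, 2) measured gaze positions at moments of UI interaction
    target -- (N, 2) display positions of the UI elements interacted with
    Returns a (3, 2) matrix M so that [x, y, 1] @ M approximates the target.
    """
    ones = np.ones((raw.shape[0], 1))
    X = np.hstack([raw, ones])                # homogeneous coordinates
    M, *_ = np.linalg.lstsq(X, target, rcond=None)
    return M

# Illustrative calibration points: targets are raw samples shifted by (1, 2)
raw = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
target = raw + np.array([1.0, 2.0])
M = fit_calibration(raw, target)
pred = np.array([[2.0, 2.0, 1.0]]) @ M        # predict gaze for a new sample
print(np.round(pred, 3).tolist())  # [[3.0, 4.0]]
```

Refitting on every interaction keeps the model current without an explicit calibration session, which is what lets the recalibration stay unnoticeable to the user.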
  • Patent number: 10459220
    Abstract: Systems, devices, and methods that use elements of a scanning laser projector (“SLP”) to determine the gaze direction of a user of a wearable heads-up display (“WHUD”) are described. An infrared laser diode is added to an RGB SLP and an infrared photodetector is aligned to detect reflections of the infrared light from the eye. A scan mirror in the SLP sweeps through a range of orientations and the intensities of reflections of the infrared light are monitored by a processor to determine when a spectral reflection or “glint” is produced. The processor determines the orientation of the scan mirror that produced the glint and maps the scan mirror orientation to a region in the field of view of the eye of the user, such as a region in visible display content projected by the WHUD, to determine the gaze direction of the user.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: October 29, 2019
    Assignee: North Inc.
    Inventors: Idris S. Aleem, Mayank Bhargava
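The mirror-orientation-to-gaze mapping can be sketched as follows: find the mirror orientation at which the infrared photodetector intensity peaks (the glint), then look up the display region for that orientation. The uniform sampling, equal-width region lookup, and function name are illustrative assumptions:

```python
import numpy as np

def gaze_region(orientations: np.ndarray, intensities: np.ndarray,
                n_regions: int = 4) -> int:
    """Map the glint-producing scan-mirror orientation to a display region.

    orientations -- mirror angles (radians) swept during one scan
    intensities  -- IR photodetector readings, one per orientation
    The sweep range is divided into n_regions equal display regions.
    """
    glint_angle = orientations[np.argmax(intensities)]  # orientation of the glint
    lo, hi = orientations.min(), orientations.max()
    frac = (glint_angle - lo) / (hi - lo)
    return min(int(frac * n_regions), n_regions - 1)

angles = np.linspace(-0.2, 0.2, 9)
readings = np.array([1, 1, 2, 9, 2, 1, 1, 1, 1])  # spectral peak at angles[3]
print(gaze_region(angles, readings))  # 1
```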
  • Publication number: 20190324532
Abstract: A method of tracking an eye of a user on a wearable heads-up display (WHUD) worn on the head of the user includes generating infrared light over an eye tracking period, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye. A motion parameter that is sensitive to motion of the WHUD is measured. Eye tracking is performed in a first mode that is based on glint for values of the motion parameter that fall within a first range of motion parameter values for which an error in measurement of glint position is below an error threshold. Eye tracking is performed in a second mode that is based on glint-pupil vector for values of the motion parameter that fall within a second range of motion parameter values for which an error in measurement of glint position exceeds the error threshold. A head-mounted apparatus with eye tracking is disclosed.
    Type: Application
    Filed: April 5, 2019
    Publication date: October 24, 2019
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
  • Publication number: 20190317598
    Abstract: A method of tracking an eye of a user includes generating infrared light over an eye tracking period, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye. Shifts in a position of a wearable heads-up display (WHUD) worn on the head of the user are detected during at least a portion of the eye tracking period. Glints are identified from the detected reflections of the infrared light. A drift in a glint center position of an identified glint relative to a glint space is determined based on a detected shift in position of the WHUD corresponding in space to the identified glint. The glint center position is adjusted to compensate for the drift. The adjusted glint center position is transformed from the glint space to a gaze position in a display space in a field of view of the eye.
    Type: Application
    Filed: April 5, 2019
    Publication date: October 17, 2019
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
  • Publication number: 20190317597
    Abstract: A method of tracking an eye of a user includes generating an infrared light, scanning the infrared light over the eye, and detecting reflections of the infrared light from the eye over an eye tracking period. A plurality of glints is identified from the reflections of the infrared light detected. A glint center position of each glint in a glint space is determined and transformed to a gaze position in a display space. At least once during the eye tracking period, an image of the eye is reconstructed from a portion of the reflections of the infrared light detected. A pupil is detected from the image, and a pupil center position is determined. A glint-pupil vector is determined from the pupil center position and the glint center position of at least one glint corresponding in space to the pupil. The glint space is recalibrated based on the glint-pupil vector.
    Type: Application
    Filed: April 5, 2019
    Publication date: October 17, 2019
    Inventors: Idris S. Aleem, Andrew S. Logan, Mayank Bhargava
  • Patent number: 10409057
    Abstract: Systems, devices, and methods that use elements of a scanning laser projector (“SLP”) to determine the gaze direction of a user of a wearable heads-up display (“WHUD”) are described. An infrared laser diode is added to an RGB SLP and an infrared photodetector is aligned to detect reflections of the infrared light from the eye. A scan mirror in the SLP sweeps through a range of orientations and the intensities of reflections of the infrared light are monitored by a processor to determine when a spectral reflection or “glint” is produced. The processor determines the orientation of the scan mirror that produced the glint and maps the scan mirror orientation to a region in the field of view of the eye of the user, such as a region in visible display content projected by the WHUD, to determine the gaze direction of the user.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: September 10, 2019
Assignee: North Inc.
    Inventors: Idris S. Aleem, Mayank Bhargava
  • Publication number: 20190018480
    Abstract: Systems, methods and articles that provide dynamic calibration of eye tracking systems for wearable heads-up displays (WHUDs). The eye tracking system may determine a user's gaze location on a display of the WHUD utilizing a calibration point model that includes a plurality of calibration points. During regular use of the WHUD by the user, the calibration point model may be dynamically updated based on the user's interaction with user interface (UI) elements presented on the display. The UI elements may be specifically designed (e.g., shaped, positioned, displaced) to provide in-use and on-going dynamic calibration of the eye tracking system, which in at least some implementations may be unnoticeable to the user.
    Type: Application
    Filed: July 16, 2018
    Publication date: January 17, 2019
    Inventors: Idris S. Aleem, Mayank Bhargava, Dylan Jacobs
  • Publication number: 20190018485
    Abstract: Systems, methods and articles that provide dynamic calibration of eye tracking systems for wearable heads-up displays (WHUDs). The eye tracking system may determine a user's gaze location on a display of the WHUD utilizing a calibration point model that includes a plurality of calibration points. During regular use of the WHUD by the user, the calibration point model may be dynamically updated based on the user's interaction with user interface (UI) elements presented on the display. The UI elements may be specifically designed (e.g., shaped, positioned, displaced) to provide in-use and on-going dynamic calibration of the eye tracking system, which in at least some implementations may be unnoticeable to the user.
    Type: Application
    Filed: July 16, 2018
    Publication date: January 17, 2019
    Inventors: Idris S. Aleem, Mayank Bhargava, Dylan Jacobs
  • Publication number: 20190018482
    Abstract: Systems, methods and articles that provide dynamic calibration of eye tracking systems for wearable heads-up displays (WHUDs). The eye tracking system may determine a user's gaze location on a display of the WHUD utilizing a calibration point model that includes a plurality of calibration points. During regular use of the WHUD by the user, the calibration point model may be dynamically updated based on the user's interaction with user interface (UI) elements presented on the display. The UI elements may be specifically designed (e.g., shaped, positioned, displaced) to provide in-use and on-going dynamic calibration of the eye tracking system, which in at least some implementations may be unnoticeable to the user.
    Type: Application
    Filed: July 16, 2018
    Publication date: January 17, 2019
    Inventors: Idris S. Aleem, Mayank Bhargava, Dylan Jacobs
  • Publication number: 20190018483
    Abstract: Systems, methods and articles that provide dynamic calibration of eye tracking systems for wearable heads-up displays (WHUDs). The eye tracking system may determine a user's gaze location on a display of the WHUD utilizing a calibration point model that includes a plurality of calibration points. During regular use of the WHUD by the user, the calibration point model may be dynamically updated based on the user's interaction with user interface (UI) elements presented on the display. The UI elements may be specifically designed (e.g., shaped, positioned, displaced) to provide in-use and on-going dynamic calibration of the eye tracking system, which in at least some implementations may be unnoticeable to the user.
    Type: Application
    Filed: July 16, 2018
    Publication date: January 17, 2019
    Inventors: Idris S. Aleem, Mayank Bhargava, Dylan Jacobs