Patents by Inventor Walter Nistico

Walter Nistico has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240069631
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: May 5, 2023
    Publication date: February 29, 2024
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
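
The calibration method above fits parameters of a predefined model by comparing gaze points against the known stimulus trajectory. A minimal sketch of that idea, assuming a simple affine correction model fitted by least squares (all function names are illustrative, not from the patent):

```python
# Hedged sketch: fit an affine gaze correction so mapped gaze points land on
# the stimulus trajectory samples captured at the same times.
import numpy as np

def fit_affine_calibration(gaze_xy, stimulus_xy):
    """Least-squares fit of corrected = [x, y, 1] @ params."""
    X = np.hstack([gaze_xy, np.ones((len(gaze_xy), 1))])
    params, *_ = np.linalg.lstsq(X, stimulus_xy, rcond=None)
    return params  # 3x2: rows 0-1 are the linear map, row 2 the offset

def apply_calibration(params, gaze_xy):
    X = np.hstack([gaze_xy, np.ones((len(gaze_xy), 1))])
    return X @ params

# Usage: stimulus moves along a line; raw gaze is scaled and offset.
t = np.linspace(0, 1, 50)
stimulus = np.column_stack([t, 0.5 * t])
raw_gaze = stimulus * 1.1 + np.array([0.02, -0.03])
P = fit_affine_calibration(raw_gaze, stimulus)
print(np.abs(apply_calibration(P, raw_gaze) - stimulus).max())  # ~0
```
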
  • Patent number: 11856305
    Abstract: In one implementation, an event sensor includes a plurality of pixels and an event compiler. The plurality of pixels are positioned to receive light from a scene disposed within a field of view of the event sensor. Each pixel is configured to have an operational state that is modified by control signals generated by a respective state circuit. The event compiler is configured to output a stream of pixel events. Each respective pixel event corresponds to a breach of a comparator threshold related to an intensity of incident illumination. Each control signal is generated based on feedback information that is received from an image pipeline configured to consume image data derived from the stream of pixel events.
    Type: Grant
    Filed: August 13, 2021
    Date of Patent: December 26, 2023
    Assignee: Apple Inc.
    Inventors: Emanuele Mandelli, Walter Nistico
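
A rough sketch of the event-sensor loop this abstract describes: pixels with per-pixel comparator thresholds, an event compiler emitting (pixel, timestamp, polarity) tuples, and a feedback hook that retunes thresholds. The data structures and the feedback policy are hypothetical:

```python
# Hedged sketch of an event sensor with feedback-adjustable pixel state.
from dataclasses import dataclass

@dataclass
class Pixel:
    reference: float      # last intensity that produced an event
    threshold: float      # comparator threshold, set by feedback
    active: bool = True   # operational state set by a state circuit

def compile_events(pixels, frame, timestamp):
    """Emit (x, timestamp, polarity) for every comparator-threshold breach."""
    events = []
    for x, (pix, intensity) in enumerate(zip(pixels, frame)):
        if not pix.active:
            continue
        delta = intensity - pix.reference
        if abs(delta) >= pix.threshold:
            events.append((x, timestamp, 1 if delta > 0 else -1))
            pix.reference = intensity  # reset reference after the event
    return events

def apply_feedback(pixels, noisy_indices):
    # Illustrative policy: raise thresholds on pixels the image pipeline
    # flagged as noisy, suppressing spurious events.
    for i in noisy_indices:
        pixels[i].threshold *= 2.0

pixels = [Pixel(reference=0.0, threshold=0.1) for _ in range(8)]
print(compile_events(pixels, [0.0, 0.2, 0.05, -0.3, 0, 0, 0, 0], timestamp=1))
apply_feedback(pixels, noisy_indices=[1])
```
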
  • Patent number: 11831981
    Abstract: In one implementation, a system includes an event sensor with a pixel array and an image pipeline. The pixel array is configured to operate a first subset of pixels in an active state and a second subset of pixels in an inactive state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a specific pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data. The feedback information causes a pixel within the first subset of pixels to transition from the active state to another state.
    Type: Grant
    Filed: August 12, 2021
    Date of Patent: November 28, 2023
    Assignee: Apple Inc.
    Inventor: Walter Nistico
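
The feedback mechanism above can be pictured as the image pipeline deciding which pixels should stay active. A hedged sketch, assuming the pipeline supplies a region-of-interest mask (the mask-based policy is illustrative, not from the patent):

```python
# Hedged sketch: pipeline feedback moves pixels between active and
# inactive states based on a region of interest.
import numpy as np

def feedback_transitions(active_mask, roi_mask):
    """Pixels that are active but outside the ROI should leave the active
    state; inactive pixels inside the ROI should be woken up."""
    deactivate = np.flatnonzero(active_mask & ~roi_mask)
    activate = np.flatnonzero(~active_mask & roi_mask)
    return deactivate, activate

active = np.array([True, True, True, False, False, True])
roi    = np.array([True, False, True, True, False, False])
print(feedback_transitions(active, roi))  # deactivate [1 5], activate [3]
```
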
  • Publication number: 20230333371
    Abstract: An eye tracking device includes a head-mountable frame, an optical sensor subsystem mounted to the head-mountable frame, and a processor. The optical sensor subsystem includes a set of one or more SMI sensors. The processor is configured to operate the optical sensor subsystem to cause the set of one or more SMI sensors to emit a set of one or more beams of light toward an eye of a user; to receive a set of one or more SMI signals from the set of one or more SMI sensors; and to track a movement of the eye using the set of one or more SMI signals.
    Type: Application
    Filed: February 3, 2023
    Publication date: October 19, 2023
    Inventors: Tong Chen, Mark T. Winkler, Edward Vail, Xibin Zhou, Ahmet Fatih Cihan, Walter Nistico
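
For background, a self-mixing interferometry (SMI) signal from a surface moving along the beam carries a Doppler beat frequency f = 2v/λ. The sketch below estimates along-beam velocity from that beat frequency on synthetic data; it is a generic SMI illustration, not the patent's processing chain:

```python
# Hedged sketch: along-beam velocity from the dominant SMI beat frequency.
import numpy as np

def smi_velocity(signal, sample_rate_hz, wavelength_m):
    """Estimate velocity from the strongest frequency in an SMI trace."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    beat = freqs[np.argmax(spectrum)]
    return beat * wavelength_m / 2.0

# Synthetic trace: surface moving at 5 mm/s, 940 nm laser, 1 MHz sampling.
fs, lam, v = 1e6, 940e-9, 5e-3
t = np.arange(4096) / fs
trace = np.cos(2 * np.pi * (2 * v / lam) * t)
print(smi_velocity(trace, fs, lam))  # ~0.005 m/s
```
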
  • Patent number: 11644896
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: March 29, 2022
    Date of Patent: May 9, 2023
    Assignee: Apple Inc.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20220214747
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: March 29, 2022
    Publication date: July 7, 2022
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20220139084
    Abstract: Various implementations disclosed herein include devices, systems, and methods that obtain information from a first electronic device (e.g., to improve interactions for a CGR environment). In some implementations, at a second electronic device having a processor, event camera data is obtained corresponding to modulated light (e.g., light including an information signal in its modulation) emitted by an optical source on the first electronic device and received at an event camera. The second electronic device is able to identify information from the first electronic device by detecting a modulation pattern of the modulated light in the event camera data. In some implementations, the second electronic device is the same electronic device that has the event camera (e.g., a laptop) or a different electronic device that receives the event data (e.g., a server receiving the event data from a laptop computer).
    Type: Application
    Filed: January 20, 2022
    Publication date: May 5, 2022
    Inventors: Emanuele Mandelli, Sai Harsha Jandhyala, Walter Nistico
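
One way to picture the decoding step: an event camera reports polarity edges when the source turns on or off, so the receiver can reconstruct an on-off-keyed bit pattern from edge timestamps. The slot-based decoder below is a hypothetical illustration, not the patent's scheme:

```python
# Hedged sketch: reconstruct an on-off-keyed bit pattern from event edges.
def decode_ook(events, bit_period_us, n_bits):
    """events: (timestamp_us, polarity) pairs. +1 marks turn-on, -1
    turn-off; the light's state is sampled once per bit slot."""
    bits, state, i = [], 0, 0
    ev = sorted(events)
    for slot in range(n_bits):
        # Apply all edges that occur up to the middle of this slot.
        mid = slot * bit_period_us + bit_period_us / 2
        while i < len(ev) and ev[i][0] <= mid:
            state = 1 if ev[i][1] > 0 else 0
            i += 1
        bits.append(state)
    return bits

# Pattern 1,0,1,1,0 at 1 ms per bit: edges near the slot boundaries.
events = [(5, +1), (1002, -1), (2004, +1), (4001, -1)]
print(decode_ook(events, bit_period_us=1000, n_bits=5))  # [1, 0, 1, 1, 0]
```
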
  • Patent number: 11320904
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: March 9, 2021
    Date of Patent: May 3, 2022
    Assignee: Apple Inc.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20220061660
    Abstract: The invention relates to a method for determining at least one parameter of two eyes (10l, 10r) of a test person (31), the method comprising the following steps: optically capturing a first eye (10l; 10r) of the two eyes (10l, 10r) by means of a first capturing unit (3l; 3r); optically capturing the second eye (10r; 10l) of the two eyes (10l, 10r) by means of a second capturing unit (3r; 3l); transmitting first signals concerning the captured first eye (10l; 10r) from the first capturing unit (3l; 3r) to an analysis unit (27) and transmitting second signals concerning the captured second eye (10r; 10l) from the second capturing unit (3r; 3l) to the analysis unit (27); and determining the at least one parameter of the two eyes (10l, 10r) on the basis of the transmitted first and second signals in the analysis unit (27), characterized by the following step: setting a first data rate for the first signals and a second data rate for the second signals, wherein the first and the second data rates differ from each other.
    Type: Application
    Filed: September 9, 2021
    Publication date: March 3, 2022
    Inventors: Walter Nistico, Jan Hoffmann, Eberhard Schmidt
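
The characterizing step amounts to configuring the two per-eye capture units with unequal data rates. A trivial sketch of such a configuration (the class and field names are invented for illustration):

```python
# Hedged sketch: two per-eye capture units with different data rates,
# e.g. a high rate for the eye driving interaction and a lower rate for
# the other eye to save bandwidth and power.
from dataclasses import dataclass

@dataclass
class CaptureUnit:
    eye: str
    data_rate_hz: float

    def samples_in(self, seconds: float) -> int:
        return int(self.data_rate_hz * seconds)

left, right = CaptureUnit("left", 120.0), CaptureUnit("right", 30.0)
assert left.data_rate_hz != right.data_rate_hz  # rates differ, per the claim
print(left.samples_in(1.0), right.samples_in(1.0))  # 120 30
```
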
  • Publication number: 20210378509
    Abstract: Various implementations determine a pupil characteristic (e.g., perimeter location, pupil shape, pupil diameter, pupil center, etc.) using on-axis illumination. On-axis illumination involves producing light from a light source that is approximately on-axis with an image sensor configured to capture reflections of that light off the eye. The light from the light source may pass through the pupil into the eye and reflect off the retina to produce a bright pupil-type light pattern in data obtained by the image sensor. The light may be pulsed at a frequency so that frequency segmentation can be used to distinguish reflections through the pupil off the retina from reflections of light from other light sources. In some implementations, the image sensor is an event camera that detects events. The pupil characteristics may be assessed by identifying events that recur in a given area at the given frequency.
    Type: Application
    Filed: August 25, 2021
    Publication date: December 9, 2021
    Inventor: Walter Nistico
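
The frequency-segmentation idea can be sketched as keeping only pixels whose events recur at roughly the illumination pulse rate. A toy version, assuming per-pixel event timestamp lists (the threshold and names are illustrative):

```python
# Hedged sketch: segment pupil pixels by event rate near the pulse frequency.
import numpy as np

def frequency_segment(event_times_per_pixel, pulse_hz, window_s, tol=0.2):
    """Keep pixels whose event rate over the window is within `tol`
    (relative) of the expected illumination pulse frequency."""
    keep = []
    for idx, times in enumerate(event_times_per_pixel):
        rate = len(times) / window_s
        if abs(rate - pulse_hz) <= tol * pulse_hz:
            keep.append(idx)
    return keep

pulse, window = 200.0, 0.1  # 200 Hz pulsing, 0.1 s of events
pixels = [
    np.linspace(0, window, int(pulse * window), endpoint=False),  # retinal return
    np.random.uniform(0, window, 3),                              # ambient clutter
]
print(frequency_segment(pixels, pulse, window))  # [0]
```
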
  • Publication number: 20210377512
    Abstract: In one implementation, a method includes receiving pixel events output by an event sensor that correspond to a feature disposed within a field of view of the event sensor. Each respective pixel event is generated in response to a specific pixel within a pixel array of the event sensor detecting a change in light intensity that exceeds a comparator threshold. A characteristic of the feature is determined at a first time based on the pixel events and a previous characteristic of the feature at a second time that precedes the first time. Movement of the feature relative to the event sensor is tracked over time based on the characteristic and the previous characteristic.
    Type: Application
    Filed: August 13, 2021
    Publication date: December 2, 2021
    Inventor: Walter Nistico
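
A characteristic determined from new events plus its previous value suggests a recursive update. The sketch below blends the centroid of an event burst with the prior feature center; the exponential-smoothing choice is an assumption, not the patent's estimator:

```python
# Hedged sketch: track a feature's position from sparse event bursts.
import numpy as np

def update_feature(events_xy, prev_center, alpha=0.6):
    """Blend the centroid of the latest events with the previous feature
    center; fall back to the previous center when no events arrive."""
    if len(events_xy) == 0:
        return prev_center
    centroid = np.mean(events_xy, axis=0)
    return alpha * centroid + (1 - alpha) * prev_center

prev = np.array([10.0, 10.0])
burst = np.array([[12.0, 11.0], [13.0, 10.0], [12.5, 10.5]])
print(update_feature(burst, prev))  # moves toward the event centroid
```
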
  • Publication number: 20210377465
    Abstract: In one implementation, an event sensor includes a plurality of pixels and an event compiler. The plurality of pixels are positioned to receive light from a scene disposed within a field of view of the event sensor. Each pixel is configured to have an operational state that is modified by control signals generated by a respective state circuit. The event compiler is configured to output a stream of pixel events. Each respective pixel event corresponds to a breach of a comparator threshold related to an intensity of incident illumination. Each control signal is generated based on feedback information that is received from an image pipeline configured to consume image data derived from the stream of pixel events.
    Type: Application
    Filed: August 13, 2021
    Publication date: December 2, 2021
    Inventors: Emanuele Mandelli, Walter Nistico
  • Publication number: 20210377453
    Abstract: In one implementation, a system includes an event sensor with a pixel array and an image pipeline. The pixel array is configured to operate a first subset of pixels in an active state and a second subset of pixels in an inactive state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a specific pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data. The feedback information causes a pixel within the first subset of pixels to transition from the active state to another state.
    Type: Application
    Filed: August 12, 2021
    Publication date: December 2, 2021
    Inventor: Walter Nistico
  • Publication number: 20210334992
    Abstract: Various implementations disclosed herein include techniques for estimating depth using sensor data indicative of changes in light intensity. In one implementation, a method includes acquiring pixel events output by an event sensor that correspond to a scene disposed within a field of view of the event sensor. Each respective pixel event is generated in response to a specific pixel sensor within a pixel array of the event sensor detecting a change in light intensity that exceeds a comparator threshold. Mapping data is generated by correlating the pixel events with multiple illumination patterns projected by an optical system towards the scene. Depth data is determined for the scene relative to a reference position based on the mapping data.
    Type: Application
    Filed: April 21, 2021
    Publication date: October 28, 2021
    Inventor: Walter Nistico
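
As a generic structured-light analogy (not necessarily the patent's method): if each projector column flashes a unique temporal binary code, a pixel's event sequence identifies the column that illuminated it, and depth follows by triangulation:

```python
# Hedged sketch: temporal binary codes map events to projector columns;
# depth comes from the projector-camera disparity.
def decode_column(bit_sequence):
    """Interpret a pixel's per-pattern on/off observations as a binary code."""
    return int("".join(str(b) for b in bit_sequence), 2)

def depth_from_disparity(pixel_col, proj_col, baseline_m, focal_px):
    disparity = pixel_col - proj_col
    return baseline_m * focal_px / disparity if disparity else float("inf")

# Pixel at sensor column 40 observed pattern 0,1,1 -> projector column 3.
proj_col = decode_column([0, 1, 1])
print(depth_from_disparity(40, proj_col, baseline_m=0.05, focal_px=600))
```
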
  • Publication number: 20210223862
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: March 9, 2021
    Publication date: July 22, 2021
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Patent number: 11009945
    Abstract: The invention relates to a method for operating an eye tracking device (10) for multi-user eye tracking, wherein images (24) of a predefined capturing area (14) of the eye tracking device (10) are captured by means of an imaging device (12) of the eye tracking device (10) and the captured images (24) are processed by means of a processing unit (16) of the eye tracking device (10). If a first user (26a) and a second user (26b) are present in the predefined capturing area (14) of the eye tracking device (10), first information relating to the first user (26a) and second information relating to the second user (26b) are determined on the basis of the captured images (24) by processing the images (24). Furthermore, the images (24) are captured successively in a predeterminable time sequence.
    Type: Grant
    Filed: July 30, 2019
    Date of Patent: May 18, 2021
    Assignee: Apple Inc.
    Inventors: Fabian Wanner, Matthias Nieser, Kun Liu, Walter Nistico
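
One way to read the claim: frames are captured in a set time sequence, and per-user information is extracted from each frame in which a user appears. A toy sketch with stand-in detection and extraction callbacks (all names hypothetical):

```python
# Hedged sketch: accumulate per-user information over a frame sequence.
def process_frames(frames, detect_users, extract_info):
    """frames: iterable of images; detect_users(img) -> list of user ids;
    extract_info(img, user) -> information for that user in that frame."""
    history = {}
    for img in frames:
        for user in detect_users(img):
            history.setdefault(user, []).append(extract_info(img, user))
    return history

# Toy stand-ins: "frames" are strings listing who is visible.
frames = ["AB", "A", "AB"]
result = process_frames(
    frames,
    detect_users=lambda img: list(img),
    extract_info=lambda img, user: f"gaze({user})@{img}",
)
print(result)  # per-user information accumulated over the sequence
```
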
  • Patent number: 11003245
    Abstract: The invention relates to a method for automatically identifying at least one user of an eye tracking device (10) by means of the eye tracking device (10), wherein user identification data of the at least one user are captured by a capturing device (14) of the eye tracking device (10). Under the first condition that at least one profile (P1, P2, P3) with associated identification data (I1, I2, I3) of at least one specific user is stored in a storage medium (20), the stored identification data (I1, I2, I3) of the at least one profile (P1, P2, P3) are compared with the captured user identification data, and under the second condition that the captured user identification data match the stored identification data (I1, I2, I3) according to a predefined criterion, the at least one user is identified as the at least one specific user, for whom the at least one profile (P1, P2, P3) was stored.
    Type: Grant
    Filed: October 29, 2019
    Date of Patent: May 11, 2021
    Assignee: Apple Inc.
    Inventors: Kun Liu, Fabian Wanner, Walter Nistico, Matthias Nieser
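
The identification step compares captured data against stored profile data under a predefined matching criterion. A hedged sketch using cosine similarity with a threshold as that criterion (the similarity measure is an assumption, not from the patent):

```python
# Hedged sketch: match captured identification data to stored profiles.
import numpy as np

def identify(captured, profiles, threshold=0.9):
    """profiles: {name: feature_vector}. Return the best-matching profile
    name if cosine similarity clears the threshold, else None."""
    best_name, best_sim = None, -1.0
    for name, stored in profiles.items():
        sim = np.dot(captured, stored) / (
            np.linalg.norm(captured) * np.linalg.norm(stored))
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name if best_sim >= threshold else None

profiles = {"P1": np.array([1.0, 0.0, 0.2]), "P2": np.array([0.1, 1.0, 0.9])}
print(identify(np.array([0.95, 0.05, 0.25]), profiles))  # "P1"
```
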
  • Patent number: 10976813
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) at least temporarily moves along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined from the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: June 12, 2017
    Date of Patent: April 13, 2021
    Assignee: Apple Inc.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20210068652
    Abstract: Various implementations determine gaze direction based on a cornea center and (a) a pupil center or (b) an eyeball center. The cornea center is determined using a directional light source to produce one or more glints reflected from the surface of the eye and captured by a sensor. The angle (e.g., direction) of the light from the directional light source may be known, for example, using an encoder that records the orientation of the light source. The known direction of the light source facilitates determining the distance of the glint on the cornea and enables the cornea position to be determined, for example, based on a single glint. The cornea center can be determined (e.g., using an average cornea radius, a previously measured cornea radius, or information from a second glint). The cornea center and a pupil center or eyeball center may be used to determine gaze direction.
    Type: Application
    Filed: September 2, 2020
    Publication date: March 11, 2021
    Inventor: Walter Nistico
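
Once the cornea center and pupil center are known in 3D, the optical-axis gaze direction is the unit vector from one through the other; this is the standard geometric model the last sentence of the abstract alludes to:

```python
# Hedged sketch: optical-axis gaze direction from two 3D points.
import numpy as np

def gaze_direction(cornea_center, pupil_center):
    """Unit vector from the cornea center through the pupil center."""
    v = np.asarray(pupil_center, float) - np.asarray(cornea_center, float)
    return v / np.linalg.norm(v)

print(gaze_direction([0.0, 0.0, 0.0], [0.5, 0.1, 4.0]))
```
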
  • Patent number: 10893801
    Abstract: The invention relates to an eye tracking system (10) and a method for operating an eye tracking system (10) for determining whether one of the left and right eyes of a user is dominant, wherein at least one image of the left and the right eye of the user is captured (S30); based on the at least one image and according to a predefined accuracy function, a left accuracy score for the left eye (S34a) and a right accuracy score for the right eye (S34c) are determined; and it is determined whether one of the left and the right eye of the user is dominant depending on at least the left and the right accuracy scores (S36). Thereby, user-specific properties relating to the user's eyes can be provided and considered when performing eye tracking, so that the robustness and accuracy of eye tracking can be enhanced.
    Type: Grant
    Filed: June 14, 2018
    Date of Patent: January 19, 2021
    Assignee: Apple Inc.
    Inventors: Kun Liu, Walter Nistico, Hao Qin
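
A plausible reading of the accuracy-score comparison: score each eye by its mean gaze error against known targets and declare dominance only when one score beats the other by a margin. The accuracy function and margin below are assumptions, not the patent's definitions:

```python
# Hedged sketch: per-eye accuracy scores and a margin-based dominance test.
import numpy as np

def accuracy_score(gaze_points, targets):
    """Lower mean Euclidean error -> higher (less negative) score."""
    errors = np.linalg.norm(np.asarray(gaze_points) - np.asarray(targets), axis=1)
    return -np.mean(errors)

def dominant_eye(left_score, right_score, margin=0.05):
    if left_score - right_score > margin:
        return "left"
    if right_score - left_score > margin:
        return "right"
    return None  # no clear dominance

targets = [[0, 0], [1, 0], [0, 1]]
left = accuracy_score([[0.01, 0.02], [0.98, 0.01], [0.02, 1.01]], targets)
right = accuracy_score([[0.2, 0.1], [1.2, 0.2], [0.2, 0.8]], targets)
print(dominant_eye(left, right))  # "left"
```
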