Patents by Inventor Walter Nistico
Walter Nistico has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250113998
Abstract: Various implementations determine gaze direction based on a cornea center and (a) a pupil center or (b) an eyeball center. The cornea center is determined using a directional light source to produce one or more glints reflected from the surface of the eye and captured by a sensor. The angle (e.g., direction) of the light from the directional light source may be known, for example, using an encoder that records the orientation of the light source. The known direction of the light source facilitates determining the distance of the glint on the cornea and enables the cornea position to be determined, for example, based on a single glint. The cornea center can be determined (e.g., using an average cornea radius, a previously measured cornea radius, or information from a second glint). The cornea center and a pupil center or eyeball center may be used to determine gaze direction.
Type: Application
Filed: December 13, 2024
Publication date: April 10, 2025
Inventor: Walter Nistico
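Once the cornea center and pupil center are known, the final step the abstract describes reduces to simple vector geometry. A minimal Python sketch, assuming both centers have already been estimated in a shared 3D coordinate system (all names and values here are illustrative, not from the patent):

```python
import numpy as np

def gaze_direction(cornea_center: np.ndarray, pupil_center: np.ndarray) -> np.ndarray:
    """Gaze (optical axis) as the unit vector from the cornea center
    through the pupil center; an eyeball center could be paired with
    the cornea center in the same way."""
    axis = pupil_center - cornea_center
    return axis / np.linalg.norm(axis)

# Illustrative positions in millimeters, sensor coordinates.
cornea = np.array([1.2, -0.4, 35.0])
pupil = np.array([1.5, -0.1, 31.4])
print(gaze_direction(cornea, pupil))
```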
-
Patent number: 12248626
Abstract: Various implementations disclosed herein include devices, systems, and methods that perform gaze tracking using combinations of ultrasound and imaging data. Some implementations detect a first attribute of an eye using imaging measurements in a first image of the eye, and a first location associated with the first attribute is determined in a 3D coordinate system based on depth information. A second attribute of the eye is detected based on depth information from ultrasound-based measurements of the eye, and a second location associated with the second attribute is determined in the 3D coordinate system based on the depth information and a 3D model of a portion of the eye. A gaze direction in the 3D coordinate system is determined based on the first location and the second location. In some implementations, user characteristics are identified (e.g., authentication) using ultrasonic biometric characteristics with imaging biometrics.
Type: Grant
Filed: August 5, 2022
Date of Patent: March 11, 2025
Assignee: Apple Inc.
Inventors: Walter Nistico, Patrick R. Gill
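The fusion step amounts to placing one attribute from imaging (with depth) and one from ultrasound into a common 3D frame, then taking the line through them. A hedged sketch; the pinhole intrinsics (fx, fy, cx, cy) and the specific attribute choices are assumptions for illustration:

```python
import numpy as np

def backproject(u, v, depth_mm, fx, fy, cx, cy):
    """Lift an image-space detection (first attribute) into 3D camera
    coordinates using a pinhole model and its associated depth."""
    return np.array([(u - cx) * depth_mm / fx,
                     (v - cy) * depth_mm / fy,
                     depth_mm])

def gaze_from_locations(first_loc, second_loc):
    """Gaze direction as the unit vector through the two attribute
    locations (e.g., pupil center from imaging, cornea apex from
    ultrasound ranging plus an eye model)."""
    g = first_loc - second_loc
    return g / np.linalg.norm(g)

first = backproject(412, 305, 32.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
second = np.array([4.8, 3.3, 35.5])  # placeholder ultrasound-derived point
print(gaze_from_locations(first, second))
```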
-
Publication number: 20240398224
Abstract: Various implementations disclosed herein include devices, systems, and methods that track a state of a user's eye (e.g., position/orientation, gaze direction, accommodation, pupil dilation, etc.) using coherence-based measurement (e.g., optical coherence tomography (OCT)). The coherence-based measurement may provide sub-surface information, e.g., depth, cross section, or a volumetric model of the eye, based on reflections/scattering of light (e.g., using relatively long wavelength light to penetrate into the eye tissue).
Type: Application
Filed: September 20, 2022
Publication date: December 5, 2024
Inventors: Walter Nistico, Avery L. Wang, Edward Vail, Huiyang Deng, Samuel Steven, Seyedeh Mahsa Kamali, Tong Chen, Xibin Zhou
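The abstract does not say which coherence-based technique is used; in spectral-domain OCT, one common variant, the sub-surface depth profile falls out of a Fourier transform of the interference spectrum. A toy Python sketch on synthetic data, under that assumption:

```python
import numpy as np

def a_scan(spectrum: np.ndarray) -> np.ndarray:
    """Toy spectral-domain OCT A-scan: magnitude of the Fourier
    transform of a background-subtracted interference spectrum
    (assumed uniformly sampled in wavenumber k). Peaks mark the
    depths of sub-surface reflectors; real pipelines add windowing
    and k-space resampling."""
    return np.abs(np.fft.rfft(spectrum - spectrum.mean()))

# Synthetic spectrum: two reflectors -> two fringe frequencies.
k = np.linspace(0.0, 1.0, 2048, endpoint=False)
spectrum = 1 + 0.5 * np.cos(2 * np.pi * 120 * k) + 0.2 * np.cos(2 * np.pi * 300 * k)
profile = a_scan(spectrum)
print(np.sort(profile.argsort()[-2:]))  # -> [120 300], the two layer depths (in bins)
```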
-
Publication number: 20240353923
Abstract: Various implementations disclosed herein include devices, systems, and methods that perform gaze tracking using combinations of ultrasound and imaging data. Some implementations detect a first attribute of an eye using imaging measurements in a first image of the eye, and a first location associated with the first attribute is determined in a 3D coordinate system based on depth information. A second attribute of the eye is detected based on depth information from ultrasound-based measurements of the eye, and a second location associated with the second attribute is determined in the 3D coordinate system based on the depth information and a 3D model of a portion of the eye. A gaze direction in the 3D coordinate system is determined based on the first location and the second location. In some implementations, user characteristics are identified (e.g., authentication) using ultrasonic biometric characteristics with imaging biometrics.
Type: Application
Filed: August 5, 2022
Publication date: October 24, 2024
Inventors: Walter Nistico, Patrick R. Gill
-
Patent number: 12113954
Abstract: In one implementation, a method includes receiving pixel events output by an event sensor that correspond to a feature disposed within a field of view of the event sensor. Each respective pixel event is generated in response to a specific pixel within a pixel array of the event sensor detecting a change in light intensity that exceeds a comparator threshold. A characteristic of the feature is determined at a first time based on the pixel events and a previous characteristic of the feature at a second time that precedes the first time. Movement of the feature relative to the event sensor is tracked over time based on the characteristic and the previous characteristic.
Type: Grant
Filed: August 13, 2021
Date of Patent: October 8, 2024
Assignee: Apple Inc.
Inventor: Walter Nistico
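The update the abstract describes, a characteristic at a first time derived from both fresh pixel events and the characteristic at an earlier time, can be sketched as a recursive centroid filter. Everything here (the event tuple layout, the blending constant) is illustrative, not the patent's method:

```python
import numpy as np

def update_track(events, prev_centroid, prev_time_us, alpha=0.3):
    """Blend the centroid of new pixel events with the previous
    characteristic, then derive feature motion from the pair.
    Each event is an (x, y, timestamp_us, polarity) tuple."""
    xy = np.array([(e[0], e[1]) for e in events], dtype=float)
    t = max(e[2] for e in events)
    centroid = alpha * xy.mean(axis=0) + (1 - alpha) * prev_centroid
    velocity = (centroid - prev_centroid) / max(t - prev_time_us, 1)  # px/us
    return centroid, t, velocity

events = [(10, 12, 1005, 1), (11, 13, 1007, -1), (12, 12, 1009, 1)]
c, t, v = update_track(events, prev_centroid=np.array([9.0, 11.0]), prev_time_us=990)
print(c, t, v)
```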
-
Publication number: 20240310908
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device, wherein a stimulus object is displayed within a certain display area, such that the stimulus object is at least temporarily moving along a defined trajectory and images of at least one eye of at least one user are captured during the displaying of the stimulus object.
Type: Application
Filed: May 23, 2024
Publication date: September 19, 2024
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
-
Patent number: 12026309
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22), such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26) and images of at least one eye (16) of at least one user (18) are captured during the displaying of the stimulus object (S). Based on the captured images gaze data are provided and in dependency of the gaze data gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependency of a first analysis at least of positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
Type: Grant
Filed: May 5, 2023
Date of Patent: July 2, 2024
Assignee: Apple Inc.
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
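The public abstract names calibration parameters (a1..a14 and others) and models M1..M6 but not the fitting procedure. The least-squares core of such a step might look like the following sketch, which fits a per-axis polynomial mapping raw gaze points onto the stimulus trajectory samples they were recorded against; the model form and all names are assumptions:

```python
import numpy as np

def fit_calibration(raw_gaze, stimulus, degree=1):
    """Least-squares fit of a per-axis polynomial calibration model.
    degree=1 is an affine model; higher degrees introduce more
    coefficients, loosely analogous to the a1..a14 parameters named
    in the abstract."""
    x, y = raw_gaze[:, 0], raw_gaze[:, 1]
    cols = [np.ones_like(x)]                   # bias term
    for d in range(1, degree + 1):
        for i in range(d + 1):
            cols.append(x ** (d - i) * y ** i)  # monomials of degree d
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, stimulus, rcond=None)
    return coeffs                               # shape (n_terms, 2)

# Synthetic check: recover a known affine distortion.
rng = np.random.default_rng(0)
raw = rng.random((100, 2))
stimulus = raw @ np.array([[1.1, 0.05], [0.0, 0.9]]) + 0.02
print(fit_calibration(raw, stimulus, degree=1))
```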
-
Publication number: 20240069631
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22), such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26) and images of at least one eye (16) of at least one user (18) are captured during the displaying of the stimulus object (S). Based on the captured images gaze data are provided and in dependency of the gaze data gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependency of a first analysis at least of positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
Type: Application
Filed: May 5, 2023
Publication date: February 29, 2024
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
-
Patent number: 11856305
Abstract: In one implementation, an event sensor includes a plurality of pixels and an event compiler. The plurality of pixels are positioned to receive light from a scene disposed within a field of view of the event sensor. Each pixel is configured to have an operational state that is modified by control signals generated by a respective state circuit. The event compiler is configured to output a stream of pixel events. Each respective pixel event corresponds to a breach of a comparator threshold related to an intensity of incident illumination. Each control signal is generated based on feedback information that is received from an image pipeline configured to consume image data derived from the stream of pixel events.
Type: Grant
Filed: August 13, 2021
Date of Patent: December 26, 2023
Assignee: Apple Inc.
Inventors: Emanuele Mandelli, Walter Nistico
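A behavioral sketch of one such pixel in Python: the comparator breach and the feedback-tunable operational state follow the abstract, while the specific intensity representation and threshold value are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    """Toy model of one event-sensor pixel: an event fires when the
    intensity moves past a comparator threshold relative to the last
    reported level; the threshold stands in for the operational state
    that pipeline feedback can retune."""
    last_level: float
    threshold: float = 0.2

    def observe(self, level: float):
        delta = level - self.last_level
        if abs(delta) >= self.threshold:       # comparator breach
            self.last_level = level
            return 1 if delta > 0 else -1      # event polarity
        return None                            # no event emitted

    def apply_feedback(self, new_threshold: float):
        """Control signal derived from image-pipeline feedback."""
        self.threshold = new_threshold

px = Pixel(last_level=0.5)
print(px.observe(0.8), px.observe(0.85))       # 1, then None
```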
-
Patent number: 11831981
Abstract: In one implementation, a system includes an event sensor with a pixel array and an image pipeline. The pixel array is configured to operate a first subset of pixels in an active state and a second subset of pixels in an inactive state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a specific pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data. The feedback information causes a pixel within the first subset of pixels to transition from the active state to another state.
Type: Grant
Filed: August 12, 2021
Date of Patent: November 28, 2023
Assignee: Apple Inc.
Inventor: Walter Nistico
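Where the previous entry models one pixel's state, this one is about managing the two subsets. A toy sketch of the gating and the feedback-driven state transition (array layout and mask semantics are assumptions):

```python
import numpy as np

class PixelArray:
    """Toy sketch of the active/inactive pixel subsets: only active
    pixels may emit events, and image-pipeline feedback can transition
    pixels out of the active state (e.g., outside a region of interest)."""
    def __init__(self, height: int, width: int):
        self.active = np.ones((height, width), dtype=bool)

    def emit_events(self, delta_intensity: np.ndarray, threshold: float = 0.2):
        # Threshold breach gated by the per-pixel operational state.
        return (np.abs(delta_intensity) >= threshold) & self.active

    def apply_feedback(self, deactivate: np.ndarray):
        self.active &= ~deactivate

arr = PixelArray(4, 4)
arr.apply_feedback(np.eye(4, dtype=bool))       # pipeline turns off the diagonal
print(arr.emit_events(np.full((4, 4), 0.3)))    # diagonal stays silent
```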
-
Publication number: 20230333371
Abstract: An eye tracking device includes a head-mountable frame, an optical sensor subsystem mounted to the head-mountable frame, and a processor. The optical sensor subsystem includes a set of one or more SMI sensors. The processor is configured to operate the optical sensor subsystem to cause the set of one or more SMI sensors to emit a set of one or more beams of light toward an eye of a user; to receive a set of one or more SMI signals from the set of one or more SMI sensors; and to track a movement of the eye using the set of one or more SMI signals.
Type: Application
Filed: February 3, 2023
Publication date: October 19, 2023
Inventors: Tong Chen, Mark T. Winkler, Edward Vail, Xibin Zhou, Ahmet Fatih Cihan, Walter Nistico
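SMI here refers to self-mixing interferometry: motion of the eye surface along a beam Doppler-shifts the light fed back into the laser cavity, and the shift appears as a fringe frequency in the sensor output. A minimal velocity estimate under that standard model; the wavelength, sample rate, and omission of direction handling are simplifying assumptions:

```python
import numpy as np

def smi_velocity(signal: np.ndarray, sample_rate_hz: float,
                 wavelength_m: float = 850e-9) -> float:
    """Velocity of the target along the SMI beam from the dominant
    Doppler/fringe frequency: v = f_D * wavelength / 2. Direction
    (sign) recovery from fringe shape is omitted in this toy."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    return freqs[spectrum.argmax()] * wavelength_m / 2.0

fs = 200_000.0
t = np.arange(0, 0.01, 1.0 / fs)
fringe = np.cos(2 * np.pi * 23_500 * t)   # synthetic 23.5 kHz SMI fringe
print(smi_velocity(fringe, fs))           # ~0.01 m/s at 850 nm
```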
-
Patent number: 11644896
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22), such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26) and images of at least one eye (16) of at least one user (18) are captured during the displaying of the stimulus object (S). Based on the captured images gaze data are provided and in dependency of the gaze data gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependency of a first analysis at least of positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
Type: Grant
Filed: March 29, 2022
Date of Patent: May 9, 2023
Assignee: Apple Inc.
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
-
Publication number: 20220214747
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22), such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26) and images of at least one eye (16) of at least one user (18) are captured during the displaying of the stimulus object (S). Based on the captured images gaze data are provided and in dependency of the gaze data gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependency of a first analysis at least of positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
Type: Application
Filed: March 29, 2022
Publication date: July 7, 2022
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
-
Publication number: 20220139084
Abstract: Various implementations disclosed herein include devices, systems, and methods that obtain information from a first electronic device (e.g., to improve interactions for a CGR environment). In some implementations, at a second electronic device having a processor, event camera data is obtained corresponding to modulated light (e.g., light including an information signal in its modulation) emitted by an optical source on the first electronic device and received at an event camera. The second electronic device is able to identify information from the first electronic device based on detecting a modulation pattern of the modulated light based on the event camera data. In some implementations, the second electronic device is the same electronic device that has the event camera (e.g., a laptop) or a different electronic device that receives the event data (e.g., a server receiving the event data from a laptop computer).
Type: Application
Filed: January 20, 2022
Publication date: May 5, 2022
Inventors: Emanuele Mandelli, Sai Harsha Jandhyala, Walter Nistico
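A first step in recovering the information signal is estimating each candidate pixel's modulation frequency from event timing; pixels matching a known carrier can then be grouped and their pattern decoded. A toy sketch (the interval-based estimator is an assumption, not the patent's method):

```python
import numpy as np

def modulation_frequency_hz(timestamps_us):
    """Estimate a blinking source's modulation frequency at one pixel
    from the median interval between same-polarity events; one event
    of each polarity is expected per modulation cycle."""
    ts = np.asarray(timestamps_us, dtype=float)
    return 1e6 / np.median(np.diff(ts))

# ON-polarity event times (microseconds) from one pixel watching the source.
on_events = [1000, 2005, 2995, 4010, 5000]
print(modulation_frequency_hz(on_events))  # ~1 kHz carrier
```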
-
Patent number: 11320904
Abstract: The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22), such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26) and images of at least one eye (16) of at least one user (18) are captured during the displaying of the stimulus object (S). Based on the captured images gaze data are provided and in dependency of the gaze data gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependency of a first analysis at least of positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
Type: Grant
Filed: March 9, 2021
Date of Patent: May 3, 2022
Assignee: Apple Inc.
Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
-
Publication number: 20220061660
Abstract: The invention relates to a method for determining at least one parameter of two eyes (10l, 10r) of a test person (31), the method comprising the following steps: optically capturing a first eye (10l; 10r) of the two eyes (10l, 10r) by means of a first capturing unit (3l; 3r); optically capturing the second eye (10r; 10l) of the two eyes (10l, 10r) by means of a second capturing unit (3r; 3l); transmitting first signals concerning the captured first eye (10l; 10r) from the first capturing unit (3l; 3r) to an analysis unit (27) and transmitting second signals concerning the captured second eye (10r; 10l) from the second capturing unit (3r; 3l) to the analysis unit (27); determining the at least one parameter of the two eyes (10l, 10r) on the basis of the transmitted first and second signals in the analysis unit (27), characterized by the following step: setting a first data rate for the first signals and a second data rate for the second signals, wherein the first and the second data rate differ from each other.
Type: Application
Filed: September 9, 2021
Publication date: March 3, 2022
Inventors: Walter Nistico, Jan Hoffmann, Eberhard Schmidt
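The claim's distinguishing step is simply that the two per-eye data rates may differ. A hypothetical configuration sketch; all names, rates, and the dominant-eye heuristic are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class CaptureRates:
    """Per-eye data rates for the two capturing units; the point of
    the claim is that these need not be equal."""
    first_eye_hz: int
    second_eye_hz: int

def pick_rates(dominant_first: bool) -> CaptureRates:
    # E.g., sample the dominant eye faster and save bandwidth/power
    # on the other; the analysis unit then fuses both streams.
    return CaptureRates(120, 60) if dominant_first else CaptureRates(60, 120)

print(pick_rates(dominant_first=True))
```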
-
Publication number: 20210378509
Abstract: Various implementations determine a pupil characteristic (e.g., perimeter location, pupil shape, pupil diameter, pupil center, etc.) using on-axis illumination. On-axis illumination involves producing light from a light source that is approximately on-axis with an image sensor configured to capture reflections of that light off the eye. The light from the light source may pass through the pupil into the eye and reflect off the retina to produce a bright pupil-type light pattern in data obtained by the image sensor. The light may be pulsed at a frequency so that frequency segmentation may be used to distinguish reflections through the pupil off the retina from reflections of light from other light sources. In some implementations, the image sensor is an event camera that detects events. The pupil characteristic may then be determined by assessing events that recur in a given area at the given frequency.
Type: Application
Filed: August 25, 2021
Publication date: December 9, 2021
Inventor: Walter Nistico
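Frequency segmentation with an event camera can be sketched as a per-pixel spectral test: bin events over time and keep pixels whose dominant temporal frequency matches the on-axis source's pulse rate. A toy version, where the bin rate, tolerance, and array layout are assumptions:

```python
import numpy as np

def pupil_event_mask(event_counts: np.ndarray, bin_rate_hz: float,
                     pulse_hz: float, tol_hz: float = 5.0) -> np.ndarray:
    """Per-pixel frequency segmentation: event_counts is a (T, H, W)
    array of event sums per time bin. Keep pixels whose dominant
    temporal frequency sits near the on-axis source's pulse rate,
    rejecting reflections from unmodulated light sources."""
    spectrum = np.abs(np.fft.rfft(event_counts, axis=0))
    freqs = np.fft.rfftfreq(event_counts.shape[0], d=1.0 / bin_rate_hz)
    peak = freqs[spectrum[1:].argmax(axis=0) + 1]   # skip the DC bin
    return np.abs(peak - pulse_hz) < tol_hz

rng = np.random.default_rng(0)
counts = rng.poisson(0.1, size=(256, 8, 8)).astype(float)
pulse = np.sin(2 * np.pi * 30 * np.arange(256) / 1000.0)
counts[:, 3:5, 3:5] += pulse[:, None, None]         # pupil region pulses at 30 Hz
print(pupil_event_mask(counts, bin_rate_hz=1000.0, pulse_hz=30.0))
```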
-
Publication number: 20210377465
Abstract: In one implementation, an event sensor includes a plurality of pixels and an event compiler. The plurality of pixels are positioned to receive light from a scene disposed within a field of view of the event sensor. Each pixel is configured to have an operational state that is modified by control signals generated by a respective state circuit. The event compiler is configured to output a stream of pixel events. Each respective pixel event corresponds to a breach of a comparator threshold related to an intensity of incident illumination. Each control signal is generated based on feedback information that is received from an image pipeline configured to consume image data derived from the stream of pixel events.
Type: Application
Filed: August 13, 2021
Publication date: December 2, 2021
Inventors: Emanuele Mandelli, Walter Nistico
-
Publication number: 20210377453
Abstract: In one implementation, a system includes an event sensor with a pixel array and an image pipeline. The pixel array is configured to operate a first subset of pixels in an active state and a second subset of pixels in an inactive state. The event sensor is configured to output pixel events. Each respective pixel event is generated in response to a specific pixel within the first subset of pixels detecting a change in light intensity that exceeds a comparator threshold. The image pipeline is configured to consume image data derived from the pixel events and communicate feedback information to the event sensor based on the image data. The feedback information causes a pixel within the first subset of pixels to transition from the active state to another state.
Type: Application
Filed: August 12, 2021
Publication date: December 2, 2021
Inventor: Walter Nistico
-
Publication number: 20210377512
Abstract: In one implementation, a method includes receiving pixel events output by an event sensor that correspond to a feature disposed within a field of view of the event sensor. Each respective pixel event is generated in response to a specific pixel within a pixel array of the event sensor detecting a change in light intensity that exceeds a comparator threshold. A characteristic of the feature is determined at a first time based on the pixel events and a previous characteristic of the feature at a second time that precedes the first time. Movement of the feature relative to the event sensor is tracked over time based on the characteristic and the previous characteristic.
Type: Application
Filed: August 13, 2021
Publication date: December 2, 2021
Inventor: Walter Nistico