Patents by Inventor Andrii Nikiforov

Andrii Nikiforov has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). The entries below share a common abstract describing a method for calibrating an eye tracking device; an illustrative sketch of that kind of calibration fit appears after the listing.

  • Publication number: 20240069631
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: May 5, 2023
    Publication date: February 29, 2024
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Patent number: 11644896
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: March 29, 2022
    Date of Patent: May 9, 2023
    Assignee: APPLE INC.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20220214747
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: March 29, 2022
    Publication date: July 7, 2022
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Patent number: 11320904
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: March 9, 2021
    Date of Patent: May 3, 2022
    Assignee: APPLE INC.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20210223862
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: March 9, 2021
    Publication date: July 22, 2021
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Patent number: 10976813
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Grant
    Filed: June 12, 2017
    Date of Patent: April 13, 2021
    Assignee: APPLE INC.
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
  • Publication number: 20190129501
    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), in which a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided on the basis of the captured images, and from the gaze data, gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; ?; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined from a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S).
    Type: Application
    Filed: June 12, 2017
    Publication date: May 2, 2019
    Inventors: Walter Nistico, Andrii Nikiforov, Borys Lysyansky
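
All of the abstracts above describe the same underlying idea: record gaze points while a stimulus object moves along a known trajectory across the display area, then determine the parameters of a calibration model from how those gaze points relate to the trajectory. The sketch below is a minimal, hypothetical illustration of that kind of fit, not the patented method: it assumes a second-order polynomial calibration model fitted by least squares, and the function names and synthetic data are invented for illustration.

    import numpy as np

    def fit_polynomial_calibration(raw_gaze, stimulus_points):
        # Least-squares fit of a per-axis second-order polynomial that maps raw
        # (uncalibrated) gaze estimates onto the known stimulus positions.
        # raw_gaze:        (N, 2) array of gaze points P reported by the tracker
        # stimulus_points: (N, 2) array of stimulus positions on the trajectory,
        #                  sampled at the same timestamps as the gaze points
        x, y = raw_gaze[:, 0], raw_gaze[:, 1]
        # Design matrix of polynomial terms: 1, x, y, x*y, x^2, y^2
        design = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
        coeffs, *_ = np.linalg.lstsq(design, stimulus_points, rcond=None)
        return coeffs  # shape (6, 2): one coefficient column per screen axis

    def apply_calibration(coeffs, raw_gaze):
        # Map new raw gaze points through the fitted calibration model.
        x, y = raw_gaze[:, 0], raw_gaze[:, 1]
        design = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
        return design @ coeffs

    # Synthetic example: the stimulus traverses a circular trajectory, and the
    # "raw" gaze is offset, scaled, and noisy to mimic an uncalibrated tracker.
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    stimulus = np.column_stack([0.5 + 0.3 * np.cos(t), 0.5 + 0.3 * np.sin(t)])
    rng = np.random.default_rng(0)
    raw = 0.8 * stimulus + 0.1 + 0.005 * rng.normal(size=stimulus.shape)

    coeffs = fit_polynomial_calibration(raw, stimulus)
    calibrated = apply_calibration(coeffs, raw)
    print("mean residual:", np.mean(np.linalg.norm(calibrated - stimulus, axis=1)))

The abstracts refer to several predefined calibration models (M, M1 through M6) and their parameters; the single polynomial model above merely stands in for one such model to make the fitting step concrete.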