Patents by Inventor Julia Benndorf

Julia Benndorf has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240103618
    Abstract: Methods and apparatus for correcting the gaze direction and the origin (entrance pupil) in gaze tracking systems. During enrollment, after an eye model is obtained, the pose of the eye when looking at a target prompt is determined. This information is used to estimate the true visual axis of the eye. The visual axis may then be used to correct the point of view (PoV) with respect to the display during use. If a clip-on lens is present, a corrected gaze axis may be calculated based on the known optical characteristics and pose of the clip-on lens. A clip-on corrected entrance pupil may then be estimated by firing two or more virtual rays through the clip-on lens to determine the intersection between the rays and the corrected gaze axis.
    Type: Application
    Filed: September 19, 2023
    Publication date: March 28, 2024
    Applicant: Apple Inc.
    Inventors: Julia Benndorf, Qichao Fan, Julian K. Shutzberg, Paul A. Lacey, Hua Gao
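
A minimal sketch (Python) of the final step described in the abstract above: intersecting virtual rays with the corrected gaze axis to estimate a clip-on corrected entrance pupil. It assumes the rays have already been traced through a model of the clip-on lens; the least-squares closest-point construction and all function names are illustrative assumptions, not taken from the application.

```python
# Illustrative sketch, not the application's implementation: estimate a
# corrected entrance pupil as the average of the points on the corrected
# gaze axis that lie closest to each refracted virtual ray.
import numpy as np

def closest_point_on_axis(axis_origin, axis_dir, ray_origin, ray_dir):
    """Point on the gaze axis closest to a given ray (two-line closest-point problem)."""
    a = axis_dir / np.linalg.norm(axis_dir)
    b = ray_dir / np.linalg.norm(ray_dir)
    w0 = axis_origin - ray_origin
    ab = np.dot(a, b)
    denom = 1.0 - ab * ab
    if denom < 1e-9:                          # ray is parallel to the axis
        t = -np.dot(w0, a)
    else:
        t = (ab * np.dot(w0, b) - np.dot(w0, a)) / denom
    return axis_origin + t * a

def estimate_corrected_entrance_pupil(axis_origin, axis_dir, refracted_rays):
    """Average the per-ray intersection estimates on the corrected gaze axis.

    `refracted_rays` is a list of (origin, direction) pairs for virtual rays
    after they have been traced through the clip-on lens model.
    """
    points = [closest_point_on_axis(axis_origin, axis_dir, o, d)
              for o, d in refracted_rays]
    return np.mean(points, axis=0)
```
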
  • Publication number: 20240029395
    Abstract: A surface material on one or more optical components of an optical sensing system can be detected. In some embodiments, a method of detection includes generating a first image using the optical sensing system. In accordance with a determination that one or more criteria are satisfied, the one or more criteria including a criterion that is satisfied when a proportion of the first image having an intensity of light in a first intensity range is greater than a threshold, a surface material is detected on a first attachable lens of the optical sensing system.
    Type: Application
    Filed: July 21, 2023
    Publication date: January 25, 2024
    Inventors: Anup Rathi, Mark Linder, Julia Benndorf, Duncan A. McRoberts
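
A minimal sketch of the intensity-proportion criterion described in the abstract above. The intensity range, threshold value, and function name are illustrative assumptions rather than values from the application.

```python
# Illustrative sketch: a surface material is flagged when the fraction of
# pixels whose intensity falls in a given range exceeds a threshold.
import numpy as np

def surface_material_detected(image, intensity_range=(200, 255),
                              proportion_threshold=0.15):
    """Return True when the proportion of pixels in `intensity_range`
    exceeds `proportion_threshold` (e.g. strong backscatter from a smudge
    or film on an attachable lens)."""
    lo, hi = intensity_range
    in_range = (image >= lo) & (image <= hi)
    proportion = np.count_nonzero(in_range) / image.size
    return proportion > proportion_threshold
```
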
  • Publication number: 20230417627
    Abstract: A system can perform a method to identify when one or more lenses are the wrong prescription for the user. In some embodiments, the system notifies the user to switch prescription lenses or initiates an enrollment process for the lenses. In some embodiments, the system can identify the prescription of a lens and compare the identified prescription to one or more enrolled prescriptions for the user (e.g., previously defined in user settings associated with the user profile). When the identified prescription of the lens does not match an expected enrolled prescription for the user, the system can notify the user to switch prescription lenses. When there are no enrolled prescriptions for the user, the device can prompt the user to enroll the lens.
    Type: Application
    Filed: June 9, 2023
    Publication date: December 28, 2023
    Inventors: Anup Rathi, Julia Benndorf, Zeyad Zaky, Katharina Buckl, Duncan A. McRoberts
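
A minimal sketch of the prescription-matching decision described in the abstract above, under an assumed sphere/cylinder/axis data model and matching tolerance; none of the names or values come from the application.

```python
# Illustrative sketch: compare an identified lens prescription against the
# user's enrolled prescriptions and decide what the device should do.
from dataclasses import dataclass

@dataclass(frozen=True)
class Prescription:
    sphere: float      # diopters
    cylinder: float    # diopters
    axis: int          # degrees

def check_lens(identified: Prescription, enrolled: list[Prescription],
               tol: float = 0.25) -> str:
    """Return an action based on how the identified prescription compares."""
    if not enrolled:
        return "prompt_enrollment"          # no enrolled prescriptions yet
    for rx in enrolled:
        if (abs(identified.sphere - rx.sphere) <= tol and
                abs(identified.cylinder - rx.cylinder) <= tol and
                identified.axis == rx.axis):
            return "ok"                     # matches an enrolled prescription
    return "notify_switch_lenses"           # wrong prescription for this user
```
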
  • Patent number: 11710350
    Abstract: Some implementations of the disclosure involve, at a device having one or more processors, one or more image sensors, and an illumination source, detecting a first attribute of an eye based on pixel differences associated with different wavelengths of light in a first image of the eye. These implementations next determine a first location associated with the first attribute in a three dimensional (3D) coordinate system based on depth information from a depth sensor. Various implementations detect a second attribute of the eye based on a glint resulting from light of the illumination source reflecting off a cornea of the eye. These implementations next determine a second location associated with the second attribute in the 3D coordinate system based on the depth information from the depth sensor, and determine a gaze direction in the 3D coordinate system based on the first location and the second location.
    Type: Grant
    Filed: October 12, 2021
    Date of Patent: July 25, 2023
    Assignee: Apple Inc.
    Inventors: Tom Sengelaub, Hua Gao, Hao Qin, Julia Benndorf
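
A minimal sketch of the step shared by this patent family: locate two detected eye attributes (e.g., a pupil feature and a glint-derived cornea feature) in a 3D coordinate system using depth information, then take the vector between the two locations as the gaze direction. The pinhole back-projection, intrinsics, and function names are illustrative assumptions, not the claimed method.

```python
# Illustrative sketch: back-project two image-space eye attributes with their
# sensed depths, then form the gaze direction from the two 3D locations.
import numpy as np

def backproject(pixel, depth, fx, fy, cx, cy):
    """Pinhole back-projection of an image point given its depth."""
    u, v = pixel
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

def gaze_direction(pupil_px, pupil_depth, cornea_px, cornea_depth, intrinsics):
    """Unit vector from the cornea-center location toward the pupil location
    in the 3D coordinate system."""
    fx, fy, cx, cy = intrinsics
    p_pupil = backproject(pupil_px, pupil_depth, fx, fy, cx, cy)
    p_cornea = backproject(cornea_px, cornea_depth, fx, fy, cx, cy)
    d = p_pupil - p_cornea
    return d / np.linalg.norm(d)
```
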
  • Publication number: 20220404916
    Abstract: A method is performed at a first device with a non-transitory memory, one or more processors, and a network interface. The method includes storing, in the non-transitory memory, reference data describing at least one reference object. The method includes receiving user behavior data via the network interface at a time after completion of a user session of a user of a second device. The user behavior data includes a user behavior characteristic of the user of the second device at a plurality of times during the user session and respective time stamps indicative of the plurality of times during the user session. The method includes combining, using the one or more processors, the user behavior data and the reference data based on the respective time stamps to generate data regarding user behavior during the user session with respect to the at least one reference object.
    Type: Application
    Filed: August 23, 2022
    Publication date: December 22, 2022
    Inventors: Arnd Rose, Tom Sengelaub, Julia Benndorf, Marvin Vogel
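
A minimal sketch of the timestamp-based combination described in the abstract above: each time-stamped user behavior sample received over the network is joined with the reference-data state in effect at that time. The record shapes and the latest-not-after join rule are assumptions for illustration.

```python
# Illustrative sketch: join behavior samples with reference data by timestamp.
from bisect import bisect_right

def combine(behavior_samples, reference_timeline):
    """For each (timestamp, characteristic) behavior sample, attach the
    reference-object state whose timestamp is the latest one not after the
    sample's timestamp.

    `reference_timeline` is a list of (timestamp, reference_state) tuples
    sorted by timestamp."""
    ref_times = [t for t, _ in reference_timeline]
    combined = []
    for ts, characteristic in behavior_samples:
        i = bisect_right(ref_times, ts) - 1
        ref_state = reference_timeline[i][1] if i >= 0 else None
        combined.append({"timestamp": ts,
                         "behavior": characteristic,
                         "reference": ref_state})
    return combined
```
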
  • Publication number: 20220027621
    Abstract: Some implementations of the disclosure involve, at a device having one or more processors, one or more image sensors, and an illumination source, detecting a first attribute of an eye based on pixel differences associated with different wavelengths of light in a first image of the eye. These implementations next determine a first location associated with the first attribute in a three dimensional (3D) coordinate system based on depth information from a depth sensor. Various implementations detect a second attribute of the eye based on a glint resulting from light of the illumination source reflecting off a cornea of the eye. These implementations next determine a second location associated with the second attribute in the 3D coordinate system based on the depth information from the depth sensor, and determine a gaze direction in the 3D coordinate system based on the first location and the second location.
    Type: Application
    Filed: October 12, 2021
    Publication date: January 27, 2022
    Inventors: Tom Sengelaub, Hua Gao, Hao Qin, Julia Benndorf
  • Patent number: 11170212
    Abstract: Some implementations of the disclosure involve, at a device having one or more processors, one or more image sensors, and an illumination source, detecting a first attribute of an eye based on pixel differences associated with different wavelengths of light in a first image of the eye. These implementations next determine a first location associated with the first attribute in a three dimensional (3D) coordinate system based on depth information from a depth sensor. Various implementations detect a second attribute of the eye based on a glint resulting from light of the illumination source reflecting off a cornea of the eye. These implementations next determine a second location associated with the second attribute in the 3D coordinate system based on the depth information from the depth sensor, and determine a gaze direction in the 3D coordinate system based on the first location and the second location.
    Type: Grant
    Filed: September 13, 2019
    Date of Patent: November 9, 2021
    Assignee: Apple Inc.
    Inventors: Tom Sengelaub, Hua Gao, Hao Qin, Julia Benndorf
  • Publication number: 20200104589
    Abstract: Some implementations of the disclosure involve, at a device having one or more processors, one or more image sensors, and an illumination source, detecting a first attribute of an eye based on pixel differences associated with different wavelengths of light in a first image of the eye. These implementations next determine a first location associated with the first attribute in a three dimensional (3D) coordinate system based on depth information from a depth sensor. Various implementations detect a second attribute of the eye based on a glint resulting from light of the illumination source reflecting off a cornea of the eye. These implementations next determine a second location associated with the second attribute in the 3D coordinate system based on the depth information from the depth sensor, and determine a gaze direction in the 3D coordinate system based on the first location and the second location.
    Type: Application
    Filed: September 13, 2019
    Publication date: April 2, 2020
    Inventors: Tom Sengelaub, Hua Gao, Hao Qin, Julia Benndorf
  • Publication number: 20190179423
    Abstract: A system and method for providing information about a user behavior of a user with regard to at least one reference object (VRS) via a network (12) from a first device (14) to a second device (16), wherein the first device (14) is associated with the user, and the first device (14) and the second device (16) each comprise reference data (VRD), which describe the at least one reference object (VRS). The first device (14) comprises an eye tracking device (20a, 20b) that captures at least one user behavior characteristic with respect to the at least one reference object (VRS), wherein the captured at least one user behavior characteristic is provided in the form of user behavior data (UD), which are transmitted from the first device (14) to the second device (16) via the network (12).
    Type: Application
    Filed: June 14, 2017
    Publication date: June 13, 2019
    Inventors: Arnd Rose, Tom Sengelaub, Julia Benndorf, Marvin Vogel
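
A minimal sketch of the transmitting side described in the abstract above: the first device packages time-stamped eye-tracking samples as user behavior data (UD) and sends them to the second device over the network. The JSON-over-TCP transport, host name, and port are assumptions; the application does not prescribe a payload format.

```python
# Illustrative sketch: serialize eye-tracking samples and send them to the
# second device over a plain TCP connection.
import json
import socket

def send_user_behavior_data(samples, host, port):
    """Send a list of (timestamp, gaze_point) samples as one JSON payload.

    `gaze_point` is the user behavior characteristic captured by the eye
    tracking device with respect to the reference object (VRS)."""
    payload = json.dumps([
        {"timestamp": ts, "gaze": list(gaze)} for ts, gaze in samples
    ]).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

# e.g. send_user_behavior_data([(1718361600.0, (0.42, 0.61))],
#                              "second-device.local", 9000)
```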