Patents by Inventor Ian Erkelens

Ian Erkelens has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230377493
    Abstract: Systems and methods dynamically control optical conditions presented to a user by an artificial reality system according to monitored visual experience parameters for the user. For example, the artificial reality presentation to the user can be tracked to monitor visual experience parameters, such as light characteristics (e.g., color), focal distances, virtual object characteristics (e.g., objects/text color, size, etc.), aggregated defocus distance of background, luminance, activity, eye movement, accommodation distances, and other suitable conditions. Implementations can vary optical conditions presented/displayed by the artificial reality system according to the monitoring by altering the focal distance for virtual objects, text size, text/background color, light characteristics, and other suitable optical conditions. In some examples, user preferences for optical conditions can be determined according to the monitoring.
    Type: Application
    Filed: May 19, 2023
    Publication date: November 23, 2023
    Inventors: William Aaron Nicholls, Barry David Silverstein, Robin Sharma, Ian Erkelens
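    Illustrative sketch (not part of the patent): the abstract above describes a monitor-then-adjust loop over visual experience parameters. The Python below is a minimal, hypothetical sketch of that loop; every class, field, threshold, and policy is an assumption for illustration and does not come from the patent.

        from dataclasses import dataclass
        from statistics import mean

        @dataclass
        class ExperienceSample:
            # One hypothetical snapshot of the conditions presented to the user.
            luminance_nits: float      # scene luminance
            focal_distance_m: float    # focal distance of the attended content
            text_size_deg: float       # angular text size

        @dataclass
        class OpticalConditions:
            # Hypothetical settings the system could vary in response to monitoring.
            focal_distance_m: float
            text_size_deg: float
            background_dimming: float  # 0.0 (off) .. 1.0 (fully dimmed)

        def adjust_conditions(samples: list[ExperienceSample], current: OpticalConditions) -> OpticalConditions:
            """Vary presented optical conditions according to aggregated monitoring."""
            avg_focal = mean(s.focal_distance_m for s in samples)
            avg_lum = mean(s.luminance_nits for s in samples)
            # Illustrative policies only: relax focal demand after sustained near work,
            # dim the background under high luminance, keep text above a legibility floor.
            new_focal = max(current.focal_distance_m, 2.0) if avg_focal < 0.5 else current.focal_distance_m
            new_dimming = 0.3 if avg_lum > 250 else 0.0
            new_text = max(current.text_size_deg, 0.35)
            return OpticalConditions(new_focal, new_text, new_dimming)

        history = [ExperienceSample(300, 0.4, 0.3) for _ in range(10)]
        print(adjust_conditions(history, OpticalConditions(0.4, 0.3, 0.0)))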
  • Publication number: 20230375844
    Abstract: Aspects of the present disclosure are directed to controlling optical parameters at a user's eye using an artificial reality system and tracked user conditions. For example, based on tracked user eye positioning, implementations can adjust the light that enters the user's eye to control an image shell generated at the user's eye. An image shell refers to the way light that enters the eye focuses on the retina. Example properties of an image shell include image shell centration, image shell curvature, image shell shape, etc. A user may focus on an object in an artificial reality environment (e.g., real-world object or virtual object) and light from the object can generate an image shell at the user's eyes. The image shell at the user's eyes can impact the user's vision and/or eye biology.
    Type: Application
    Filed: May 19, 2023
    Publication date: November 23, 2023
    Inventors: William Aaron Nicholls, Barry David Silverstein, Robin Sharma, Ian Erkelens
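    Illustrative sketch (not part of the patent): as a rough reading of the abstract above, the snippet below models an image shell by two of the properties it names (centration and curvature) and picks a target shell from tracked eye conditions. The policy and numbers are assumptions, not the patented method.

        from dataclasses import dataclass

        @dataclass
        class ImageShell:
            centration_deg: float        # offset of the shell relative to the fovea
            curvature_diopters: float    # how strongly the shell curves

        def target_image_shell(gaze_azimuth_deg: float, fixation_distance_m: float) -> ImageShell:
            """Choose a target image shell from tracked eye positioning (illustrative policy)."""
            # Recenter the shell on the current gaze direction and match its curvature
            # to the dioptric demand of the fixated object (1 / distance in meters).
            return ImageShell(centration_deg=gaze_azimuth_deg,
                              curvature_diopters=1.0 / max(fixation_distance_m, 0.1))

        print(target_image_shell(gaze_azimuth_deg=5.0, fixation_distance_m=0.5))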
  • Patent number: 11650421
    Abstract: A method may include identifying, by one or more processors, an object in a field of view of a wearable display, where the object is identified for a presbyopic compensation. The presbyopic compensation is performed by the one or more processors on image data of the object to generate compensated image data of the object. The one or more processors render an image in response to the compensated image data of the object on a display of the wearable display.
    Type: Grant
    Filed: May 23, 2019
    Date of Patent: May 16, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Ian Erkelens, Larry Richard Moore, Jr., Kevin James MacKenzie
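    Illustrative sketch (not part of the patent): the claimed method above is a three-step pipeline (identify an object in the field of view, compensate its image data, render the compensated image). The Python below mirrors those steps with placeholder implementations; the "compensation" here is simple magnification and stands in for whatever presbyopic correction the patent actually covers.

        import numpy as np

        def identify_object_region(frame: np.ndarray) -> tuple[slice, slice]:
            # Placeholder identification: assume the object of interest occupies the
            # central quarter of the frame; a real system would use gaze and detection.
            h, w = frame.shape[:2]
            return slice(h // 4, 3 * h // 4), slice(w // 4, 3 * w // 4)

        def compensate(region: np.ndarray, scale: int = 2) -> np.ndarray:
            # Placeholder compensation: nearest-neighbour magnification of the region.
            return np.kron(region, np.ones((scale, scale, 1), dtype=region.dtype))

        def render(frame: np.ndarray) -> None:
            print(f"rendering compensated image of shape {frame.shape}")

        frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in display frame
        rows, cols = identify_object_region(frame)
        render(compensate(frame[rows, cols]))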
  • Publication number: 20230057524
    Abstract: Eyeglass devices may include a frame shaped and sized to be worn by a user at least partially in front of the user's eyes, a varifocal optical element mounted to the frame, and an eye-tracking element mounted to the frame. The varifocal optical element may include a substantially transparent actuator positioned at least partially within an optical aperture of the varifocal optical element and configured to alter a shape of the varifocal optical element upon actuation. The eye-tracking element may be configured to track at least a gaze direction of the user's eyes, and the varifocal optical element may be configured to change, based on information from the eye-tracking element, in at least one optical property including a focal distance. Various other devices, systems, and methods are also disclosed.
    Type: Application
    Filed: August 9, 2022
    Publication date: February 23, 2023
    Inventors: Christopher Stipe, Marina Zannoli, Ian Erkelens, Andrew John Ouderkirk, Spencer Allan Wells, Eugene Cho, John Cooke, Robin Sharma, Jonathan Robert Peterson
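    Illustrative sketch (not part of the patent): the abstract above couples an eye-tracking element to a varifocal optical element so that focal distance follows the wearer's gaze. The control loop below is a hypothetical sketch of that coupling; the class names and the vergence-based distance estimate are assumptions, not a device API.

        class EyeTracker:
            def vergence_distance_m(self) -> float:
                # Hypothetical reading: the distance implied by the two eyes' convergence.
                return 0.75

        class VarifocalLens:
            def __init__(self) -> None:
                self.optical_power_diopters = 0.0

            def set_focal_distance(self, distance_m: float) -> None:
                # Drive the (here, imaginary) transparent actuator so the lens
                # focuses at the requested distance.
                self.optical_power_diopters = 1.0 / max(distance_m, 0.1)

        tracker, lens = EyeTracker(), VarifocalLens()
        lens.set_focal_distance(tracker.vergence_distance_m())
        print(f"lens set to {lens.optical_power_diopters:.2f} D")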
  • Patent number: 11579689
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
    Type: Grant
    Filed: January 25, 2021
    Date of Patent: February 14, 2023
    Assignee: Meta Platforms, Inc.
    Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
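    Illustrative sketch (not part of the patent): the abstract above describes predicting, from eye tracking at an initial time, whether the user will be able to visually process an AR element at a future time, and rendering it only if so. The snippet below uses a saccadic-suppression heuristic as one plausible stand-in for that prediction; the heuristic and its thresholds are assumptions.

        from dataclasses import dataclass

        @dataclass
        class EyeTrackingSample:
            t: float                    # seconds since start
            angular_speed_deg_s: float  # eye rotation speed

        SACCADE_SPEED = 100.0     # deg/s; above this, assume saccadic suppression
        SACCADE_DURATION = 0.05   # s; assumed remaining duration of a saccade

        def can_process_at(sample: EyeTrackingSample, future_t: float) -> bool:
            """Predict whether the user can visually process new content at future_t."""
            if sample.angular_speed_deg_s < SACCADE_SPEED:
                return True                                    # fixating: content is visible
            return future_t - sample.t > SACCADE_DURATION      # mid-saccade: wait it out

        def maybe_render(element: str, sample: EyeTrackingSample, future_t: float) -> None:
            if can_process_at(sample, future_t):
                print(f"render {element!r} at t={future_t:.2f}s")
            else:
                print(f"defer {element!r}: predicted not visually processed at t={future_t:.2f}s")

        sample = EyeTrackingSample(t=0.0, angular_speed_deg_s=300.0)   # mid-saccade
        maybe_render("notification", sample, future_t=0.02)            # deferred
        maybe_render("notification", sample, future_t=0.10)            # rendered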
  • Publication number: 20230037329
    Abstract: Head-mounted display systems may include an eye-tracking subsystem and a fixation distance prediction subsystem. The eye-tracking subsystem may be configured to determine at least a gaze direction of a user's eyes and an eye movement speed of the user's eyes. The fixation distance prediction subsystem may be configured to predict, based on the eye movement speed and the gaze direction of the user's eyes, a fixation distance at which the user's eyes will become fixated prior to the user's eyes reaching a fixation state associated with the predicted fixation distance. Additional methods, systems, and devices are also disclosed.
    Type: Application
    Filed: July 7, 2022
    Publication date: February 9, 2023
    Inventors: Ian Erkelens, Thomas Scott Murdison, Kevin James MacKenzie
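    Illustrative sketch (not part of the patent): the abstract above predicts the distance at which the eyes will fixate, from gaze direction and eye movement speed, before the eyes settle. The snippet below shows one way such a prediction could work, by extrapolating an ongoing vergence movement with an exponential-settling model; the model, time constant, and interpupillary distance are assumptions.

        import math

        IPD_M = 0.063    # assumed interpupillary distance
        TAU_S = 0.2      # assumed time constant of the vergence movement

        def vergence_to_distance(vergence_rad: float) -> float:
            # Distance at which two lines of sight, IPD_M apart, converge.
            return (IPD_M / 2.0) / math.tan(max(vergence_rad, 1e-4) / 2.0)

        def predict_fixation_distance(vergence_rad: float, vergence_speed_rad_s: float) -> float:
            """Extrapolate the endpoint of an ongoing vergence movement."""
            # For an exponentially settling movement, the remaining change is speed * tau.
            final_vergence = vergence_rad + vergence_speed_rad_s * TAU_S
            return vergence_to_distance(final_vergence)

        # Eyes currently converged on ~1 m and still converging: the predicted
        # fixation distance comes out nearer than the current one.
        current_vergence = 2 * math.atan((IPD_M / 2) / 1.0)
        print(f"predicted fixation at {predict_fixation_distance(current_vergence, 0.1):.2f} m")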
  • Patent number: 11567326
    Abstract: A device includes a light source configured to emit an image light. The device also includes an optical assembly configured to direct the image light to an eye-box of the device. The optical assembly includes a first optical element portion configured to focus a first portion of the image light propagating through the first optical element portion. The optical assembly also includes a second optical element portion configured to focus a second portion of the image light propagating through the second optical element portion. The second optical element portion includes a liquid crystal (“LC”) lens having an adjustable optical power.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: January 31, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Afsoon Jamali, Yang Zhao, Brian Wheelwright, Douglas Robert Lanman, Marina Zannoli, Ian Erkelens, Yusufu Njoni Bamaxam Sulai, Lu Lu, Jacques Gollier
  • Publication number: 20210223861
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
    Type: Application
    Filed: January 25, 2021
    Publication date: July 22, 2021
    Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
  • Patent number: 10901502
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
    Type: Grant
    Filed: June 27, 2019
    Date of Patent: January 26, 2021
    Assignee: Facebook, Inc.
    Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
  • Publication number: 20200409457
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
    Type: Application
    Filed: June 27, 2019
    Publication date: December 31, 2020
    Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie