Patents by Inventor Ian Erkelens
Ian Erkelens has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250104581
Abstract: Techniques for enhancing brightness and power efficiency in augmented reality (AR) display devices are described. To enhance brightness and power efficiency, a near-eye display device's duty cycle may be adjusted based on a type of usage (e.g., world-locked or head-locked); a persistence or a display rate may be varied based on head movement speed; the brightness may be adjusted based on sensed ambient light; ambient brightness may be matched by using camera-based scene understanding to determine a user's gaze direction and then decreasing brightness in the gaze periphery; the display brightness may be controlled based on gaze and/or eye motion; and/or, if the user is moving, world-locked rendering (WLR) targets and/or refresh rates may be altered, or WLR may be varied for peripheral content based on a comparison of gaze direction and a virtual object location.
Type: Application
Filed: September 27, 2023
Publication date: March 27, 2025
Applicant: Meta Platforms Technologies, LLC
Inventors: Ajit NINAN, Thomas Scott MURDISON, Romain BACHY, Ian ERKELENS
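The control policies listed in this abstract (duty cycle by content type, persistence by head speed, brightness by ambient light) can be pictured as a single per-frame decision. The Python sketch below is illustrative only: the function names, thresholds, and the linear lux-to-nits mapping are assumptions, not values or methods taken from the filing.

```python
from dataclasses import dataclass

@dataclass
class DisplayState:
    duty_cycle: float        # fraction of each frame the emitters are lit (0-1)
    brightness_nits: float
    refresh_hz: int

def adjust_display(content_lock: str,
                   head_speed_dps: float,
                   ambient_lux: float,
                   base: DisplayState) -> DisplayState:
    """Pick duty cycle, brightness, and refresh rate from usage context.

    All thresholds and mappings here are hypothetical placeholders.
    """
    # World-locked content tolerates a shorter persistence (lower duty cycle)
    # to reduce perceived motion blur; head-locked UI can run longer.
    duty = 0.2 if content_lock == "world-locked" else 0.5

    # Faster head motion -> shorter persistence and a higher refresh rate
    # so world-locked pixels stay stable on the retina.
    if head_speed_dps > 120:
        duty = min(duty, 0.1)
        refresh = 90
    else:
        refresh = base.refresh_hz

    # Roughly match ambient brightness with a clamped linear map.
    brightness = min(max(ambient_lux * 0.8, 50.0), 3000.0)

    return DisplayState(duty_cycle=duty, brightness_nits=brightness,
                        refresh_hz=refresh)

if __name__ == "__main__":
    base = DisplayState(duty_cycle=0.5, brightness_nits=500.0, refresh_hz=60)
    print(adjust_display("world-locked", head_speed_dps=150.0,
                         ambient_lux=800.0, base=base))
```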
-
Patent number: 12235461
Abstract: Aspects of the present disclosure are directed to controlling optical parameters at a user's eye using an artificial reality system and tracked user conditions. For example, based on tracked user eye positioning, implementations can adjust the light that enters the user's eye to control an image shell generated at the user's eye. An image shell refers to the way light that enters the eye focuses on the retina. Example properties of an image shell include image shell centration, curvature, and shape. A user may focus on an object in an artificial reality environment (e.g., a real-world object or a virtual object), and light from the object can generate an image shell at the user's eyes. The image shell at the user's eyes can impact the user's vision and/or eye biology.
Type: Grant
Filed: May 19, 2023
Date of Patent: February 25, 2025
Assignee: Meta Platforms Technologies, LLC
Inventors: William Aaron Nicholls, Barry David Silverstein, Robin Sharma, Ian Erkelens
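As a rough illustration of driving image-shell properties from tracked eye state, the sketch below converts a fixation distance to optical power (standard 1/distance optics) and shifts the optical center toward the gaze direction. The centration model and all parameter names are assumptions for illustration, not the patented method.

```python
import math
from dataclasses import dataclass

@dataclass
class ImageShellCommand:
    focal_power_diopters: float    # power applied so the image shell lands on the retina
    centration_offset_mm: tuple    # lateral shift of the optical center (x, y)

def image_shell_for_fixation(fixation_distance_m: float,
                             gaze_offset_deg: tuple,
                             eye_relief_mm: float = 18.0) -> ImageShellCommand:
    """Derive an illustrative optics command from tracked eye positioning."""
    # Power needed so light from the fixated object focuses on the retina.
    power = 1.0 / max(fixation_distance_m, 0.1)

    # Shift the optical center toward the gaze direction so the image shell
    # stays centered on the fovea (small-angle, placeholder geometry).
    dx = eye_relief_mm * math.tan(math.radians(gaze_offset_deg[0]))
    dy = eye_relief_mm * math.tan(math.radians(gaze_offset_deg[1]))

    return ImageShellCommand(focal_power_diopters=power,
                             centration_offset_mm=(round(dx, 2), round(dy, 2)))

if __name__ == "__main__":
    print(image_shell_for_fixation(fixation_distance_m=0.5,
                                   gaze_offset_deg=(10.0, -5.0)))
```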
-
Patent number: 12236011
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
Type: Grant
Filed: December 28, 2022
Date of Patent: February 25, 2025
Assignee: Meta Platforms, Inc.
Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
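One way to picture "predicting a change in the ability to visually process" content is gating rendering on whether a fast eye movement is in flight, since visual sensitivity drops during saccades. The sketch below is a minimal, hypothetical version of such a gate; the 300 deg/s threshold and the two-sample velocity estimate are assumptions, not the claimed prediction model.

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    t_ms: float
    gaze_deg: tuple   # (x, y) gaze direction in degrees

def angular_speed(a: EyeSample, b: EyeSample) -> float:
    """Gaze speed in degrees per second between two eye-tracking samples."""
    dt = (b.t_ms - a.t_ms) / 1000.0
    dx = b.gaze_deg[0] - a.gaze_deg[0]
    dy = b.gaze_deg[1] - a.gaze_deg[1]
    return ((dx * dx + dy * dy) ** 0.5) / dt if dt > 0 else 0.0

def should_render(prev: EyeSample, curr: EyeSample,
                  saccade_threshold_dps: float = 300.0) -> bool:
    """Skip rendering the AR element while a saccade appears to be in flight.

    Threshold is hypothetical: above it, a newly rendered element would
    likely not be visually processed, so rendering can be deferred.
    """
    return angular_speed(prev, curr) < saccade_threshold_dps

if __name__ == "__main__":
    a = EyeSample(t_ms=0.0, gaze_deg=(0.0, 0.0))
    b = EyeSample(t_ms=8.0, gaze_deg=(4.0, 0.0))   # ~500 deg/s, i.e. a saccade
    print("render element:", should_render(a, b))
```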
-
Publication number: 20230375844
Abstract: Aspects of the present disclosure are directed to controlling optical parameters at a user's eye using an artificial reality system and tracked user conditions. For example, based on tracked user eye positioning, implementations can adjust the light that enters the user's eye to control an image shell generated at the user's eye. An image shell refers to the way light that enters the eye focuses on the retina. Example properties of an image shell include image shell centration, curvature, and shape. A user may focus on an object in an artificial reality environment (e.g., a real-world object or a virtual object), and light from the object can generate an image shell at the user's eyes. The image shell at the user's eyes can impact the user's vision and/or eye biology.
Type: Application
Filed: May 19, 2023
Publication date: November 23, 2023
Inventors: William Aaron NICHOLLS, Barry David SILVERSTEIN, Robin SHARMA, Ian ERKELENS
-
Publication number: 20230377493
Abstract: Systems and methods dynamically control optical conditions presented to a user by an artificial reality system according to monitored visual experience parameters for the user. For example, the artificial reality presentation to the user can be tracked to monitor visual experience parameters, such as light characteristics (e.g., color), focal distances, virtual object characteristics (e.g., object/text color, size, etc.), aggregated defocus distance of background, luminance, activity, eye movement, accommodation distances, and other suitable conditions. Implementations can vary the optical conditions presented by the artificial reality system according to the monitoring by altering the focal distance for virtual objects, text size, text/background color, light characteristics, and other suitable optical conditions. In some examples, user preferences for optical conditions can be determined according to the monitoring.
Type: Application
Filed: May 19, 2023
Publication date: November 23, 2023
Inventors: William Aaron NICHOLLS, Barry David SILVERSTEIN, Robin SHARMA, Ian ERKELENS
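A monitoring loop of this kind could accumulate per-session statistics and then suggest optical-condition changes. The Python sketch below is a hypothetical illustration; the parameter names, the 50 cm "near work" cutoff, and the adjustment rules are assumptions, not the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class VisualExposureMonitor:
    """Accumulates simple visual-experience statistics over a session."""
    near_work_s: float = 0.0
    total_s: float = 0.0
    luminance_weighted_s: float = 0.0

    def record(self, dt_s: float, accommodation_m: float, luminance_nits: float):
        """Log one tracked interval of accommodation distance and luminance."""
        self.total_s += dt_s
        self.luminance_weighted_s += luminance_nits * dt_s
        if accommodation_m < 0.5:          # closer than 50 cm counts as near work
            self.near_work_s += dt_s

    def recommended_adjustments(self) -> dict:
        """Suggest optical-condition changes from the monitored parameters."""
        out = {}
        if self.total_s and self.near_work_s / self.total_s > 0.7:
            out["virtual_focal_distance_m"] = 2.0   # push content farther away
            out["text_scale"] = 1.2                 # enlarge text to ease focus
        if self.total_s and self.luminance_weighted_s / self.total_s > 1000:
            out["display_luminance_nits"] = 500     # dim an unusually bright session
        return out

if __name__ == "__main__":
    m = VisualExposureMonitor()
    m.record(dt_s=1200, accommodation_m=0.35, luminance_nits=1500)
    m.record(dt_s=300, accommodation_m=2.0, luminance_nits=200)
    print(m.recommended_adjustments())
```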
-
Patent number: 11650421
Abstract: A method may include identifying, by one or more processors, an object in a field of view of a wearable display, where the object is identified for a presbyopic compensation. The presbyopic compensation is performed by the one or more processors on image data of the object to generate compensated image data of the object. The one or more processors render an image on a display of the wearable display in response to the compensated image data of the object.
Type: Grant
Filed: May 23, 2019
Date of Patent: May 16, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Ian Erkelens, Larry Richard Moore, Jr., Kevin James MacKenzie
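For intuition, presbyopic compensation starts from how much defocus the wearer experiences for a near object, which follows from standard optics (defocus demand minus available accommodation, in diopters). The sketch below computes that quantity and applies a placeholder magnification; the compensation rule itself is an assumption, not the patented image-data processing.

```python
def presbyopic_defocus_diopters(object_distance_m: float,
                                near_point_m: float) -> float:
    """Defocus a presbyopic eye experiences for an object nearer than its near point.

    Standard optics: defocus = 1/d_object - 1/d_near_point, floored at zero
    when the object is at or beyond the near point.
    """
    demand = 1.0 / object_distance_m
    available = 1.0 / near_point_m
    return max(demand - available, 0.0)

def compensate_region(image_region, defocus_d: float):
    """Placeholder compensation: scale the identified object with defocus.

    A real pipeline might deconvolve or re-render the object at a farther
    virtual focal plane; here we only return a magnification factor.
    """
    magnification = 1.0 + 0.25 * defocus_d
    return image_region, magnification

if __name__ == "__main__":
    d = presbyopic_defocus_diopters(object_distance_m=0.33, near_point_m=0.67)
    print("defocus (D):", round(d, 2),
          "-> magnification:", round(compensate_region(None, d)[1], 2))
```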
-
Publication number: 20230057524
Abstract: Eyeglass devices may include a frame shaped and sized to be worn by a user at least partially in front of the user's eyes, a varifocal optical element mounted to the frame, and an eye-tracking element mounted to the frame. The varifocal optical element may include a substantially transparent actuator positioned at least partially within an optical aperture of the varifocal optical element and configured to alter a shape of the varifocal optical element upon actuation. The eye-tracking element may be configured to track at least a gaze direction of the user's eyes, and the varifocal optical element may be configured to change, based on information from the eye-tracking element, in at least one optical property including a focal distance. Various other devices, systems, and methods are also disclosed.
Type: Application
Filed: August 9, 2022
Publication date: February 23, 2023
Inventors: Christopher Stipe, Marina Zannoli, Ian Erkelens, Andrew John Ouderkirk, Spencer Allan Wells, Eugene Cho, John Cooke, Robin Sharma, Jonathan Robert Peterson
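A common way to turn binocular gaze directions into a focal-distance command is simple triangulation from the interpupillary distance and the vergence angle. The sketch below is a hypothetical illustration of that step; the 2 D near-addition cap and the estimator itself are assumptions, not the device's disclosed control scheme.

```python
import math

def vergence_distance_m(ipd_m: float, left_gaze_deg: float,
                        right_gaze_deg: float) -> float:
    """Estimate fixation distance from binocular gaze angles.

    Positive angles mean each eye is rotated toward the nose; uses the
    triangulation d = ipd / (2 * tan(vergence / 2)).
    """
    vergence_rad = math.radians(left_gaze_deg + right_gaze_deg)
    if vergence_rad <= 0:
        return float("inf")           # eyes parallel: looking far away
    return ipd_m / (2.0 * math.tan(vergence_rad / 2.0))

def lens_power_diopters(fixation_m: float, wearer_add_d: float = 2.0) -> float:
    """Optical power the varifocal element should add for the fixation distance.

    Caps the add at a hypothetical 2 D near addition for the wearer.
    """
    demand = 0.0 if math.isinf(fixation_m) else 1.0 / fixation_m
    return min(demand, wearer_add_d)

if __name__ == "__main__":
    d = vergence_distance_m(ipd_m=0.063, left_gaze_deg=2.5, right_gaze_deg=2.5)
    print("fixation:", round(d, 2), "m ->", round(lens_power_diopters(d), 2), "D")
```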
-
Patent number: 11579689
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
Type: Grant
Filed: January 25, 2021
Date of Patent: February 14, 2023
Assignee: Meta Platforms, Inc.
Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
-
Publication number: 20230037329
Abstract: Head-mounted display systems may include an eye-tracking subsystem and a fixation distance prediction subsystem. The eye-tracking subsystem may be configured to determine at least a gaze direction of a user's eyes and an eye movement speed of the user's eyes. The fixation distance prediction subsystem may be configured to predict, based on the eye movement speed and the gaze direction of the user's eyes, a fixation distance at which the user's eyes will become fixated prior to the user's eyes reaching a fixation state associated with the predicted fixation distance. Additional methods, systems, and devices are also disclosed.
Type: Application
Filed: July 7, 2022
Publication date: February 9, 2023
Inventors: Ian Erkelens, Thomas Scott Murdison, Kevin James MacKenzie
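One hypothetical way to predict where the eyes will land before they settle is to extrapolate a vergence movement from its speed, using a main-sequence-style relationship between peak velocity and movement amplitude. Everything in the sketch below (the slope value, the extrapolation rule, the triangulation) is an assumption for illustration, not the subsystem's disclosed predictor.

```python
import math

def predict_final_vergence_deg(current_vergence_deg: float,
                               peak_velocity_dps: float,
                               main_sequence_slope_s: float = 0.04) -> float:
    """Predict the vergence angle the eyes will settle at, mid-movement.

    Assumes remaining amplitude is roughly proportional to peak velocity;
    the slope constant is hypothetical.
    """
    remaining_amplitude = main_sequence_slope_s * peak_velocity_dps
    return current_vergence_deg + remaining_amplitude

def vergence_to_distance_m(vergence_deg: float, ipd_m: float = 0.063) -> float:
    """Convert a vergence angle to a fixation distance by triangulation."""
    v = math.radians(vergence_deg)
    return ipd_m / (2.0 * math.tan(v / 2.0)) if v > 0 else float("inf")

if __name__ == "__main__":
    predicted = predict_final_vergence_deg(current_vergence_deg=1.0,
                                           peak_velocity_dps=60.0)
    print("predicted fixation distance:",
          round(vergence_to_distance_m(predicted), 2), "m")
```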
-
Patent number: 11567326
Abstract: A device includes a light source configured to emit an image light. The device also includes an optical assembly configured to direct the image light to an eye-box of the device. The optical assembly includes a first optical element portion configured to focus a first portion of the image light propagating through the first optical element portion. The optical assembly also includes a second optical element portion configured to focus a second portion of the image light propagating through the second optical element portion. The second optical element portion includes a liquid crystal ("LC") lens having an adjustable optical power.
Type: Grant
Filed: October 30, 2020
Date of Patent: January 31, 2023
Assignee: META PLATFORMS TECHNOLOGIES, LLC
Inventors: Afsoon Jamali, Yang Zhao, Brian Wheelwright, Douglas Robert Lanman, Marina Zannoli, Ian Erkelens, Yusufu Njoni Bamaxam Sulai, Lu Lu, Jacques Gollier
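Conceptually, an assembly like this splits the focusing work between a fixed optical element portion and an adjustable LC portion. The sketch below shows that split with a made-up linear voltage-to-power mapping for the LC lens; the power range, drive voltage, and mapping are all assumptions, not characteristics of the claimed device.

```python
from dataclasses import dataclass

@dataclass
class LCLens:
    """Liquid crystal lens with a bounded, electrically adjustable power.

    The linear voltage-to-power mapping is a placeholder for illustration.
    """
    min_power_d: float = 0.0
    max_power_d: float = 1.5
    max_voltage_v: float = 5.0

    def voltage_for_power(self, power_d: float) -> float:
        """Drive voltage that yields the requested power, clamped to range."""
        power_d = min(max(power_d, self.min_power_d), self.max_power_d)
        span = self.max_power_d - self.min_power_d
        return self.max_voltage_v * (power_d - self.min_power_d) / span

def drive_assembly(target_power_d: float, fixed_portion_power_d: float,
                   lc: LCLens) -> float:
    """Split the focusing work between the fixed portion and the LC portion.

    The fixed optical element portion contributes constant power; whatever
    remains is requested from the LC lens.
    """
    residual = target_power_d - fixed_portion_power_d
    return lc.voltage_for_power(residual)

if __name__ == "__main__":
    lc = LCLens()
    volts = drive_assembly(target_power_d=2.0, fixed_portion_power_d=1.0, lc=lc)
    print("LC drive voltage:", round(volts, 2), "V")
```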
-
Publication number: 20210223861
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
Type: Application
Filed: January 25, 2021
Publication date: July 22, 2021
Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
-
Patent number: 10901502
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
Type: Grant
Filed: June 27, 2019
Date of Patent: January 26, 2021
Assignee: FACEBOOK, INC.
Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie
-
Publication number: 20200409457
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed for selectively rendering augmented reality content based on predictions regarding a user's ability to visually process the augmented reality content. For instance, the disclosed systems can identify eye tracking information for a user at an initial time. Moreover, the disclosed systems can predict a change in an ability of the user to visually process an augmented reality element at a future time based on the eye tracking information. Additionally, the disclosed systems can selectively render the augmented reality element at the future time based on the predicted change in the ability of the user to visually process the augmented reality element.
Type: Application
Filed: June 27, 2019
Publication date: December 31, 2020
Inventors: Mark Terrano, Ian Erkelens, Kevin James MacKenzie