Patents by Inventor Alexander G. Berardino
Alexander G. Berardino has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250104580
Abstract: Various implementations disclosed herein include devices, systems, and methods that present content items (e.g., movies, TV shows, home-made videos, etc.) on electronic devices such as HMDs. Some implementations adjust what is being displayed by the electronic devices to mitigate optical module-based artifacts (e.g., ghosting). For example, in an HMD with a catadioptric lens, a mirror layer may leak some light to produce ghosting artifacts that may be mitigated by adjusting brightness, dynamic range, contrast, light-spill, color, etc. Some implementations utilize adjustments that are based on content item awareness (e.g., adjustments based on the peak brightness of the scene in a movie that is being displayed within an extended reality (XR) environment). Some implementations provide adjustments based on environment awareness (e.g., how dark the surroundings or passthrough environment are) and/or optical module modeling.
Type: Application
Filed: September 9, 2024
Publication date: March 27, 2025
Inventors: Stanley K. Melax, Seyedpooya Mirhosseini, Fuyi Yang, Seyedkoosha Mirhosseini, Dagny Fleischman, David M. Cook, Yashas Rai Kurlethimar, Xin Wang, Travis W. Brown, Ara H. Aroyan, Jin Wook Chang, Abbas Haddadi, Yang Li, Alexander G. Berardino, Mengu Sukan, Ermal Dreshaj, Kyrollos Yanny, William W. Sprague
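By way of illustration only, a minimal sketch of how content- and environment-aware brightness clamping of this general kind could be structured. The Weber-style visibility criterion, the leakage constant, and every name below are hypothetical assumptions, not details from the application:

```python
# Hypothetical sketch: clamp content peak brightness so a predicted lens
# ghost (a fixed fraction of peak luminance) stays below visibility
# relative to the background. All constants are placeholders.

def max_safe_peak_nits(scene_peak_nits: float,
                       environment_nits: float,
                       mirror_leakage: float = 0.02,      # assumed: 2% leaks into a ghost
                       visibility_threshold: float = 0.5) -> float:
    """Clamp peak luminance so the predicted ghost (leakage * peak) stays
    below a Weber-like contrast threshold against the background."""
    background = max(environment_nits, 0.1)  # guard against a pitch-dark room
    safe_peak = visibility_threshold * background / mirror_leakage
    return min(scene_peak_nits, safe_peak)

def adjust_frame(pixel_nits: list[float], scene_peak_nits: float,
                 environment_nits: float) -> list[float]:
    """Scale per-pixel luminances so the frame's peak respects the clamp."""
    target_peak = max_safe_peak_nits(scene_peak_nits, environment_nits)
    gain = min(target_peak / max(scene_peak_nits, 1e-6), 1.0)
    return [p * gain for p in pixel_nits]
```

Under these placeholder constants, a dark passthrough environment makes the clamp bite hard (a 0.1-nit room limits peaks to a few nits), while a bright room leaves the content untouched.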
-
Publication number: 20240402800
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
Type: Application
Filed: May 29, 2024
Publication date: December 5, 2024
Inventors: Julian K. Shutzberg, David J. Meyer, David M. Teitelbaum, Mehmet N. Agaoglu, Ian R. Fasel, Chase B. Lortie, Daniel J. Brewer, Tim H. Cornelissen, Leah M. Gum, Alexander G. Berardino, Lorenzo Soto Doblado, Vinay Chawda, Itay Bar Yosef, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Paul A. Lacey, Ashwin Kumar Asoka Kumar Shenoi, Bhavin Vinodkumar Nayak, Liuhao Ge, Lucas Soffer, Victor Belyaev, Bharat C. Dandu, Matthias M. Schroeder, Yirong Tang
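As a sketch of the general idea only (the data structures, names, and touch margin are invented for the example, and the application covers far more than this), user activity might be routed to a UI element as either a direct or an indirect interaction:

```python
# Hypothetical sketch: classify user activity as a direct interaction
# (fingertip inside an element's bounds) or an indirect one (pinch + gaze).

from dataclasses import dataclass

@dataclass
class Element:
    name: str
    center: tuple[float, float, float]  # meters, world space
    radius: float                       # bounding-sphere radius

def _dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def route_interaction(elements, fingertip, gazed_element, pinching,
                      touch_margin=0.01):
    """Return (element, mode) for the interaction, or None."""
    # Direct: the hand is at (or within a small margin of) an element.
    for e in elements:
        if _dist(fingertip, e.center) <= e.radius + touch_margin:
            return e, "direct"
    # Indirect: a pinch while the gaze rests on an element.
    if pinching and gazed_element is not None:
        return gazed_element, "indirect"
    return None
```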
-
Publication number: 20240402802
Abstract: Various implementations disclosed herein include devices, systems, and methods that adjust a brightness characteristic of virtual content (e.g., virtual objects) and/or real content (e.g., passthrough video) in views of an XR environment provided by a head mounted device (HMD). The brightness characteristic may be adjusted based on determining a viewing state (e.g., a user's eye perception/adaptation state). A viewing state, such as a user's eye perception/adaptation state while viewing a view of an XR environment via an HMD, may respond to a brightness characteristic of the XR environment that the user is seeing, which is not necessarily the brightness characteristic of the physical environment upon which the view is wholly or partially based.
Type: Application
Filed: June 4, 2024
Publication date: December 5, 2024
Inventors: Travis W. Brown, Seyedkoosha Mirhosseini, John Samuel Bushell, Alexander G. Berardino, David M. Cook, Jim J. Tilander, Ryan W. Baker
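A minimal sketch, assuming a first-order exponential adaptation model that the publication does not itself specify, of how a pipeline might track the eye's adaptation to what the display shows (rather than to the physical room) and budget brightness against it:

```python
# Hypothetical sketch: track an estimate of the viewer's adaptation level
# to displayed XR content, with simple first-order dynamics.

import math

class AdaptationState:
    def __init__(self, initial_nits: float = 100.0, time_constant_s: float = 20.0):
        self.adapted_nits = initial_nits
        self.tau = time_constant_s          # assumed time constant

    def update(self, displayed_mean_nits: float, dt: float) -> float:
        """Move the estimate toward the luminance the user actually sees."""
        alpha = 1.0 - math.exp(-dt / self.tau)
        self.adapted_nits += alpha * (displayed_mean_nits - self.adapted_nits)
        return self.adapted_nits

    def comfortable_peak(self, headroom: float = 8.0) -> float:
        """Cap peaks at a multiple of the adapted level so a suddenly
        bright frame is not glaring."""
        return self.adapted_nits * headroom
```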
-
Publication number: 20240393876
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
Type: Application
Filed: July 31, 2024
Publication date: November 28, 2024
Inventors: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Berardino
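For illustration, a sketch of the time-window association the abstract describes: a pinch is paired with the nearest-in-time gaze fixation, and only fixations long enough to plausibly reflect intentional looking are eligible. All thresholds and names are hypothetical:

```python
# Hypothetical sketch: pair a pinch with the gaze fixation nearest in time,
# ignoring fixations too brief to reflect intentional looking.

from dataclasses import dataclass

@dataclass
class GazeEvent:
    target: str      # name of the UI element the gaze rested on
    start_s: float
    end_s: float

def associate_pinch(pinch_time_s, gaze_events,
                    max_gap_s=0.3, min_fixation_s=0.1):
    """Return the UI element a pinch most plausibly targeted, or None."""
    best, best_gap = None, max_gap_s
    for ev in gaze_events:
        if ev.end_s - ev.start_s < min_fixation_s:
            continue                       # too brief to be intentional
        # Distance in time from the pinch to the fixation interval
        # (zero if the pinch happened during the fixation).
        gap = max(ev.start_s - pinch_time_s, pinch_time_s - ev.end_s, 0.0)
        if gap <= best_gap:
            best, best_gap = ev.target, gap
    return best
```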
-
Publication number: 20240355246
Abstract: A head-mounted device may include a camera that captures a live video feed of an environment. A display in the head-mounted device may display passthrough display content that includes the live video feed. Control circuitry may dynamically adjust a maximum allowable brightness for the passthrough display content during operation of the head-mounted device. Upon an initial donning of the head-mounted device, the passthrough display content may be permitted to use most or all of the achievable brightness range of the display. After a given time period, the user may be adapted to the brightness range of the display and the maximum allowable brightness for the passthrough display content may be reduced to allow additional headroom for rendered display content. The control circuitry may continue to adjust tone mappings for passthrough display content and rendered display content based on whether the display content favors real-world content or virtual content.
Type: Application
Filed: July 2, 2024
Publication date: October 24, 2024
Inventors: Elizabeth Pieri, Teun R. Baar, Alexander G. Berardino, David M. Cook, Peng Liu, Zuo Xia
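A minimal sketch of a donning-time brightness ceiling with this general shape; the linear ramp and every constant below are placeholders rather than values from the publication:

```python
# Hypothetical sketch: ceiling on passthrough brightness as a function of
# time since the headset was put on. Durations and fractions are placeholders.

def passthrough_ceiling_nits(seconds_since_donning: float,
                             display_peak_nits: float = 1000.0,
                             settled_fraction: float = 0.6,
                             adaptation_period_s: float = 120.0) -> float:
    """Ramp from the full display peak down to a settled level, leaving
    headroom for rendered (virtual) content once the user has adapted."""
    t = min(seconds_since_donning / adaptation_period_s, 1.0)
    fraction = 1.0 + t * (settled_fraction - 1.0)
    return display_peak_nits * fraction
```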
-
Patent number: 12099653
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
Type: Grant
Filed: September 11, 2023
Date of Patent: September 24, 2024
Assignee: Apple Inc.
Inventors: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Berardino
-
Patent number: 12067909
Abstract: A head-mounted device may include a camera that captures a live video feed of an environment. A display in the head-mounted device may display passthrough display content that includes the live video feed. Control circuitry may dynamically adjust a maximum allowable brightness for the passthrough display content during operation of the head-mounted device. Upon an initial donning of the head-mounted device, the passthrough display content may be permitted to use most or all of the achievable brightness range of the display. After a given time period, the user may be adapted to the brightness range of the display and the maximum allowable brightness for the passthrough display content may be reduced to allow additional headroom for rendered display content. The control circuitry may continue to adjust tone mappings for passthrough display content and rendered display content based on whether the display content favors real-world content or virtual content.
Type: Grant
Filed: October 19, 2023
Date of Patent: August 20, 2024
Assignee: Apple Inc.
Inventors: Elizabeth Pieri, Teun R. Baar, Alexander G. Berardino, David M. Cook, Peng Liu, Zuo Xia
-
Publication number: 20240203306
Abstract: A head-mounted device may include a camera that captures a live video feed of an environment. A display in the head-mounted device may display passthrough display content that includes the live video feed. Control circuitry may dynamically adjust a maximum allowable brightness for the passthrough display content during operation of the head-mounted device. Upon an initial donning of the head-mounted device, the passthrough display content may be permitted to use most or all of the achievable brightness range of the display. After a given time period, the user may be adapted to the brightness range of the display and the maximum allowable brightness for the passthrough display content may be reduced to allow additional headroom for rendered display content. The control circuitry may continue to adjust tone mappings for passthrough display content and rendered display content based on whether the display content favors real-world content or virtual content.
Type: Application
Filed: October 19, 2023
Publication date: June 20, 2024
Inventors: Elizabeth Pieri, Teun R. Baar, Alexander G. Berardino, David M. Cook, Peng Liu, Zuo Xia
-
Publication number: 20230418372
Abstract: Various implementations disclosed herein include devices, systems, and methods that determine a gaze behavior state to identify gaze shifting events, gaze holding events, and loss events of a user based on physiological data. For example, a process may include obtaining eye data associated with a gaze during a first period of time (e.g., eye position and velocity, interpupillary distance, pupil diameters, etc.). The process may further include obtaining head data associated with the gaze during the first period of time (e.g., head position and velocity). The process may further include determining a first gaze behavior state during the first period of time to identify gaze shifting events, gaze holding events, and loss events (e.g., one or more gaze and head pose characteristics may be determined, aggregated, and used to classify the user's eye movement state using machine learning techniques).
Type: Application
Filed: June 20, 2023
Publication date: December 28, 2023
Inventors: Mehmet N. Agaoglu, Andrew B. Watson, Tim H. Cornelissen, Alexander G. Berardino
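The abstract contemplates machine-learned classification; purely as a stand-in, a sketch using fixed velocity thresholds shows how eye and head signals might jointly separate shifts, holds, and losses. All thresholds are invented:

```python
# Hypothetical sketch: threshold-based gaze-state labeling. The publication
# describes aggregating gaze/head characteristics for a learned classifier;
# the fixed thresholds here are a simplification.

def classify_gaze_window(eye_in_head_velocity_dps: float,
                         head_velocity_dps: float,
                         eye_confidence: float,
                         shift_threshold_dps: float = 30.0,
                         min_confidence: float = 0.5) -> str:
    """Label one window of samples as 'shift', 'hold', or 'loss'.
    Velocities are in degrees per second."""
    if eye_confidence < min_confidence:
        return "loss"    # e.g., blink or tracker failure
    # World-frame gaze velocity is roughly eye-in-head plus head velocity;
    # during vestibulo-ocular stabilization the two cancel, giving a hold.
    gaze_velocity = abs(eye_in_head_velocity_dps + head_velocity_dps)
    if gaze_velocity >= shift_threshold_dps:
        return "shift"   # saccade-like gaze shift
    return "hold"        # fixation or stabilized gaze
```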
-
Patent number: 10890968
Abstract: An electronic device may have a foveated display, an eye-tracking system, and a head movement detection system. The eye-tracking system may gather information on a user's point of regard on the display while the head movement detection system may capture information regarding the rotation of the observer's head. Based on the point-of-regard information, head rotation information, image data, the type of eye/head movement that is underway, and/or tiredness information, control circuitry in the electronic device may produce image data for the display, with areas of different resolutions and/or visual quality. A full-resolution and/or full-quality portion of the image may overlap the point of regard. One or more lower-resolution portions of the image may surround the full-resolution portion. The control circuitry may include a gaze prediction system for predicting the movement of the user's gaze during a saccade.
Type: Grant
Filed: April 5, 2019
Date of Patent: January 12, 2021
Assignee: Apple Inc.
Inventors: Yashas Rai Kurlethimar, Andrew B. Watson, Nicolas P. Bonnier, Mehmet N. Agaoglu, Alexander G. Berardino, Elijah H. Kleeman
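Purely illustratively, one way a renderer could turn a (possibly saccade-predicted) point of regard into per-tile resolution levels; the eccentricity bands and the widened foveal radius during a saccade are invented for the sketch:

```python
# Hypothetical sketch: pick a resolution level per screen tile from its
# angular eccentricity relative to the point of regard.

import math

def eccentricity_deg(tile_center_m, gaze_point_m, viewer_distance_m=1.0):
    """Visual angle between a tile and the gaze point on the display plane."""
    dx = tile_center_m[0] - gaze_point_m[0]
    dy = tile_center_m[1] - gaze_point_m[1]
    return math.degrees(math.atan2(math.hypot(dx, dy), viewer_distance_m))

def resolution_level(ecc_deg: float, saccade_in_progress: bool = False) -> int:
    """0 = full resolution; larger = coarser. A predicted saccade widens the
    full-resolution region toward the expected landing point."""
    foveal_radius = 10.0 if saccade_in_progress else 5.0
    if ecc_deg <= foveal_radius:
        return 0
    if ecc_deg <= 20.0:
        return 1
    return 2
```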
-
Publication number: 20190339770
Abstract: An electronic device may have a foveated display, an eye-tracking system, and a head movement detection system. The eye-tracking system may gather information on a user's point of regard on the display while the head movement detection system may capture information regarding the rotation of the observer's head. Based on the point-of-regard information, head rotation information, image data, the type of eye/head movement that is underway, and/or tiredness information, control circuitry in the electronic device may produce image data for the display, with areas of different resolutions and/or visual quality. A full-resolution and/or full-quality portion of the image may overlap the point of regard. One or more lower-resolution portions of the image may surround the full-resolution portion. The control circuitry may include a gaze prediction system for predicting the movement of the user's gaze during a saccade.
Type: Application
Filed: April 5, 2019
Publication date: November 7, 2019
Inventors: Yashas Rai Kurlethimar, Andrew B. Watson, Nicolas P. Bonnier, Mehmet N. Agaoglu, Alexander G. Berardino, Elijah H. Kleeman