Patents by Inventor Bryce L. Schmidtchen

Bryce L. Schmidtchen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11983316
    Abstract: In one implementation, a method is performed for selecting a UI element with eye tracking-based attention accumulators. The method includes: while a first UI element is currently selected, detecting a gaze direction directed to a second UI element; in response to detecting the gaze direction directed to the second UI element, decreasing a first attention accumulator value associated with the first UI element and increasing a second attention accumulator value associated with the second UI element; in accordance with a determination that the second attention accumulator value exceeds the first attention accumulator value, deselecting the first UI element and selecting the second UI element; and in accordance with a determination that the second attention accumulator value does not exceed the first attention accumulator value, maintaining selection of the first UI element.
    Type: Grant
    Filed: January 12, 2023
    Date of Patent: May 14, 2024
    Assignee: Apple Inc.
    Inventors: Gregory Lutter, Bryce L. Schmidtchen
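    The selection logic in this abstract lends itself to a short illustration. The following Swift sketch is not from the patent; the class name, gain and decay rates, and per-frame update loop are assumptions chosen only to show how attention accumulators could arbitrate between two UI elements.
    ```swift
    import Foundation

    // Hypothetical attention-accumulator arbitration: one value per UI element,
    // grown while gazed at and decayed otherwise. Rates are illustrative.
    final class AttentionSelector {
        private(set) var accumulators: [String: Double] = [:]
        private(set) var selectedElement: String?

        private let gainPerSecond = 1.0
        private let decayPerSecond = 0.5

        /// Call once per frame with the UI element the gaze is currently directed to.
        func update(gazedElement: String, deltaTime: TimeInterval) {
            if accumulators[gazedElement] == nil { accumulators[gazedElement] = 0 }

            // Increase the gazed element's accumulator and decrease all others.
            for key in Array(accumulators.keys) {
                let delta = (key == gazedElement ? gainPerSecond : -decayPerSecond) * deltaTime
                accumulators[key] = max(0, (accumulators[key] ?? 0) + delta)
            }

            // Deselect and reselect only when the gazed element's value exceeds
            // the currently selected element's value; otherwise keep the selection.
            guard let current = selectedElement else {
                selectedElement = gazedElement
                return
            }
            if gazedElement != current,
               accumulators[gazedElement, default: 0] > accumulators[current, default: 0] {
                selectedElement = gazedElement
            }
        }
    }
    ```
    Calling update(gazedElement:deltaTime:) once per frame reproduces the described behavior: selection moves to the second element only once its accumulator exceeds the first element's, and is otherwise maintained.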
  • Publication number: 20240104872
    Abstract: Various implementations provide a view of a 3D environment including a portal for viewing a stereo item (e.g., a photo or video) positioned a distance behind the portal. One or more visual effects are provided based on texture of one or more portions of the stereo item, e.g., texture at cutoff or visible edges of the stereo item. The effects change the appearance of the stereo item or the portal itself, e.g., mitigating visual comfort issues by minimizing window violations or otherwise enhancing the viewing experience. Various implementations provide a view of a 3D environment including an immersive view of a stereo item without using a portal. Such a visual effect may be provided to partially obscure the surrounding 3D environment.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Bryce L. Schmidtchen, Bryan Cline, Charles O. Goddard, Michael I. Weinstein, Tsao-Wei Huang, Tobias Rick, Vedant Saran, Alexander Menzies, Alexandre Da Veiga
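    One way to read the edge-based effect described above is as a heuristic that samples the stereo item's texture along a cutoff edge and softens the clip when content is visible there. The Swift sketch below is a guess at that general idea, not the disclosed implementation; the luminance input, widths, and linear mapping are all assumptions.
    ```swift
    // Hypothetical heuristic: sample the stereo item's texture along a cutoff edge and,
    // if content is present there (a likely window violation), widen an edge fade.
    func featherWidth(edgeLuminances: [Double],
                      baseWidth: Double = 0.02,
                      maxWidth: Double = 0.10) -> Double {
        guard !edgeLuminances.isEmpty else { return baseWidth }
        let meanLuminance = edgeLuminances.reduce(0, +) / Double(edgeLuminances.count)
        // Brighter (more visible) edge content leads to a wider, softer fade.
        return baseWidth + (maxWidth - baseWidth) * min(max(meanLuminance, 0), 1)
    }
    ```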
  • Publication number: 20240104843
    Abstract: In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by reducing visual prominence of one or more portions of the virtual object. In some embodiments, a computer system adjusts the visibility of one or more virtual objects in a three-dimensional environment by applying a visual effect to the one or more virtual objects in response to detecting one or more portions of a user. In some embodiments, a computer system modifies visual prominence in accordance with a level of engagement with a virtual object.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Christopher D. McKenzie, Benjamin Hylak, Conner J. Brooks, Adrian P. Lindberg, Bryce L. Schmidtchen
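    As an illustration of the first and third behaviors in this abstract, the hypothetical Swift model below reduces a virtual object's opacity with the depth of its conflict against a physical object and restores part of that prominence as user engagement rises. The depth scale, opacity floor, and blend are assumptions, not values from the filing.
    ```swift
    // Hypothetical prominence rule: deeper penetration into a physical object lowers
    // opacity, and higher user engagement restores some of that prominence.
    struct ProminenceModel {
        var maxConflictDepth = 0.10   // metres of penetration at which opacity bottoms out
        var minimumOpacity = 0.2

        /// `conflictDepth` in metres, `engagement` normalised to 0...1.
        func opacity(conflictDepth: Double, engagement: Double) -> Double {
            let conflict = min(max(conflictDepth / maxConflictDepth, 0), 1)
            let reduced = 1.0 - conflict * (1.0 - minimumOpacity)
            // Blend part of the way back toward full opacity as engagement rises.
            let e = min(max(engagement, 0), 1)
            return reduced + (1.0 - reduced) * e * 0.5
        }
    }
    ```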
  • Publication number: 20240019928
    Abstract: Various implementations disclosed herein include devices, systems, and methods for using a gaze vector and head pose information to effectuate a user interaction with a virtual object. In some implementations, a device includes a sensor for sensing a head pose of a user, a display, one or more processors, and a memory. In various implementations, a method includes displaying a set of virtual objects. Based on a gaze vector, it is determined that a gaze of the user is directed to a first virtual object of the set of virtual objects. A head pose value corresponding to the head pose of the user is obtained. An action relative to the first virtual object is performed based on the head pose value satisfying a head pose criterion.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 18, 2024
    Inventors: Thomas G. Salter, Anshu K. Chimalamarri, Bryce L. Schmidtchen, Devin W. Chalmers, Gregory Lutter
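    The gaze-plus-head-pose interaction described above can be sketched as a two-stage check: a gaze ray picks the target, and a head-pose value confirms the action. In the hypothetical Swift sketch below, the sphere-intersection test, the nod-style pitch criterion, and the threshold are assumptions made for illustration.
    ```swift
    import simd

    struct VirtualObject {
        let id: String
        let center: SIMD3<Double>
        let radius: Double
    }

    /// Return the nearest object whose bounding sphere the gaze ray passes through.
    /// Assumes `direction` is unit length.
    func gazedObject(origin: SIMD3<Double>,
                     direction: SIMD3<Double>,
                     among objects: [VirtualObject]) -> VirtualObject? {
        var best: (object: VirtualObject, t: Double)?
        for object in objects {
            let toCenter = object.center - origin
            let t = simd_dot(toCenter, direction)   // closest approach along the ray
            guard t > 0 else { continue }           // ignore objects behind the user
            let closestPoint = origin + direction * t
            if simd_length(closestPoint - object.center) <= object.radius,
               best == nil || t < best!.t {
                best = (object, t)
            }
        }
        return best?.object
    }

    /// Perform the action only when an object is gazed at and the head-pose value
    /// satisfies the (illustrative) nod criterion.
    func shouldActivate(target: VirtualObject?,
                        headPitchDeltaDegrees: Double,
                        nodThreshold: Double = 10) -> Bool {
        target != nil && headPitchDeltaDegrees >= nodThreshold
    }
    ```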
  • Publication number: 20230394755
    Abstract: A method includes presenting a representation of a three-dimensional (3D) environment from a current point-of-view. The method includes identifying a region of interest within the 3D environment. The region of interest is located at a first distance from the current point-of-view. The method includes receiving, via the audio sensor, an audible signal and converting the audible signal to audible signal data. The method includes displaying, on the display, a visual representation of the audible signal data at a second distance from the current point-of-view that is a function of the first distance between the region of interest and the current point-of-view.
    Type: Application
    Filed: May 31, 2023
    Publication date: December 7, 2023
    Inventors: Ioana Negoita, Alesha Unpingco, Bryce L. Schmidtchen, Devin W. Chalmers, Lee Sparks, Thomas J. Moore
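    The abstract only states that the visualization's distance is a function of the distance to the region of interest. A hypothetical Swift example of one such function, with an invented 0.9 scale factor and clamp limits:
    ```swift
    // Place the visualization of the audible signal slightly in front of the region
    // of interest, clamped to a comfortable range (all values illustrative).
    func visualizationDistance(regionOfInterestDistance: Double,
                               nearLimit: Double = 0.5,
                               farLimit: Double = 5.0) -> Double {
        min(max(regionOfInterestDistance * 0.9, nearLimit), farLimit)
    }
    ```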
  • Publication number: 20230386093
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, and a display. The method includes, while displaying, on the display, the computer-generated content according to a first locked mode, determining that the electronic device changes from a first distance to a physical surface to a second distance from the physical surface. The method includes, in accordance with a determination that the second distance satisfies a locked mode change criterion, changing display of the computer-generated content from the first locked mode to a second locked mode. The method includes, in accordance with a determination that the second distance does not satisfy the locked mode change criterion, maintaining display of the computer-generated content according to the first locked mode. Examples of the locked mode change criterion include an occlusion criterion and a remoteness criterion.
    Type: Application
    Filed: May 22, 2023
    Publication date: November 30, 2023
    Inventors: Gregory Lutter, Bryce L. Schmidtchen, Rahul Nair
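    A hypothetical Swift sketch of the locked-mode decision described above, treating the change criterion as either a remoteness test or an occlusion test; the mode names and the distance threshold are assumptions, not values from the filing.
    ```swift
    enum LockedMode { case surfaceLocked, worldLocked }

    // Switch away from the first (surface-locked) mode when the device is far enough
    // from the physical surface or the surface is occluded; otherwise keep the mode.
    func nextLockedMode(current: LockedMode,
                        distanceToSurface: Double,
                        surfaceOccluded: Bool,
                        remotenessThreshold: Double = 2.0) -> LockedMode {
        let criterionSatisfied = surfaceOccluded || distanceToSurface > remotenessThreshold
        return criterionSatisfied ? .worldLocked : current
    }
    ```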
  • Publication number: 20230336865
    Abstract: The present disclosure generally relates to techniques and user interfaces for capturing media, displaying a preview of media, displaying a recording indicator, displaying a camera user interface, and/or displaying previously captured media.
    Type: Application
    Filed: November 22, 2022
    Publication date: October 19, 2023
    Inventors: Alexandre Da Veiga, Lee S. Broughton, Angel Suet Yan Cheung, Stephen O. Lemay, Chia Yang Lin, Behkish J. Manzari, Ivan Markovic, Alexander Menzies, Aaron Moring, Jonathan Ravasz, Tobias Rick, Bryce L. Schmidtchen, William A. Sorrentino, III
  • Publication number: 20230333642
    Abstract: A method includes displaying a plurality of visual elements. The method includes determining, based on respective characteristic values of the plurality of visual elements, an expected gaze target that indicates a first display region where a user of the device intends to gaze while the plurality of visual elements is being displayed. The method includes obtaining, via the image sensor, an image that includes a set of pixels corresponding to a pupil of the user of the device. The method includes determining, by a gaze tracker, based on the set of pixels corresponding to the pupil, a measured gaze target that indicates a second display region where the user is measuredly gazing. The method includes adjusting a calibration parameter of the gaze tracker based on a difference between the first display region indicated by the expected gaze target and the second display region indicated by the measured gaze target.
    Type: Application
    Filed: March 1, 2023
    Publication date: October 19, 2023
    Inventors: Anshu K. Chimalamarri, Bryce L. Schmidtchen, Gregory Lutter
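    The calibration step described above can be illustrated with a simple additive offset model: nudge the gaze tracker's offset along the difference between the expected and measured gaze targets. In the Swift sketch below, the 2D offset representation and the learning rate are assumptions made for illustration.
    ```swift
    import simd

    // Hypothetical calibration update: step a 2D gaze offset toward the expected
    // target whenever the measured gaze lands elsewhere.
    struct GazeCalibration {
        private(set) var offset = SIMD2<Double>(0, 0)

        mutating func update(expectedTarget: SIMD2<Double>,
                             measuredTarget: SIMD2<Double>,
                             learningRate: Double = 0.1) {
            // Move the offset a small step along the expected-minus-measured difference.
            offset += (expectedTarget - measuredTarget) * learningRate
        }
    }
    ```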
  • Publication number: 20230333643
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, an eye tracker, and a display. The eye tracker receives eye tracking data associated with one or more eyes of a user of the electronic device. The method includes displaying, on the display, a first user interface (UI) element that is associated with a first selection region and a second selection region. While displaying the first UI element, the method includes determining, based on the eye tracking data, that a first targeting criterion is satisfied with respect to the first selection region, and determining, based on the eye tracking data, that a second targeting criterion is satisfied with respect to the second selection region. The method includes selecting the first UI element based at least in part on determining that the first targeting criterion is satisfied and the second targeting criterion is satisfied.
    Type: Application
    Filed: March 6, 2023
    Publication date: October 19, 2023
    Inventors: Bryce L. Schmidtchen, Ioana Negoita, Anshu K. Chimalamarri, Gregory Lutter, Thomas J. Moore, Trevor J. McIntyre
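    A hypothetical Swift sketch of the two-region targeting described above, modelling each targeting criterion as a gaze dwell time; the types and the dwell thresholds are assumptions rather than anything specified in the filing.
    ```swift
    import Foundation

    // A UI element is selected only after the eye tracking data satisfies a targeting
    // criterion (modelled here as a dwell time) for each of its two selection regions.
    struct SelectionRegion {
        let requiredDwell: TimeInterval
        private(set) var dwellTime: TimeInterval = 0

        var criterionSatisfied: Bool { dwellTime >= requiredDwell }

        mutating func accumulate(_ deltaTime: TimeInterval) { dwellTime += deltaTime }
    }

    struct TwoRegionElement {
        var firstRegion: SelectionRegion
        var secondRegion: SelectionRegion

        mutating func accumulateGaze(inFirst: Bool, inSecond: Bool, deltaTime: TimeInterval) {
            if inFirst { firstRegion.accumulate(deltaTime) }
            if inSecond { secondRegion.accumulate(deltaTime) }
        }

        /// Select the element only when both targeting criteria are satisfied.
        var shouldSelect: Bool { firstRegion.criterionSatisfied && secondRegion.criterionSatisfied }
    }
    ```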
  • Publication number: 20230244308
    Abstract: In one implementation, a method is performed for selecting a UI element with eye tracking-based attention accumulators. The method includes: while a first UI element is currently selected, detecting a gaze direction directed to a second UI element; in response to detecting the gaze direction directed to the second UI element, decreasing a first attention accumulator value associated with the first UI element and increasing a second attention accumulator value associated with the second UI element; in accordance with a determination that the second attention accumulator value exceeds the first attention accumulator value, deselecting the first UI element and selecting the second UI element; and in accordance with a determination that the second attention accumulator value does not exceed the first attention accumulator value, maintaining selection of the first UI element.
    Type: Application
    Filed: January 12, 2023
    Publication date: August 3, 2023
    Inventors: Gregory Lutter, Bryce L. Schmidtchen