Patents by Inventor Steven A. Marchette

Steven A. Marchette has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Brief, simplified code sketches of several of the described methods appear after the listing.

  • Patent number: 12210676
    Abstract: Various implementations disclosed herein include devices, systems, and methods that assess physiological data of a user and a scene understanding of a physical environment to determine a retention state. For example, an example process may include obtaining physiological data in an environment during a first period of time. The process may further include identifying one or more of the objects in the environment based on determining a scene understanding of the environment. The process may further include determining, based on the physiological data and the scene understanding, features associated with interaction events for the one or more objects. The process may further include determining, based on the features, a retention state during the first period of time, the retention state associated with retention of a concept associated with an object of the one or more objects. The process may further include providing feedback based on identifying the retention state.
    Type: Grant
    Filed: September 8, 2023
    Date of Patent: January 28, 2025
    Assignee: Apple Inc.
    Inventors: Srinath Nizampatnam, Mary A. Pyc, Steven A. Marchette, Izzet B. Yildiz, Grant H. Mulliken
  • Publication number: 20240212291
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide a view of an environment with a visual effect based on a determined objective. For example, an example process may include presenting a view of an environment including one or more objects during a communication session involving a second device. The process may further include determining, during the communication session, an objective of a second user of the second device corresponding to identifying a featured object in the environment. The process may further include, in accordance with the determined objective, providing a view of the environment with a visual effect.
    Type: Application
    Filed: December 7, 2023
    Publication date: June 27, 2024
    Inventors: Divine Keetle-Maloney, Steven A. Marchette, David Hobbins
  • Publication number: 20240212343
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide a view of a physical environment with a visual effect based on determining whether an object is a searched-for object. For example, an example process may include determining an objective corresponding to identifying a searched-for object in a physical environment including one or more objects. The process may further include obtaining depth-based data based on sensor data captured by one or more sensors in the physical environment. The process may further include determining, based on the determined objective and the depth-based data, whether a first object in the physical environment is the searched-for object. The process may further include, in accordance with determining whether the first object is the searched-for object, providing a view of the physical environment with a visual effect.
    Type: Application
    Filed: December 4, 2023
    Publication date: June 27, 2024
    Inventors: Divine Keetle-Maloney, Steven A. Marchette, David Hobbins
  • Publication number: 20240203276
    Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
    Type: Application
    Filed: April 6, 2022
    Publication date: June 20, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
  • Publication number: 20240193858
    Abstract: In one implementation, a method of assisting in the rehearsal of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a difficulty level for a rehearsal of a presentation. The method includes displaying, on the display, one or more slides of the presentation. The method includes displaying, on the display in association with a volumetric environment, one or more virtual objects based on the difficulty level.
    Type: Application
    Filed: April 11, 2022
    Publication date: June 13, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
  • Publication number: 20240104864
    Abstract: Embodiments are directed to aspects of the extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of the first region of the physical environment where an identified object is replaced with the additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20240104838
    Abstract: Embodiments are directed to aspects of the extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of the first region of the physical environment where an identified object is replaced with the additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20240104792
    Abstract: Embodiments are directed to aspects of the extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of the first region of the physical environment where an identified object is replaced with the additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
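
The sketches below illustrate, in simplified form, several of the methods described in the listing above; they are rough interpretations of the abstracts, not the claimed implementations, and every class, field, and threshold is a hypothetical stand-in. The first sketch follows the retention-state idea of patent 12210676: physiological signals and a scene understanding are combined into per-object interaction features, which are then mapped to a coarse retention state that can drive feedback.

```python
# Minimal sketch of the retention-state flow in patent 12210676.
# All names and thresholds are illustrative assumptions, not the patented method.
from dataclasses import dataclass
from typing import List

@dataclass
class InteractionEvent:
    object_label: str        # label from the scene understanding (e.g., "whiteboard")
    gaze_dwell_s: float      # how long gaze rested on the object
    pupil_dilation: float    # normalized physiological response during the event
    revisits: int            # number of times attention returned to the object

def retention_state(events: List[InteractionEvent], label: str) -> str:
    """Classify retention of the concept associated with one object."""
    relevant = [e for e in events if e.object_label == label]
    if not relevant:
        return "no-exposure"
    # Heuristic score: longer dwell, stronger response, and more revisits
    # suggest the associated concept is more likely to be retained.
    score = sum(e.gaze_dwell_s * (1 + e.pupil_dilation) + 0.5 * e.revisits
                for e in relevant)
    if score > 6.0:
        return "retained"
    return "at-risk"         # candidate for feedback, e.g., a review prompt

if __name__ == "__main__":
    events = [
        InteractionEvent("whiteboard", gaze_dwell_s=4.2, pupil_dilation=0.3, revisits=2),
        InteractionEvent("plant", gaze_dwell_s=0.4, pupil_dilation=0.1, revisits=0),
    ]
    print(retention_state(events, "whiteboard"))  # retained
    print(retention_state(events, "plant"))       # at-risk
```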
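
The next sketch covers the objective-driven visual effects of publications 20240212291 and 20240212343: an objective names a featured or searched-for object, detections (optionally filtered by depth-based data) are matched against it, and matching objects receive a visual effect in the rendered view. Labels, thresholds, and effect names are assumptions made for illustration.

```python
# Hedged sketch of objective-based highlighting (publications 20240212291 / 20240212343).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    label: str
    depth_m: float           # distance from the device, from depth-based sensor data
    confidence: float

def render_effects(objective_label: str, detections: List[Detection],
                   max_depth_m: Optional[float] = 5.0) -> List[str]:
    """Return a per-detection effect decision for the view of the environment."""
    decisions = []
    for d in detections:
        matches = d.label == objective_label and d.confidence >= 0.6
        if max_depth_m is not None:
            matches = matches and d.depth_m <= max_depth_m   # ignore distant candidates
        effect = "highlight-glow" if matches else "no effect"
        decisions.append(f"{d.label}@{d.depth_m:.1f}m -> {effect}")
    return decisions

if __name__ == "__main__":
    scene = [Detection("keys", 1.2, 0.9), Detection("mug", 0.8, 0.95),
             Detection("keys", 7.5, 0.7)]
    for line in render_effects("keys", scene):
        print(line)
```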
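
For publication 20240203276, the sketch below assumes per-audience-member data takes the form of an attention estimate and that the virtual objects are simple indicator icons plus an aggregate meter; none of these specifics come from the abstract.

```python
# Simplified sketch of the audience-feedback idea in publication 20240203276.
from dataclasses import dataclass
from typing import List

@dataclass
class AudienceMember:
    name: str
    attention: float         # 0.0 (disengaged) .. 1.0 (fully engaged)

def feedback_objects(audience: List[AudienceMember]) -> List[str]:
    """Choose a virtual object to display in association with each audience member."""
    objects = []
    for member in audience:
        if member.attention < 0.4:
            objects.append(f"{member.name}: wandering-attention icon")
        else:
            objects.append(f"{member.name}: engaged icon")
    # An aggregate meter summarizes the room for the presenter.
    avg = sum(m.attention for m in audience) / max(len(audience), 1)
    objects.append(f"room engagement meter: {avg:.0%}")
    return objects

if __name__ == "__main__":
    room = [AudienceMember("seat 1", 0.8), AudienceMember("seat 2", 0.3)]
    for obj in feedback_objects(room):
        print(obj)
```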
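
The rehearsal sketch below follows publication 20240193858: a chosen difficulty level determines which virtual objects (here, simulated audience behaviors) are placed into the volumetric environment alongside the slides. The levels and behaviors are invented for illustration only.

```python
# Rough sketch of difficulty-driven rehearsal content (publication 20240193858).
import random

DISTRACTIONS_BY_LEVEL = {
    "easy":   ["attentive listener"],
    "medium": ["attentive listener", "phone checker", "whisperer"],
    "hard":   ["attentive listener", "phone checker", "whisperer",
               "hostile questioner", "walk-out"],
}

def build_rehearsal_audience(difficulty: str, seats: int, seed: int = 0) -> list:
    """Pick a virtual audience-member behavior for each seat."""
    rng = random.Random(seed)                 # deterministic, so rehearsals are repeatable
    pool = DISTRACTIONS_BY_LEVEL[difficulty]
    return [rng.choice(pool) for _ in range(seats)]

if __name__ == "__main__":
    print(build_rehearsal_audience("easy", 4))
    print(build_rehearsal_audience("hard", 4))
```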
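
Finally, a minimal sketch of the focus-mode behavior shared by publications 20240104864, 20240104838, and 20240104792: while focus mode is active, identified distracting objects in the reproduced region are swapped for replacement content. The distractor list and the replacement object are assumptions.

```python
# Minimal sketch of focus-mode object replacement (publications 20240104864/838/792).
from dataclasses import dataclass
from typing import List

DISTRACTORS = {"television", "phone", "window"}   # assumed set of distracting object labels

@dataclass
class SceneObject:
    label: str
    region: str              # which region of the physical environment it falls in

def reproduce_region(objects: List[SceneObject], region: str,
                     focus_mode: bool) -> List[str]:
    """Build the list of items rendered for one region of the passthrough view."""
    rendered = []
    for obj in (o for o in objects if o.region == region):
        if focus_mode and obj.label in DISTRACTORS:
            rendered.append("calm-panel")     # additional content replacing the distractor
        else:
            rendered.append(obj.label)        # non-distracting objects pass through unchanged
    return rendered

if __name__ == "__main__":
    scene = [SceneObject("desk", "front"), SceneObject("television", "front"),
             SceneObject("phone", "front"), SceneObject("bookshelf", "left")]
    print(reproduce_region(scene, "front", focus_mode=True))   # ['desk', 'calm-panel', 'calm-panel']
    print(reproduce_region(scene, "front", focus_mode=False))  # ['desk', 'television', 'phone']
```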