Patents by Inventor Mary A. Pyc

Mary A. Pyc has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12210676
    Abstract: Various implementations disclosed herein include devices, systems, and methods that assess physiological data of a user and a scene understanding of a physical environment to determine a retention state. For example, an example process may include obtaining physiological data in an environment during a first period of time. The process may further include identifying one or more of the objects in the environment based on determining a scene understanding of the environment. The process may further include determining, based on the physiological data and the scene understanding, features associated with interaction events for the one or more objects. The process may further include determining, based on the features, a retention state during the first period of time, the retention state associated with retention of a concept associated with an object of the one or more objects. The process may further include providing feedback based on identifying the retention state.
    Type: Grant
    Filed: September 8, 2023
    Date of Patent: January 28, 2025
    Assignee: Apple Inc.
    Inventors: Srinath Nizampatnam, Mary A. Pyc, Steven A. Marchette, Izzet B. Yildiz, Grant H. Mulliken
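The abstract above is only a high-level claim summary, but the described pipeline (physiological data → scene understanding → per-object interaction features → retention state → feedback) can be illustrated with a toy sketch. Every name, signal, and threshold below is invented for illustration and does not come from the patent:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the physiological/interaction data the
# abstract mentions; the patent does not specify any concrete schema.
@dataclass
class InteractionEvent:
    object_id: str          # object identified via scene understanding
    gaze_duration_s: float  # toy attention signal
    pupil_dilation: float   # toy physiological signal

def extract_features(events):
    """Aggregate per-object features from interaction events."""
    features = {}
    for e in events:
        f = features.setdefault(e.object_id, {"gaze": 0.0, "dilation": []})
        f["gaze"] += e.gaze_duration_s
        f["dilation"].append(e.pupil_dilation)
    return features

def retention_state(features, object_id, gaze_threshold=3.0):
    """Classify retention of the concept tied to one object (toy rule)."""
    f = features.get(object_id)
    if f is None:
        return "unknown"
    return "retained" if f["gaze"] >= gaze_threshold else "not_retained"

events = [
    InteractionEvent("whiteboard", 2.5, 0.4),
    InteractionEvent("whiteboard", 1.5, 0.6),
    InteractionEvent("plant", 0.5, 0.2),
]
feats = extract_features(events)
print(retention_state(feats, "whiteboard"))  # retained (4.0 s >= 3.0)
print(retention_state(feats, "plant"))       # not_retained
```

The real system would presumably use learned models over richer sensor data; the threshold rule here only stands in for the "determining, based on the features, a retention state" step.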
  • Publication number: 20240402807
    Abstract: Some implementations disclosed herein present a computer-generated reality (CGR) environment in which a user participates in an activity, identify a cognitive state of the user (e.g., working, learning, resting, etc.) based on data regarding the user's body (e.g., facial expressions, hand movements, physiological data, etc.), and update the environment with a surrounding environment that promotes a cognitive state of the user for the activity.
    Type: Application
    Filed: August 16, 2024
    Publication date: December 5, 2024
    Inventors: Izzet B. Yildiz, Mary A. Pyc, Ronald J. Guglielmone, Jr., Sterling R. Crispin, Mikaela D. Estep, Grant H. Mulliken
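The cognitive-state abstract above (which also covers the related grants and publications listed below) describes classifying a state from body data and updating the surrounding environment to suit the activity. A minimal sketch, with invented signals, thresholds, and environment presets that are not from the patent:

```python
def classify_cognitive_state(signals):
    """Toy classifier over body-data features (all thresholds invented)."""
    if signals.get("typing_rate", 0) > 1.0 and signals.get("heart_rate", 60) > 75:
        return "working"
    if signals.get("gaze_on_content", 0) > 0.7:
        return "learning"
    return "resting"

# Hypothetical mapping from cognitive state to a CGR surrounding preset.
PRESETS = {
    "working": "quiet_office",
    "learning": "library",
    "resting": "beach",
}

def surrounding_for(state):
    """Pick a surrounding environment that promotes the given state."""
    return PRESETS.get(state, "neutral_room")

state = classify_cognitive_state({"typing_rate": 2.0, "heart_rate": 80})
print(surrounding_for(state))  # quiet_office
```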
  • Patent number: 12093457
    Abstract: Some implementations disclosed herein present a computer-generated reality (CGR) environment in which a user participates in an activity, identify a cognitive state of the user (e.g., working, learning, resting, etc.) based on data regarding the user's body (e.g., facial expressions, hand movements, physiological data, etc.), and update the environment with a surrounding environment that promotes a cognitive state of the user for the activity.
    Type: Grant
    Filed: September 7, 2023
    Date of Patent: September 17, 2024
    Assignee: Apple Inc.
    Inventors: Izzet B. Yildiz, Mary A. Pyc, Ronald J. Guglielmone, Jr., Sterling R. Crispin, Mikaela D. Estep, Grant H. Mulliken
  • Publication number: 20240203276
    Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
    Type: Application
    Filed: April 6, 2022
    Publication date: June 20, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
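The audience-feedback abstract above maps data about audience members to virtual overlay objects shown alongside the slides. As a toy illustration (the data fields and overlay types here are invented, not from the patent):

```python
def audience_feedback_overlay(audience):
    """Turn per-member audience data into virtual overlay objects."""
    overlays = []
    avg_attention = sum(m["attention"] for m in audience) / len(audience)
    if avg_attention < 0.5:
        overlays.append({"type": "banner", "text": "Audience attention dropping"})
    for m in audience:
        if m.get("raised_hand"):
            overlays.append({"type": "highlight", "member": m["id"]})
    return overlays

audience = [
    {"id": "a1", "attention": 0.3, "raised_hand": False},
    {"id": "a2", "attention": 0.4, "raised_hand": True},
]
print(audience_feedback_overlay(audience))
```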
  • Publication number: 20240193858
    Abstract: In one implementation, a method of assisting in the rehearsal of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a difficulty level for a rehearsal of a presentation. The method includes displaying, on the display, one or more slides of the presentation. The method includes displaying, on the display in association with a volumetric environment, one or more virtual objects based on the difficulty level.
    Type: Application
    Filed: April 11, 2022
    Publication date: June 13, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
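The rehearsal abstract above selects virtual objects based on a difficulty level. A toy sketch of that idea, with invented difficulty tiers and behavior names:

```python
def rehearsal_distractors(difficulty):
    """Pick virtual audience behaviors for a rehearsal (invented tiers)."""
    tiers = {
        1: ["attentive_audience"],
        2: ["attentive_audience", "phone_checking_member"],
        3: ["attentive_audience", "phone_checking_member", "interrupting_question"],
    }
    return tiers.get(difficulty, tiers[1])  # fall back to the easiest tier

print(rehearsal_distractors(3))
```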
  • Publication number: 20240104864
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
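The focus-mode abstract above (shared by the two related publications below) reproduces the physical environment while replacing identified distracting objects with other content. A minimal sketch of that replacement step, with invented scene labels and replacement content:

```python
def render_focus_view(scene, distracting_labels, replacement="neutral_panel"):
    """Reproduce the physical scene, swapping distracting objects out.

    scene: object labels detected in the imaged physical environment
    distracting_labels: objects the focus mode should suppress
    """
    return [replacement if obj in distracting_labels else obj for obj in scene]

scene = ["desk", "tv", "window", "lamp"]
print(render_focus_view(scene, {"tv", "window"}))
# ['desk', 'neutral_panel', 'neutral_panel', 'lamp']
```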
  • Publication number: 20240104838
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20240104792
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20230418378
    Abstract: Some implementations disclosed herein present a computer-generated reality (CGR) environment in which a user participates in an activity, identify a cognitive state of the user (e.g., working, learning, resting, etc.) based on data regarding the user's body (e.g., facial expressions, hand movements, physiological data, etc.), and update the environment with a surrounding environment that promotes a cognitive state of the user for the activity.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Inventors: Izzet B. Yildiz, Mary A. Pyc, Ronald J. Guglielmone, Jr., Sterling R. Crispin, Mikaela D. Estep, Grant H. Mulliken
  • Patent number: 11782508
    Abstract: Some implementations disclosed herein present a computer-generated reality (CGR) environment in which a user participates in an activity, identify a cognitive state of the user (e.g., working, learning, resting, etc.) based on data regarding the user's body (e.g., facial expressions, hand movements, physiological data, etc.), and update the environment with a surrounding environment that promotes a cognitive state of the user for the activity.
    Type: Grant
    Filed: September 24, 2020
    Date of Patent: October 10, 2023
    Assignee: Apple Inc.
    Inventors: Izzet B. Yildiz, Mary A. Pyc, Ronald J. Guglielmone, Jr., Sterling R. Crispin, Mikaela D. Estep, Grant H. Mulliken
  • Publication number: 20210096646
    Abstract: Some implementations disclosed herein present a computer-generated reality (CGR) environment in which a user participates in an activity, identify a cognitive state of the user (e.g., working, learning, resting, etc.) based on data regarding the user's body (e.g., facial expressions, hand movements, physiological data, etc.), and update the environment with a surrounding environment that promotes a cognitive state of the user for the activity.
    Type: Application
    Filed: September 24, 2020
    Publication date: April 1, 2021
    Inventors: Izzet B. Yildiz, Mary A. Pyc, Ronald J. Guglielmone, Jr., Sterling R. Crispin, Mikaela D. Estep, Grant H. Mulliken