Patents by Inventor Christine GODWIN

Christine GODWIN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240115831
    Abstract: Various implementations disclosed herein include devices, systems, and methods that provide customized feedback content during a meditation experience. For example, an example process may include obtaining physiological data via one or more sensors, determining an attentive state based on the physiological data, customizing feedback content based on a user attribute to change the attentive state during the meditation mode, and providing the customized feedback content during the meditation mode after a delay time based on the user attribute.
    Type: Application
    Filed: December 20, 2023
    Publication date: April 11, 2024
    Inventors: Christine E. Welch, Alastair K. Fettes, Amy E. DeDonato, Christine Godwin, Dorian D. Dargan, Eric Landreneau, Gary I. Butcher, Grant H. Mulliken, Hugh A. Sider, Izzet B. Yildiz, James D. Dusseau, Jean-Francois St-Amour, Joanne Lee, Lucie Belanger, Michael B. Tucker, Philipp Rockel, Theodore Nestor Panagiotopoulos
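As a rough illustration only (not code from the patent), the feedback loop this abstract describes could be sketched as: sense physiological data, classify the attentive state, customize feedback from a user attribute, and deliver it after a user-specific delay. All names, thresholds, and attributes below are hypothetical.

```python
# Hypothetical sketch of the process in publication 20240115831:
# sensor data -> attentive state -> customized feedback -> delayed delivery.
from dataclasses import dataclass

@dataclass
class UserAttribute:
    experience_level: str   # e.g. "novice" or "experienced" (assumed attribute)
    delay_seconds: float    # preferred delay before feedback is presented

def classify_attentive_state(heart_rate: float, gaze_stability: float) -> str:
    """Toy stand-in for the physiological-data classification step."""
    if heart_rate < 70 and gaze_stability > 0.8:
        return "focused"
    return "distracted"

def customize_feedback(state: str, user: UserAttribute) -> str:
    """Select feedback content intended to change the attentive state."""
    if state == "distracted":
        return ("gentle breathing cue" if user.experience_level == "novice"
                else "brief refocus chime")
    return "no feedback needed"

def meditation_feedback(heart_rate: float, gaze_stability: float,
                        user: UserAttribute) -> tuple[str, str, float]:
    """Return the classified state, the feedback, and the delivery delay."""
    state = classify_attentive_state(heart_rate, gaze_stability)
    feedback = customize_feedback(state, user)
    return state, feedback, user.delay_seconds

state, feedback, delay = meditation_feedback(85.0, 0.4, UserAttribute("novice", 3.0))
```

The real system would of course derive these signals from head-mounted-display sensors rather than scalar arguments; the sketch only shows the data flow named in the abstract.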
  • Publication number: 20240104792
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20240104838
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
  • Publication number: 20240104864
    Abstract: Embodiments are directed to aspects of extended reality environments that are selected or otherwise modified to account for distracting stimuli. Similarly, one or more metrics that reflect, or are otherwise indicative of, a user's ability to focus on a current or upcoming activity may be used to adjust a user's interaction with the extended reality environment. The extended reality environment can be generated by an extended reality system that includes a head-mounted display, a set of sensors, and a processor configured to enter a focus mode that reduces distraction in an extended reality environment. While in the focus mode, the processor can receive imaging data of a physical environment around a user using the set of sensors and generate the extended reality environment that includes a reproduction of a first region of the physical environment where an identified object is replaced with additional content.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Grant H. Mulliken, Anura A. Patil, Brian Pasley, Christine Godwin, Eve Ekman, Jonathan Hadida, Lauren Cheung, Mary A. Pyc, Patrick O. Eronini, Raphael A. Bernier, Steven A. Marchette, Fletcher Rothkopf
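The focus-mode pipeline shared by the three abstracts above (publications 20240104792, 20240104838, and 20240104864) can be illustrated very loosely as: image a region of the physical environment, identify distracting objects in it, and reproduce the region with each distractor replaced by additional content. Everything below is a hypothetical sketch, not the patented implementation.

```python
# Hypothetical sketch of a focus mode: reproduce a region of the physical
# environment while swapping identified distractors for replacement content.
def identify_distractions(objects: list[str], distracting: set[str]) -> list[str]:
    """Flag objects in the imaged region that are known distractors."""
    return [obj for obj in objects if obj in distracting]

def render_focus_scene(objects: list[str], distracting: set[str],
                       replacement: str = "neutral panel") -> list[str]:
    """Reproduce the region, replacing each distractor with additional content."""
    return [replacement if obj in distracting else obj for obj in objects]

# Objects detected in the imaged region (hypothetical labels).
region = ["desk", "television", "plant", "phone"]
scene = render_focus_scene(region, distracting={"television", "phone"})
```

In a real system the "objects" would be segmented from the head-mounted display's imaging data and the replacement would be rendered content, but the substitution logic is the part the abstracts describe.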
  • Publication number: 20230282080
    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine an attentive state of a user based on physiological data associated with a physiological response of the user to an auditory stimulus. For example, an example process may include selecting an auditory stimulus based on a characteristic of an environment, presenting the auditory stimulus to a user, obtaining, using a sensor, first physiological data associated with a physiological response of the user to the auditory stimulus, and assessing an attentive state of the user based on the physiological response of the user to the auditory stimulus.
    Type: Application
    Filed: November 30, 2022
    Publication date: September 7, 2023
    Inventors: Grant H. MULLIKEN, Christine GODWIN, Izzet B. YILDIZ, Sterling R. CRISPIN
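The process in the abstract above could be sketched, purely as an illustration with hypothetical names and numbers, as: choose an auditory probe suited to the environment, present it, then assess attention from the physiological response to that probe.

```python
# Hypothetical sketch of publication 20230282080's process: select an
# auditory stimulus from an environment characteristic, then assess the
# attentive state from the user's physiological response to it.
def select_stimulus(ambient_noise_db: float) -> dict:
    """Pick a probe tone loud enough to be noticeable in this environment."""
    return {"frequency_hz": 440.0,
            "volume_db": ambient_noise_db + 10.0}  # stay above the noise floor

def assess_attention(pupil_response: float, threshold: float = 0.5) -> str:
    """Toy assessment: a strong response to the probe suggests attentiveness."""
    return "attentive" if pupil_response >= threshold else "inattentive"

stimulus = select_stimulus(ambient_noise_db=40.0)
state = assess_attention(pupil_response=0.7)
```

The specific signals (pupil response, noise floor) and threshold are assumptions for the sketch; the patent covers the sense-probe-assess structure, not these values.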
  • Publication number: 20230259203
    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine an attentive state of a user during an experience (e.g., visual and/or auditory content that could include a real-world physical environment, virtual content, or a combination of each) based on the user's gaze characteristic(s) to enhance the experience. For example, an example process may include obtaining physiological data associated with a gaze of a user during an experience, determining a gaze characteristic during a segment of the experience based on the obtained physiological data, and determining that the user has a first attentive state during the segment of the experience based on classifying the gaze characteristic of the user during the segment of the experience.
    Type: Application
    Filed: December 1, 2022
    Publication date: August 17, 2023
    Inventors: Grant H. MULLIKEN, Christine GODWIN, Izzet B. YILDIZ, Sterling R. CRISPIN
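As a final rough illustration (hypothetical names and thresholds, not the patented method), the gaze-based process above amounts to: compute a gaze characteristic over a segment of the experience, then classify the attentive state from that characteristic.

```python
# Hypothetical sketch of publication 20230259203's process: a gaze
# characteristic (here, dispersion of horizontal gaze samples) computed
# over a segment, classified into an attentive state.
from statistics import pstdev

def gaze_characteristic(gaze_x: list[float]) -> float:
    """Gaze dispersion over a segment: lower values mean a steadier gaze."""
    return pstdev(gaze_x)

def classify_segment(gaze_x: list[float], max_dispersion: float = 0.1) -> str:
    """A steady gaze during the segment is taken as a 'focused' state."""
    if gaze_characteristic(gaze_x) <= max_dispersion:
        return "focused"
    return "wandering"

# Normalized horizontal gaze positions sampled during one segment.
state = classify_segment([0.50, 0.51, 0.49, 0.50, 0.52])
```

Dispersion is only one possible gaze characteristic; the abstract leaves the characteristic and classifier open, so this picks the simplest concrete instance.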