Patents by Inventor Clayton Mosher

Clayton Mosher has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Brief illustrative sketches of the concepts described in the abstracts appear after the listing.

  • Patent number: 12223107
    Abstract: Provided is a system for controlling digital cinematic content based on the emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria, and multifactor feedback loops are created. A story line of the digital cinematic content and behavioral characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.
    Type: Grant
    Filed: November 1, 2023
    Date of Patent: February 11, 2025
    Assignee: Warner Bros. Entertainment Inc.
    Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
  • Publication number: 20240134454
    Abstract: Provided is a system for controlling digital cinematic content based on the emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria, and multifactor feedback loops are created. A story line of the digital cinematic content and behavioral characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.
    Type: Application
    Filed: November 1, 2023
    Publication date: April 25, 2024
    Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
  • Patent number: 11822719
    Abstract: Provided is a system for controlling digital cinematic content based on the emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria, and multifactor feedback loops are created. A story line of the digital cinematic content and behavioral characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.
    Type: Grant
    Filed: October 11, 2022
    Date of Patent: November 21, 2023
    Assignee: Warner Bros. Entertainment Inc.
    Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
  • Publication number: 20230047787
    Abstract: Provided is a system for controlling the progress of audio-video content based on sensor data of multiple users, a composite neuro-physiological state (CNS), and/or content engagement power (CEP). Sensor data is received from sensors positioned on an electronic device of a first user to sense neuro-physiological responses of the first user and of second users who are in the field of view (FOV) of the sensors. Based on the sensor data and at least one of a CNS value for a social interaction application and a CEP value for immersive content, recommendations of action items for the first user are predicted. Content of a feedback loop, created based on the sensor data, the CNS value, the CEP value, and the predicted recommendations, is rendered on an output unit of the electronic device during play of the social interaction application and/or the immersive content experience. Progress of the social interaction and immersive content experience is controlled by the first user based on the predicted recommendations.
    Type: Application
    Filed: October 11, 2022
    Publication date: February 16, 2023
    Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
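
The abstracts of patent 12223107 and the related filings above describe an emotion-driven narrative control loop: infer emotion indicators for the user and for computer-controlled characters, then steer the story line and character behavior toward a target emotional arc. The sketch below is a minimal, hypothetical illustration of that feedback-loop idea, not the patented implementation; every class, function, formula, and threshold in it is invented for illustration only.

```python
"""Minimal illustrative sketch (not the patented implementation) of an
emotion-driven narrative control loop, as described in the abstract of
patent 12223107. All names and formulas here are hypothetical."""

from dataclasses import dataclass


@dataclass
class EmotionState:
    """Simplified emotion indicators on a -1..1 valence/arousal scale."""
    valence: float = 0.0
    arousal: float = 0.0


@dataclass
class NarrativeController:
    """Steers story branches and character behavior toward a target emotional arc."""
    target_arc: list[EmotionState]  # desired user emotions over time
    step: int = 0

    def infer_character_emotions(self, user: EmotionState,
                                 characters: list[str]) -> dict[str, EmotionState]:
        # Hypothetical inference: mirror the user's emotion with damping.
        return {c: EmotionState(user.valence * 0.5, user.arousal * 0.5)
                for c in characters}

    def control(self, user: EmotionState, characters: list[str]) -> dict:
        """One pass of the multifactor feedback loop."""
        target = self.target_arc[min(self.step, len(self.target_arc) - 1)]
        valence_error = target.valence - user.valence
        arousal_error = target.arousal - user.arousal
        npc_emotions = self.infer_character_emotions(user, characters)
        self.step += 1
        return {
            # Pick a story branch and pacing that nudge the user toward the target arc.
            "story_branch": "uplifting" if valence_error > 0 else "tense",
            "pacing": "faster" if arousal_error > 0 else "slower",
            "character_emotions": npc_emotions,
        }


if __name__ == "__main__":
    controller = NarrativeController(
        target_arc=[EmotionState(0.2, 0.3), EmotionState(0.6, 0.7)])
    # The user's emotion indicators would normally come from biometric sensor data.
    decision = controller.control(EmotionState(valence=-0.1, arousal=0.4),
                                  characters=["mentor", "rival"])
    print(decision)
```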
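Publication 20230047787 similarly describes computing a composite neuro-physiological state (CNS) and content engagement power (CEP) from sensor data and using them to predict action-item recommendations in a feedback loop. The sketch below illustrates that flow under the same caveat: the aggregation formulas, thresholds, and names are assumptions made for illustration only.

```python
"""Minimal illustrative sketch (not the patented implementation) of deriving
hypothetical CNS and CEP values from sensor readings and mapping them to
action-item recommendations, as described in publication 20230047787."""

from statistics import mean


def composite_neuro_state(user_samples: list[dict[str, float]]) -> float:
    """Hypothetical CNS: average of normalized per-user arousal readings."""
    return mean(s["arousal"] for s in user_samples)


def content_engagement_power(attention: float, emotional_response: float) -> float:
    """Hypothetical CEP: a simple weighted blend of attention and emotional response."""
    return 0.6 * attention + 0.4 * emotional_response


def recommend_actions(cns: float, cep: float) -> list[str]:
    """Map CNS/CEP values to illustrative action items for the feedback loop."""
    actions = []
    if cep < 0.4:
        actions.append("offer a more immersive scene")
    if cns > 0.7:
        actions.append("suggest a calmer social interaction")
    return actions or ["continue current content"]


if __name__ == "__main__":
    # Sensor data would normally come from sensors on the first user's device
    # covering that user and other users in the sensors' field of view.
    samples = [{"arousal": 0.8}, {"arousal": 0.65}]
    cns = composite_neuro_state(samples)
    cep = content_engagement_power(attention=0.3, emotional_response=0.5)
    print(recommend_actions(cns, cep))
```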