Patents by Inventor Gary Lake-Schaal
Gary Lake-Schaal has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240134454
Abstract: Provided is a system for controlling digital cinematic content based on the emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria, and multifactor feedback loops are created. A story line of the digital cinematic content and behavioural characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.
Type: Application
Filed: November 1, 2023
Publication date: April 25, 2024
Inventors: Arvel CHAPPELL, III, Michael ZINK, Ha NGUYEN, Clayton MOSHER, Andrew VEEDER, Michael MARTINEZ, Shanshan SUN, Gary LAKE-SCHAAL
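One step of the multifactor feedback loop described above can be sketched as choosing the story branch that best closes the gap between a sensed emotional signal and the target arc. This is a minimal illustration, not the patented method: the branch names, the single "arousal" dimension, and the expected-shift values are all assumptions.

```python
def choose_branch(branches, target_arousal, sensed_arousal):
    """Pick the story branch whose expected emotional shift best steers
    the user toward the target arc (one iteration of a feedback loop).

    branches: dict mapping branch name -> expected arousal shift
    (illustrative values; a real system would learn or author these).
    """
    error = target_arousal - sensed_arousal  # how far the user is from the arc
    # Choose the branch whose expected shift most nearly cancels the error.
    return min(branches, key=lambda name: abs(error - branches[name]))
```

For example, a user sensed well below the target arc would be routed to the branch expected to raise arousal, and vice versa.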
-
Patent number: 11964204
Abstract: A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game, or the VR experience.
Type: Grant
Filed: March 1, 2021
Date of Patent: April 23, 2024
Assignee: Warner Bros. Entertainment Inc.
Inventors: Greg Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis S. Ostrover, Michael Smith
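The idea of driving a character parameter from a function of state variables and cross-game events can be sketched as follows. Everything here is a hypothetical illustration, assuming invented state fields (`time_of_day`, `active_players`, `async_events`) and arbitrary weights; the patent does not specify this formula.

```python
from dataclasses import dataclass, field

@dataclass
class WorldState:
    """Hypothetical snapshot of the shared 3D environment."""
    time_of_day: float                              # hours, 0.0-24.0
    active_players: int                             # synchronous users present
    async_events: list = field(default_factory=list)  # events from a second game

def character_aggression(state: WorldState) -> float:
    """Compute a characteristic parameter (here, 'aggression' in [0, 1])
    as a function of current state variables, synchronous-user feedback,
    and an asynchronous event in a second game. Weights are illustrative."""
    level = 0.2                                       # baseline
    level += 0.05 * state.active_players              # more players, bolder character
    level += 0.3 * ("raid_won" in state.async_events) # cross-game event bonus
    if state.time_of_day >= 20.0:                     # nighttime behavior shift
        level += 0.1
    return min(level, 1.0)                            # clamp to valid range
```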
-
Patent number: 11911693
Abstract: A computer-implemented method for providing electronic games for play by a group of users in two or more moving vehicles. The method includes maintaining data structures of media program data, user profile data and vehicle profile data, receiving user and vehicle state information, identifying a group of users based on contemporaneous presence in two or more vehicles or common participation in a game or other group experience for related trips at different times, and selecting, configuring or creating a media program for play at media players. An apparatus or system is configured to perform the method and related operations.
Type: Grant
Filed: February 21, 2020
Date of Patent: February 27, 2024
Assignee: WARNER BROS. ENTERTAINMENT INC.
Inventors: Gary Lake-Schaal, Pamela J. Allison, Lewis S. Ostrover, Gregory I. Gewickey, Ha Nguyen, Prashant Abhyankar
-
Patent number: 11826653
Abstract: A game modification engine modifies configuration settings affecting game play and the user experience in computer games after initial publication of the game, based on device-level and game-play data associated with a user or cohort of users and on machine-learned relationships between input data and a use metric for the game. The modification is selected to improve performance of the game as measured by the use metric. The modification may be tailored for a user cohort. The game modification engine may define the cohort automatically based on correlations discovered in the input data relative to a defined use metric.
Type: Grant
Filed: April 10, 2020
Date of Patent: November 28, 2023
Assignee: WARNER BROS. ENTERTAINMENT INC.
Inventors: Gary Lake-Schaal, Lewis S. Ostrover, Matthew Huard, Adam Husein
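The selection step the abstract describes, picking the configuration change predicted to best improve a use metric, can be sketched like this. The `predict_metric` callable stands in for the machine-learned relationship between input data and the use metric; the function and all names are illustrative assumptions, not the patented implementation.

```python
def pick_modification(candidates, cohort_features, predict_metric):
    """Choose the candidate config change with the largest predicted
    improvement in the use metric for a given cohort.

    candidates:      list of config-change dicts (hypothetical format)
    cohort_features: input data describing the user cohort
    predict_metric:  learned model mapping (features, change) -> metric
    """
    baseline = predict_metric(cohort_features, {})   # metric with no change
    best, best_gain = None, 0.0
    for change in candidates:
        gain = predict_metric(cohort_features, change) - baseline
        if gain > best_gain:
            best, best_gain = change, gain
    return best  # None if nothing beats the current settings
```

Returning `None` when no candidate improves on the baseline keeps the published configuration untouched by default.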
-
Patent number: 11822719
Abstract: Provided is a system for controlling digital cinematic content based on the emotional state of characters. A focus on one or more computer-controlled characters appearing in digital cinematic content is determined based on emotion indicators of a first user actively interacting with at least the one or more computer-controlled characters. A set of emotion indicators is inferred for each of the one or more computer-controlled characters based on one or more criteria, and multifactor feedback loops are created. A story line of the digital cinematic content and behavioural characteristics of the one or more computer-controlled characters are controlled to achieve a target emotional arc of the first user based on the multifactor feedback loops.
Type: Grant
Filed: October 11, 2022
Date of Patent: November 21, 2023
Assignee: WARNER BROS. ENTERTAINMENT INC.
Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
-
Publication number: 20230047787
Abstract: Provided is a system for controlling progress of audio-video content based on sensor data of multiple users, a composite neuro-physiological state (CNS), and/or content engagement power (CEP). Sensor data is received from sensors positioned on an electronic device of a first user to sense neuro-physiological responses of the first user and second users that are in the field-of-view (FOV) of the sensors. Based on the sensor data and at least one of a CNS value for a social interaction application and a CEP value for immersive content, recommendations of action items for the first user are predicted. Content of a feedback loop, created based on the sensor data, CNS value, CEP value, and predicted recommendations, is rendered on an output unit of the electronic device during play of the at least one of the social interaction application and immersive content experience. Progress of the social interaction and immersive content experience is controlled by the first user based on the predicted recommendations.
Type: Application
Filed: October 11, 2022
Publication date: February 16, 2023
Inventors: Arvel Chappell, III, Michael Zink, Ha Nguyen, Clayton Mosher, Andrew Veeder, Michael Martinez, Shanshan Sun, Gary Lake-Schaal
-
Publication number: 20220347567
Abstract: A computer-implemented method for providing electronic games for play by a group of users in two or more moving vehicles. The method includes maintaining data structures of media program data, user profile data and vehicle profile data, receiving user and vehicle state information, identifying a group of users based on contemporaneous presence in two or more vehicles or common participation in a game or other group experience for related trips at different times, and selecting, configuring or creating a media program for play at media players. An apparatus or system is configured to perform the method and related operations.
Type: Application
Filed: February 21, 2020
Publication date: November 3, 2022
Applicant: WARNER BROS. ENTERTAINMENT INC.
Inventors: Gary Lake-Schaal, Pamela J. Allison, Lewis S. Ostrover, Gregory I. Gewickey, Ha Nguyen, Prashant Abhyankar
-
Publication number: 20220303607
Abstract: Systems and computer-implemented methods are disclosed for providing social entertainment experiences in a moving vehicle via an apparatus that simulates human social behavior relevant to a journey undertaken by the vehicle, for displaying human-perceivable exterior communication on the moving vehicle to neighboring vehicles and/or pedestrians, and for providing a modular travel experience.
Type: Application
Filed: April 4, 2022
Publication date: September 22, 2022
Applicant: WARNER BROS. ENTERTAINMENT INC.
Inventors: Gregory I. Gewickey, Ha Nguyen, Rana Brahma, Prashant Abhyankar, Nicole Schoepf, Genevieve Morris, Michael Zink, Michael Smith, Gary Lake-Schaal
-
Publication number: 20210252397
Abstract: A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game, or the VR experience.
Type: Application
Filed: March 1, 2021
Publication date: August 19, 2021
Inventors: Gregory I. Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis S. Ostrover, Michael Smith
-
Patent number: 10933324
Abstract: A sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for at least one of the VR content or the AR content.
Type: Grant
Filed: March 1, 2018
Date of Patent: March 2, 2021
Assignee: Warner Bros. Entertainment Inc.
Inventors: Greg Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis Ostrover, Michael Smith
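The geometry behind an eye convergence distance, and easing the virtual camera's focus toward it, can be sketched as below. This assumes simple symmetric-vergence trigonometry and an invented smoothing factor; the patent's actual sensing and adjustment method is not specified here.

```python
import math

def convergence_distance(ipd_m: float, vergence_deg: float) -> float:
    """Estimate the distance (in meters) at which the two gaze rays cross,
    given the interpupillary distance and the total vergence angle.

    Symmetric-vergence geometry: half the IPD over the tangent of half
    the vergence angle. A headset eye tracker would supply the angle."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

def smoothed_focus(current_focus: float, target: float, alpha: float = 0.2) -> float:
    """Ease the virtual camera's focus distance toward the converged
    distance to avoid abrupt focus jumps (alpha is an assumed constant)."""
    return current_focus + alpha * (target - current_focus)
```

With a typical 64 mm IPD, a 2-degree vergence angle corresponds to a convergence distance of roughly 1.8 m under this model.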
-
Publication number: 20200405212
Abstract: Applications for a Composite Neuro-physiological State (CNS) value include using the value as an indicator of participant emotional state in computer games and other social interaction applications. The CNS is computed based on biometric sensor data processed to express player engagement with content, game play, and other participants along multiple dimensions such as valence, arousal, and dominance. An apparatus is configured to perform the method using hardware, firmware, and/or software.
Type: Application
Filed: July 7, 2020
Publication date: December 31, 2020
Inventors: Arvel A. Chappell, III, Lewis S. Ostrover, Gary Lake-Schaal
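Folding multi-dimensional biometric readings into one composite value might look like the sketch below. The weighted average and the particular weights are illustrative assumptions; the publication does not disclose the CNS formula here.

```python
def composite_neuro_state(samples, weights=(0.4, 0.4, 0.2)):
    """Collapse per-sample (valence, arousal, dominance) readings into a
    single composite value via a weighted average.

    samples: sequence of (valence, arousal, dominance) tuples derived
    from biometric sensor data (hypothetical preprocessing assumed).
    """
    if not samples:
        return 0.0
    wv, wa, wd = weights
    total = sum(wv * v + wa * a + wd * d for v, a, d in samples)
    return total / len(samples)  # mean composite over the window
```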
-
Publication number: 20200384367
Abstract: A game modification engine modifies configuration settings affecting game play and the user experience in computer games after initial publication of the game, based on device-level and game-play data associated with a user or cohort of users and on machine-learned relationships between input data and a use metric for the game. The modification is selected to improve performance of the game as measured by the use metric. The modification may be tailored for a user cohort. The game modification engine may define the cohort automatically based on correlations discovered in the input data relative to a defined use metric.
Type: Application
Filed: April 10, 2020
Publication date: December 10, 2020
Inventors: Gary Lake-Schaal, Lewis S. Ostrover, Matthew Huard, Adam Husein
-
Patent number: 10657727
Abstract: An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user and includes one or more sensors positioned to detect actions performed by the user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the AR output or the VR output, based on output from the one or more sensors indicating actions performed by the user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or stereoscopic 3D output.
Type: Grant
Filed: February 15, 2019
Date of Patent: May 19, 2020
Assignee: WARNER BROS. ENTERTAINMENT INC.
Inventors: Christopher DeFaria, Piotr Mintus, Gary Lake-Schaal, Lewis Ostrover
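Pacing scripted events from sensed user activity, as the abstract describes, can be sketched as a simple controller: an active user gets the next narrative event sooner, an idle one gets more time. The linear mapping and clamp bounds are assumptions for illustration only.

```python
def next_event_delay(base_delay_s: float, user_action_rate: float) -> float:
    """Return seconds to wait before the next scripted narrative event.

    base_delay_s:     authored default gap between events
    user_action_rate: sensed actions per second from the headset sensors
    Higher activity shortens the gap; the result is clamped so the
    narrative never stalls or races (bounds are illustrative)."""
    delay = base_delay_s / (1.0 + user_action_rate)
    return max(1.0, min(delay, 2.0 * base_delay_s))
```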
-
Publication number: 20190251751
Abstract: An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user and includes one or more sensors positioned to detect actions performed by the user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the AR output or the VR output, based on output from the one or more sensors indicating actions performed by the user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or stereoscopic 3D output.
Type: Application
Filed: February 15, 2019
Publication date: August 15, 2019
Inventors: Christopher DeFaria, Piotr Mintus, Gary Lake-Schaal, Lewis Ostrover
-
Patent number: 10249091
Abstract: An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user and includes one or more sensors positioned to detect actions performed by the user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the AR output or the VR output, based on output from the one or more sensors indicating actions performed by the user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or stereoscopic 3D output.
Type: Grant
Filed: October 7, 2016
Date of Patent: April 2, 2019
Assignee: WARNER BROS. ENTERTAINMENT INC.
Inventors: Christopher DeFaria, Piotr Mintus, Gary Lake-Schaal, Lewis Ostrover
-
Patent number: 10213688
Abstract: A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game, or the VR experience. In another aspect, a sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for at least one of the VR content or the AR content.
Type: Grant
Filed: September 1, 2016
Date of Patent: February 26, 2019
Assignee: WARNER BROS. ENTERTAINMENT, INC.
Inventors: Greg Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis Ostrover, Michael Smith
-
Publication number: 20180256978
Abstract: A sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for at least one of the VR content or the AR content.
Type: Application
Filed: March 1, 2018
Publication date: September 13, 2018
Inventors: Greg Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis Ostrover, Michael Smith
-
Publication number: 20170103576
Abstract: An augmented reality (AR) output device or virtual reality (VR) output device is worn by a user and includes one or more sensors positioned to detect actions performed by the user of the immersive output device. A processor provides a data signal configured for the AR or VR output device, causing the immersive output device to provide AR output or VR output via a stereographic display device. The data signal encodes audio-video data. The processor controls a pace of scripted events defined by a narrative in the AR output or the VR output, based on output from the one or more sensors indicating actions performed by the user of the AR or VR output device. The audio-video data may be packaged in a non-transitory computer-readable medium with additional content that is coordinated with the defined narrative and is configured for providing an alternative output, such as 2D video output or stereoscopic 3D output.
Type: Application
Filed: October 7, 2016
Publication date: April 13, 2017
Inventors: Christopher DeFaria, Piotr Mintus, Gary Lake-Schaal, Lewis Ostrover
-
Publication number: 20170061704
Abstract: A processor provides a simulated three-dimensional (3D) environment for a game or virtual reality (VR) experience, including controlling a characteristic parameter of a 3D object or character based on at least one of: an asynchronous event in a second game, feedback from multiple synchronous users of the VR experience, or a function driven by one or more variables reflecting a current state of at least one of the 3D environment, the game, or the VR experience. In another aspect, a sensor coupled to an AR/VR headset detects an eye convergence distance. A processor adjusts a focus distance for a virtual camera that determines rendering of a three-dimensional (3D) object for a display device of the headset, based on at least one of the eye convergence distance or a directed focus of attention for at least one of the VR content or the AR content.
Type: Application
Filed: September 1, 2016
Publication date: March 2, 2017
Inventors: Greg Gewickey, Gary Lake-Schaal, Piotr Mintus, Lewis Ostrover, Michael Smith