Patents by Inventor Elizabeth Juenger
Elizabeth Juenger has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250121290
Abstract: A method includes monitoring in a gaming session game play of a first player playing a game on a device of a first platform, and monitoring game play of a second player playing the game on a device of a second platform. A first valuation of a performance metric measuring gaming effectiveness is determined for the game play by the first player. A second valuation of the performance metric is determined for the game play by the second player. A difference between the first valuation and the second valuation does not satisfy a threshold band of operation that is based on global skill levels of the first and second players. The game play of the first player is augmented to modify the first valuation so that the threshold band of operation is satisfied in order to normalize effectiveness of playing the video game between the first and second players.
Type: Application
Filed: October 12, 2023
Publication date: April 17, 2025
Inventors: Lee Kirsten Gould, Elizabeth Juenger
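For illustration only, a minimal sketch of the comparison loop the abstract describes: two valuations of a performance metric are compared against a threshold band, and one player's valuation is augmented until the band is satisfied. The band width, step size, and example values are hypothetical and not drawn from the filing.

```python
def normalize_effectiveness(valuation_1: float, valuation_2: float,
                            band_width: float, boost_step: float = 0.05,
                            max_steps: int = 100) -> float:
    """Illustrative only: augment player 1's valuation until the difference
    between the two players falls inside the threshold band of operation."""
    augmented = valuation_1
    for _ in range(max_steps):
        if abs(augmented - valuation_2) <= band_width:
            break  # threshold band of operation is satisfied
        # Nudge the augmented valuation toward the other player's valuation.
        augmented += boost_step if augmented < valuation_2 else -boost_step
    return augmented

# Example: a gamepad player (0.42) assisted toward a mouse-and-keyboard player (0.61)
print(normalize_effectiveness(valuation_1=0.42, valuation_2=0.61, band_width=0.10))
```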
-
Patent number: 12274940
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the directional sound type of the indicated directional sound is generated.
Type: Grant
Filed: April 17, 2024
Date of Patent: April 15, 2025
Assignees: Sony Interactive Entertainment LLC, Sony Interactive Entertainment Inc.
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
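A rough sketch of the gaze check described above: if the tracked point of focus does not move toward the directional sound's source location, an indicator for that sound type is generated. The vector math, alignment threshold, and coordinates are assumptions for the sketch, not the patented implementation.

```python
import math

def should_generate_indicator(focus_before, focus_after, sound_source,
                              alignment_threshold: float = 0.5) -> bool:
    """Return True when the player's gaze did not move toward the sound source,
    suggesting a visual effect indicator for the directional sound is needed."""
    gaze_shift = [a - b for a, b in zip(focus_after, focus_before)]
    to_source = [s - b for s, b in zip(sound_source, focus_before)]
    norm = math.hypot(*gaze_shift) * math.hypot(*to_source)
    if norm == 0:
        return True  # gaze did not move at all
    cosine = sum(g * t for g, t in zip(gaze_shift, to_source)) / norm
    return cosine < alignment_threshold

# Gaze drifted left while the sound source sits ahead and to the right -> True
print(should_generate_indicator((0, 0, 0), (-1, 0, 0), (5, 0, 2)))
```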
-
Publication number: 20250058229
Abstract: A method and system for assisting a player is disclosed. A gameplay of a user of media content that is associated with a game type is received. A skill involved in the gameplay of the user is determined based on the gameplay of the user and the game type. A skill level of the user is determined based on the gameplay of the user. A training curriculum is provided to the user based on the skill and the skill level of the user upon detecting a triggering event.
Type: Application
Filed: October 31, 2024
Publication date: February 20, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Sarah Karp, Jennifer Sheldon
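A toy sketch of the flow in this abstract: derive a skill and skill level from gameplay data and the game type, then surface a training curriculum when a triggering event is detected. The skill names, levels, thresholds, and curricula below are hypothetical placeholders.

```python
# Hypothetical curricula keyed by (skill, skill level); not taken from the filing.
CURRICULA = {
    ("aiming", "beginner"): ["crosshair placement drills", "tracking practice"],
    ("aiming", "intermediate"): ["flick-shot drills", "recoil control practice"],
}

def provide_training(gameplay_stats: dict, game_type: str, triggering_event: bool):
    """Return a curriculum for the detected skill and skill level, or None."""
    if not triggering_event:
        return None
    # Determine the relevant skill from the game type and gameplay data.
    skill = "aiming" if game_type == "shooter" else "resource management"
    level = "beginner" if gameplay_stats.get("accuracy", 0.0) < 0.3 else "intermediate"
    return CURRICULA.get((skill, level))

print(provide_training({"accuracy": 0.22}, "shooter", triggering_event=True))
```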
-
Publication number: 20250058222
Abstract: Methods and systems for cooperative or coached gameplay in virtual environments are disclosed. Memory may store a content control profile regarding a set of control inputs associated with an action in a virtual environment of a digital content title. A request may be received from a set of one or more users associated with different source devices regarding cooperative gameplay of the digital content title. At least one virtual avatar may be generated for an interactive session of the digital content title in response to the request. A plurality of control inputs may be received from the plurality of different source devices and combined into a combination set of control inputs. Generating the combination set of control inputs may be based on the content control profile. Virtual actions associated with the virtual avatar may be controlled within the virtual environment in accordance with the combination set of control inputs.
Type: Application
Filed: October 31, 2024
Publication date: February 20, 2025
Inventors: Olga Rudi, Kristine Young, Mahdi Azmandian, Jin Zhang, Sarah Karp, Sepideh Karimi, Elizabeth Juenger
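Below is a minimal sketch of the input-combination step described above: inputs from several source devices are merged into one combination set according to a content control profile. The profile format and the merge rule (per-control filtering, later devices overriding earlier ones) are assumptions for illustration.

```python
def combine_inputs(control_profile: dict, inputs_by_device: dict) -> dict:
    """Merge control inputs from multiple source devices into one combination
    set, keeping only the controls the profile allows for each device."""
    combined = {}
    for device, inputs in inputs_by_device.items():
        allowed = control_profile.get(device, set())
        for control, value in inputs.items():
            if control in allowed:
                combined[control] = value  # later devices override earlier ones
    return combined

# Hypothetical profile: the coach may only steer the camera; the player moves and jumps.
profile = {"coach": {"camera"}, "player": {"move", "jump"}}
inputs = {"player": {"move": "forward", "jump": True}, "coach": {"camera": "pan_left"}}
print(combine_inputs(profile, inputs))
# {'move': 'forward', 'jump': True, 'camera': 'pan_left'}
```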
-
Publication number: 20250056091
Abstract: Systems and methods for automated visual trigger profiling and detection within virtual environments are provided. A visual trigger profile may be stored in memory that includes a set of visual trigger characteristics associated with a type of visual sensitivity. Buffered frames of an audiovisual stream that have not yet been displayed may be monitored to identify when a buffered frame includes a threshold level of the visual trigger characteristics associated with the visual sensitivity. A frame modification that decreases the level of the detected visual trigger characteristics associated with the visual sensitivity may be identified and applied to the identified frames. The modified frames may thereafter be presented during the audiovisual stream in place of the original (unmodified) identified frames.
Type: Application
Filed: October 24, 2024
Publication date: February 13, 2025
Inventors: Celeste Bean, Kristie Ramirez, Elizabeth Juenger, Steven Osman, Olga Rudi
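An illustrative sketch of the buffered-frame check in this abstract: each frame that has not yet been displayed is scored against the visual trigger characteristics in the stored profile and replaced with a modified version when a threshold is exceeded. The scorer (max pixel intensity) and modifier (halving intensities) are placeholders.

```python
def filter_buffered_frames(frames, trigger_score, modify, threshold: float = 0.8):
    """Yield frames for display, replacing any frame whose visual-trigger score
    meets the threshold with a modified (e.g., dimmed) version."""
    for frame in frames:
        if trigger_score(frame) >= threshold:
            yield modify(frame)  # present the modified frame in place of the original
        else:
            yield frame

# Hypothetical frames as lists of pixel intensities in [0, 1].
frames = [[0.2, 0.3], [0.95, 0.9], [0.4, 0.1]]
out = list(filter_buffered_frames(frames, trigger_score=max,
                                  modify=lambda f: [p * 0.5 for p in f]))
print(out)  # [[0.2, 0.3], [0.475, 0.45], [0.4, 0.1]]
```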
-
Publication number: 20250041720
Abstract: Systems and methods for dynamic content display generation for multiple device setups are provided. Supplemental content files associated with an interactive content title may be stored in memory in association with parameters specific to different types of user devices. A current interactive session associated with a first device may be monitored and determined to include a virtual environment presentation associated with the interactive content title. A type of a secondary device associated with the user may be identified. A supplemental content package may be generated for the secondary device based on a selected set of supplemental content and a set of the parameters associated with the type of the secondary device. The supplemental content package may be provided to the secondary device during the current interactive session, and the secondary device may generate a presentation of the selected set of supplemental content files in accordance with the set of parameters.
Type: Application
Filed: August 3, 2023
Publication date: February 6, 2025
Inventors: Ellana Fortuna, Kristie Ramirez, Elizabeth Juenger, Victoria Dorn, Celeste Bean, Bethany Tinklenberg, Mahdi Azmandian, Sarah Karp, Ritagya Meharishi, Xi Zhou
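A small sketch of the packaging step described above: selected supplemental content files are bundled with the presentation parameters stored for the secondary device's type. The device types, parameter keys, and file names are hypothetical.

```python
# Hypothetical per-device-type presentation parameters; not from the filing.
DEVICE_PARAMETERS = {
    "tablet":     {"resolution": "1920x1200", "layout": "split_view"},
    "smartwatch": {"resolution": "396x484",   "layout": "glanceable"},
}

def build_supplemental_package(selected_files: list, secondary_device_type: str) -> dict:
    """Bundle the selected supplemental content with parameters for the device type."""
    params = DEVICE_PARAMETERS.get(secondary_device_type, {})
    return {"content": selected_files, "parameters": params}

package = build_supplemental_package(["map_overlay.png", "quest_log.json"], "tablet")
print(package["parameters"]["layout"])  # split_view
```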
-
Publication number: 20250036102
Abstract: A method for generating a physical object is provided, including: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to identify a virtual object depicted in the 2D gameplay video; further analyzing the 2D gameplay video to determine 3D geometry of the virtual object; using the 3D geometry of the object to generate a 3D model of the virtual object; storing the 3D model to a user account; using the 3D model to generate a physical object resembling the virtual object.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
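The abstract lays out a pipeline: capture, object identification, 3D geometry estimation, model creation, storage, fabrication. The skeleton below mirrors that sequence only; every helper is a trivial stand-in for an unspecified vision or fabrication step, and none of the names come from the patent.

```python
# Each helper is a trivial stand-in for an unspecified vision or fabrication step.
def capture_gameplay_video(path):       return ["frame_0", "frame_1"]
def identify_virtual_object(frames):    return "player_vehicle"
def estimate_3d_geometry(frames, obj):  return {"vertices": [], "object": obj}
def build_3d_model(geometry):           return {"mesh": geometry, "format": "obj"}
def send_to_fabrication(model):         return f"print job queued for {model['mesh']['object']}"

def generate_physical_object(session_video_path: str, user_account: dict) -> str:
    """Capture 2D gameplay, reconstruct the object's 3D geometry, store the model
    to the user account, and hand the model off for physical fabrication."""
    frames = capture_gameplay_video(session_video_path)
    virtual_object = identify_virtual_object(frames)
    geometry = estimate_3d_geometry(frames, virtual_object)
    model = build_3d_model(geometry)
    user_account.setdefault("models", []).append(model)  # store to user account
    return send_to_fabrication(model)

print(generate_physical_object("session.mp4", user_account={}))
```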
-
Publication number: 20250037362
Abstract: A method for generating a three-dimensional (3D) content moment from a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to determine 3D geometry of a scene depicted in the 2D gameplay video; using the 3D geometry of the scene to generate a 3D still asset of a moment that occurred in the gameplay video; storing the 3D still asset to a user account.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
-
Publication number: 20250032914
Abstract: A method for generating a three-dimensional (3D) content moment from a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to determine 3D geometry of a scene depicted in the 2D gameplay video; using the 3D geometry of the scene to generate a 3D video asset of a moment that occurred in the gameplay video; storing the 3D video asset to a user account.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
-
Publication number: 20250032915
Abstract: A method for generating a view of an event in a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to identify an event occurring in a scene depicted in the 2D gameplay video and identifying one or more elements involved in said event; further analyzing the 2D gameplay video to determine 3D geometry of the scene; using the 3D geometry of the scene to generate a 3D video asset of the event that occurred in the gameplay video; generating a 2D view of the 3D video asset for presentation on a display, wherein generating said 2D view includes determining a field of view (FOV) to apply for the 2D view, the FOV being configured to include the elements involved in the event.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
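The distinguishing step in this entry is choosing a field of view that keeps every element involved in the event on screen. A geometric sketch of one way to do that is below, using a single camera position and the angular spread of the element positions; the coordinates, margin, and 2D simplification are assumptions for illustration.

```python
import math

def fov_to_include(camera_pos, element_positions, margin_deg: float = 10.0) -> float:
    """Return a horizontal FOV (degrees) wide enough to include every element
    involved in the event, plus a small margin on each side."""
    angles = [math.degrees(math.atan2(y - camera_pos[1], x - camera_pos[0]))
              for x, y in element_positions]
    spread = max(angles) - min(angles)
    return spread + 2 * margin_deg

# Camera at the origin; two characters and a projectile involved in the event.
print(fov_to_include((0.0, 0.0), [(10.0, 2.0), (10.0, -3.0), (8.0, 0.5)]))  # ~48 degrees
```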
-
Patent number: 12157062
Abstract: A method and system for assisting a player is disclosed. A gameplay of a user of media content that is associated with a game type is received. A skill involved in the gameplay of the user is determined based on the gameplay of the user and the game type. A skill level of the user is determined based on the gameplay of the user. A training curriculum is provided to the user based on the skill and the skill level of the user upon detecting a triggering event.
Type: Grant
Filed: December 13, 2022
Date of Patent: December 3, 2024
Assignee: SONY INTERACTIVE ENTERTAINMENT LLC
Inventors: Ellana Fortuna, Elizabeth Juenger, Sarah Karp, Jennifer Sheldon
-
Patent number: 12160634
Abstract: Systems and methods for automated visual trigger profiling and detection within virtual environments are provided. A visual trigger profile may be stored in memory that includes a set of visual trigger characteristics associated with a type of visual sensitivity. Buffered frames of an audiovisual stream that have not yet been displayed may be monitored to identify when a buffered frame includes a threshold level of the visual trigger characteristics associated with the visual sensitivity. A frame modification that decreases the level of the detected visual trigger characteristics associated with the visual sensitivity may be identified and applied to the identified frames. The modified frames may thereafter be presented during the audiovisual stream in place of the original (unmodified) identified frames.
Type: Grant
Filed: May 31, 2022
Date of Patent: December 3, 2024
Assignees: SONY INTERACTIVE ENTERTAINMENT LLC, SONY INTERACTIVE ENTERTAINMENT INC.
Inventors: Celeste Bean, Kristie Ramirez, Elizabeth Juenger, Steve Osman, Olga Rudi
-
Patent number: 12151167
Abstract: Methods of the present disclosure may collect data when a user plays one or more different types of games when determinations are made as to whether the difficulty of a game should be changed. The collected data may be evaluated to identify whether a user gaming performance level corresponds to an expected level of performance. When the user gaming performance level does not correspond to an expected level of performance, parameters that change the difficulty of the game may be changed automatically. Parameters that relate to movement speed, delay or hesitation, character strengths, numbers of competitors, or other metrics may be changed incrementally until a current user performance level corresponds to an expectation level of a particular user currently playing the game. At this time, the user expectation level may be changed, and the process may be repeated as skills of the user are developed over time.
Type: Grant
Filed: May 31, 2022
Date of Patent: November 26, 2024
Assignees: SONY INTERACTIVE ENTERTAINMENT LLC, SONY INTERACTIVE ENTERTAINMENT INC.
Inventors: Victoria Dorn, Celeste Bean, Elizabeth Juenger, Kristie Ramirez, Sepideh Karimi, Mahdi Azmandian
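A compact sketch of the incremental adjustment loop described above: while measured performance misses the expected level, one difficulty parameter is nudged and performance is re-measured. The parameter name, step size, tolerance, and measurement function are invented for the example.

```python
def adjust_difficulty(params: dict, measure_performance, expected: float,
                      tolerance: float = 0.05, step: float = 0.1,
                      max_rounds: int = 20) -> dict:
    """Nudge a difficulty parameter until measured performance is near expectation."""
    for _ in range(max_rounds):
        performance = measure_performance(params)
        if abs(performance - expected) <= tolerance:
            break
        # Player doing worse than expected -> slow opponents down; better -> speed them up.
        direction = 1 if performance < expected else -1
        params["opponent_speed"] = max(0.1, params["opponent_speed"] - direction * step)
    return params

# Hypothetical measurement: the player performs better when opponents are slower.
measure = lambda p: 1.0 - p["opponent_speed"]
print(adjust_difficulty({"opponent_speed": 0.9}, measure, expected=0.5))
```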
-
Patent number: 12145065
Abstract: Methods and systems for cooperative or coached gameplay in virtual environments are disclosed. Memory may store a content control profile regarding a set of control inputs associated with an action in a virtual environment of a digital content title. A request may be received from a set of one or more users associated with different source devices regarding cooperative gameplay of the digital content title. At least one virtual avatar may be generated for an interactive session of the digital content title in response to the request. A plurality of control inputs may be received from the plurality of different source devices and combined into a combination set of control inputs. Generating the combination set of control inputs may be based on the content control profile. Virtual actions associated with the virtual avatar may be controlled within the virtual environment in accordance with the combination set of control inputs.
Type: Grant
Filed: May 31, 2022
Date of Patent: November 19, 2024
Assignees: SONY INTERACTIVE ENTERTAINMENT LLC, SONY INTERACTIVE ENTERTAINMENT INC.
Inventors: Olga Rudi, Kristine Young, Mahdi Azmandian, Jin Zhang, Sarah Karp, Sepideh Karimi, Elizabeth Juenger
-
Publication number: 20240269559
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the directional sound type of the indicated directional sound is generated.
Type: Application
Filed: April 17, 2024
Publication date: August 15, 2024
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Patent number: 12059621
Abstract: Systems and methods for determining excessive motions or strained positions based on inputs associated with gameplay of game titles are provided. A game intervention server may evaluate, based on learning models, posture and physical motions of players for repetitive, unbalanced, or excessive motions, as well as gameplay quality patterns, and compare them to thresholds for identifying unhealthy conditions. The game intervention server may make recommendations regarding breaks, stretches, warm-up/cool-down, curbing extended periods of play, etc. Notifications may be overlaid on screen with an option to pause play without exiting the game session. In-game events and requirements may also be adjusted based on learned insights to avoid excessive movement or counteract unbalanced movement.
Type: Grant
Filed: March 1, 2023
Date of Patent: August 13, 2024
Assignee: SONY INTERACTIVE ENTERTAINMENT LLC
Inventors: Jennifer Sheldon, Elizabeth Juenger, Sarah Karp
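A rough sketch of the intervention check in this abstract: motion readings over a session are compared against a threshold and, when exceeded, a break recommendation is returned for overlay with an option to pause without exiting. The scoring, threshold, and notification text are placeholders standing in for the learning models the patent describes.

```python
def check_for_intervention(motion_samples, repetitive_threshold: float = 0.75):
    """Return an overlay notification when repetitive/excessive motion is detected,
    otherwise None. Scores and threshold are illustrative placeholders."""
    if not motion_samples:
        return None
    repetitive_score = sum(motion_samples) / len(motion_samples)
    if repetitive_score > repetitive_threshold:
        return {
            "message": "Consider a short stretch break.",
            "actions": ["pause_without_exiting", "dismiss"],
        }
    return None

print(check_for_intervention([0.8, 0.9, 0.85]))  # triggers a break recommendation
```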
-
Patent number: 11975264
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the directional sound type of the indicated directional sound is generated.
Type: Grant
Filed: May 31, 2022
Date of Patent: May 7, 2024
Assignees: SONY INTERACTIVE ENTERTAINMENT LLC, SONY INTERACTIVE ENTERTAINMENT INC.
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Publication number: 20230381651
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the directional sound type of the indicated directional sound is generated.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Publication number: 20230381662
Abstract: Methods of the present disclosure may collect data when a user plays one or more different types of games when determinations are made as to whether the difficulty of a game should be changed. The collected data may be evaluated to identify whether a user gaming performance level corresponds to an expected level of performance. When the user gaming performance level does not correspond to an expected level of performance, parameters that change the difficulty of the game may be changed automatically. Parameters that relate to movement speed, delay or hesitation, character strengths, numbers of competitors, or other metrics may be changed incrementally until a current user performance level corresponds to an expectation level of a particular user currently playing the game. At this time, the user expectation level may be changed, and the process may be repeated as skills of the user are developed over time.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Victoria Dorn, Celeste Bean, Elizabeth Juenger, Kristie Ramirez, Sepideh Karimi, Mahdi Azmandian
-
Publication number: 20230381652
Abstract: Methods and systems for cooperative or coached gameplay in virtual environments are disclosed. Memory may store a content control profile regarding a set of control inputs associated with an action in a virtual environment of a digital content title. A request may be received from a set of one or more users associated with different source devices regarding cooperative gameplay of the digital content title. At least one virtual avatar may be generated for an interactive session of the digital content title in response to the request. A plurality of control inputs may be received from the plurality of different source devices and combined into a combination set of control inputs. Generating the combination set of control inputs may be based on the content control profile. Virtual actions associated with the virtual avatar may be controlled within the virtual environment in accordance with the combination set of control inputs.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Olga Rudi, Kristine Young, Mahdi Azmandian, Jin Zhang, Sarah Karp, Sepideh Karimi, Elizabeth Juenger