Patents by Inventor Kristie Ramirez
Kristie Ramirez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12377350
Abstract: A system uses map data of a virtual environment and object data of an avatar object including a moveset to generate an action path for traversal of the virtual environment by the avatar object. The system applies machine learning model(s) to generate one or more pathways through the virtual environment, and decompose the one or more pathways into one or more pathway segments that can be evaluated for feasibility and other factors. The system can iteratively generate and evaluate a move sequence of the action path having one or more parameters for execution by the avatar object. The system can construct the action path by combination of a plurality of pathway segments to satisfy one or more constraints associated with traversal by the avatar object from a start location to a destination location within the virtual environment.
Type: Grant
Filed: August 7, 2023
Date of Patent: August 5, 2025
Assignee: Sony Interactive Entertainment Inc.
Inventors: Alex Duplessie, Victoria Dorn, Celeste Bean, Ritagya Meharishi, Kristie Ramirez, Tooba Ahsen
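The abstract above describes chaining pathway segments, each evaluated for feasibility against the avatar's moveset, into a full action path. The sketch below is a minimal illustration of that idea under simplified 2D assumptions; the `Segment` dataclass, `is_feasible` check, and greedy chaining are hypothetical, not the patented implementation.

```python
# Minimal sketch (not the patented implementation): assemble an action path from
# pathway segments, where each segment is checked against a hypothetical avatar
# moveset for feasibility.
from dataclasses import dataclass

@dataclass
class Segment:
    start: tuple          # (x, y) start coordinate of the segment
    end: tuple            # (x, y) end coordinate of the segment
    required_move: str    # move the avatar must perform to traverse it

def is_feasible(segment: Segment, moveset: set[str]) -> bool:
    """A segment is traversable only if the avatar's moveset contains the move."""
    return segment.required_move in moveset

def build_action_path(segments: list[Segment], moveset: set[str],
                      start: tuple, destination: tuple) -> list[Segment]:
    """Greedily chain feasible segments from the start to the destination."""
    path, current = [], start
    for seg in segments:
        if seg.start == current and is_feasible(seg, moveset):
            path.append(seg)
            current = seg.end
        if current == destination:
            return path
    raise ValueError("no feasible action path satisfies the constraints")

# Example: the second segment requires a double jump from the moveset.
segments = [
    Segment((0, 0), (5, 0), "run"),
    Segment((5, 0), (5, 4), "double_jump"),
]
print(build_action_path(segments, {"run", "double_jump"}, (0, 0), (5, 4)))
```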
-
Patent number: 12274940
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the type of the indicated directional sound is generated.
Type: Grant
Filed: April 17, 2024
Date of Patent: April 15, 2025
Assignees: Sony Interactive Entertainment LLC, Sony Interactive Entertainment Inc.
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
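As a rough illustration of the decision described above, the sketch below checks whether the player's point of focus moves toward a directional sound's source location and, if it does not, flags that a virtual effect indicator should be generated. The 3D coordinates and function names are assumptions for illustration only.

```python
# Minimal sketch (assumptions, not the claimed implementation): decide whether to
# show a visual indicator for a directional sound based on whether the player's
# gaze shifted toward the sound's source location after the sound played.
import math

def moved_toward(gaze_before: tuple, gaze_after: tuple, source: tuple) -> bool:
    """True if the point of focus got closer to the sound's source location."""
    return math.dist(gaze_after, source) < math.dist(gaze_before, source)

def should_show_indicator(gaze_before: tuple, gaze_after: tuple, source: tuple) -> bool:
    # If gaze did not move toward the source, the player likely missed the
    # directional sound, so a virtual effect indicator should be generated.
    return not moved_toward(gaze_before, gaze_after, source)

# Player keeps looking away from a sound emitted at (10, 0, 2): show the indicator.
print(should_show_indicator((0, 0, 0), (0, 1, 0), (10, 0, 2)))  # True
```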
-
Publication number: 20250114708
Abstract: Systems and methods for testing a non-player character (NPC) for use in a video game are described. One of the methods includes generating parameters that define characteristics of the NPC. The characteristics include data that define visual features for the NPC, physical features for the NPC, and context features for a test video game for the NPC. The method further includes providing the parameters to an artificial intelligence (AI) model. The method includes activating an interactive version of the NPC in the test video game and introducing a quality assurance NPC into the test video game. The quality assurance NPC is programmed to interact with the interactive version of the NPC in the test video game and generate test metrics regarding the interaction. The method includes modifying the interactive version of the NPC responsive to the test metrics.
Type: Application
Filed: October 5, 2023
Publication date: April 10, 2025
Inventors: Celeste Bean, Tooba Ahsen, Kristie Ramirez, Victoria Dorn
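A minimal sketch of the test loop described above, with the AI-driven interaction replaced by a stub: a quality assurance routine produces test metrics, and the NPC parameters are modified until the metrics clear a target. The `response_quality` metric, `dialogue_depth` parameter, and all function names are assumptions.

```python
# Minimal sketch (hypothetical names throughout): the QA NPC interacts with the
# NPC under test, produces metrics, and the NPC parameters are adjusted in
# response until the metrics meet a target.
import random

def run_interaction(npc_params: dict) -> dict:
    """Stub for the quality assurance NPC's interaction; returns test metrics."""
    return {"response_quality": random.random() * npc_params["dialogue_depth"]}

def test_and_tune_npc(npc_params: dict, target: float, max_rounds: int = 10) -> dict:
    for _ in range(max_rounds):
        metrics = run_interaction(npc_params)
        if metrics["response_quality"] >= target:
            break
        # Modify the interactive version of the NPC responsive to the metrics.
        npc_params["dialogue_depth"] += 1
    return npc_params

print(test_and_tune_npc({"dialogue_depth": 1}, target=0.8))
```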
-
Publication number: 20250058227
Abstract: Systems and methods for providing gameplay assistance are described. One of the methods includes monitoring gameplay of a user of a game. The monitoring occurs to identify interactive skills of gameplay by the user during a session of the game. The method further includes determining that the interactive skills of gameplay have fallen below a threshold level for progressing the game and initiating gameplay assistance responsive to the interactive skills falling below the threshold level. The gameplay assistance includes a blended bifurcation of user inputs to complete one or more interactive tasks of the game. The blended bifurcation of user inputs includes an amount of assistance inputs that override selected ones of the user inputs. The amount of assistance inputs varies over time during the gameplay of the game to maintain the interactive skills of the gameplay above the threshold level of interactive skills for playing the game.
Type: Application
Filed: August 15, 2023
Publication date: February 20, 2025
Inventors: Celeste Bean, Bethany Tinklenberg, Kristie Ramirez
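The blended bifurcation described above can be pictured as overriding a fraction of the user's inputs, where that fraction grows as measured skill drops below the threshold. The sketch below is one possible reading, with illustrative field names and a simple every-Nth-input override rule; it is not the disclosed method.

```python
# Minimal sketch under stated assumptions: blend user inputs with assistance
# inputs, overriding a fraction of the user's inputs that grows as measured
# skill falls below a threshold.
def assistance_ratio(skill: float, threshold: float) -> float:
    """Fraction of inputs to override; 0 when skill meets the threshold."""
    return max(0.0, min(1.0, (threshold - skill) / threshold))

def blend_inputs(user_inputs: list[dict], assist_inputs: list[dict],
                 skill: float, threshold: float) -> list[dict]:
    ratio = assistance_ratio(skill, threshold)
    override_every = int(1 / ratio) if ratio > 0 else None
    blended = []
    for i, user_in in enumerate(user_inputs):
        if override_every and i % override_every == 0:
            blended.append(assist_inputs[i])   # assistance overrides this input
        else:
            blended.append(user_in)
    return blended

user = [{"aim": 0.2}] * 6
assist = [{"aim": 0.9}] * 6
print(blend_inputs(user, assist, skill=0.3, threshold=0.6))  # every other input assisted
```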
-
Publication number: 20250056091
Abstract: Systems and methods for automated visual trigger profiling and detection within virtual environments are provided. A visual trigger profile may be stored in memory that includes a set of visual trigger characteristics associated with a type of visual sensitivity. Buffered frames of an audiovisual stream that have not yet been displayed may be monitored to identify when a buffered frame includes a threshold level of the visual trigger characteristics associated with the visual sensitivity. A frame modification that decreases the level of the detected visual trigger characteristics associated with the visual sensitivity may be identified and applied to the identified frames. The modified frames may thereafter be presented during the audiovisual stream in place of the original (unmodified) identified frames.
Type: Application
Filed: October 24, 2024
Publication date: February 13, 2025
Inventors: Celeste Bean, Kristie Ramirez, Elizabeth Juenger, Steven Osman, Olga Rudi
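As a simplified illustration of monitoring buffered frames for a visual trigger, the sketch below represents each frame by a mean luminance value and attenuates sudden jumps before display. The single-value representation, threshold, and dimming factor are assumptions; an actual visual trigger profile would cover more characteristics than luminance.

```python
# Minimal sketch (assumed frame representation): each buffered frame carries a
# mean luminance value; a frame whose luminance jump exceeds a profile threshold
# is treated as a visual trigger (e.g. a rapid flash) and dimmed before display.
def detect_and_modify(buffered_frames: list[float], threshold_jump: float,
                      dim_factor: float = 0.5) -> list[float]:
    output = [buffered_frames[0]]
    for prev, curr in zip(buffered_frames, buffered_frames[1:]):
        if curr - prev > threshold_jump:
            # Frame modification decreases the detected trigger characteristic.
            output.append(prev + (curr - prev) * dim_factor)
        else:
            output.append(curr)
    return output

frames = [0.2, 0.25, 0.95, 0.3]                        # sudden bright flash in frame 3
print(detect_and_modify(frames, threshold_jump=0.4))   # the flash is attenuated
```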
-
Publication number: 20250050217
Abstract: A system uses map data of a virtual environment and object data of an avatar object including a moveset to generate an action path for traversal of the virtual environment by the avatar object. The system applies machine learning model(s) to generate one or more pathways through the virtual environment, and decompose the one or more pathways into one or more pathway segments that can be evaluated for feasibility and other factors. The system can iteratively generate and evaluate a move sequence of the action path having one or more parameters for execution by the avatar object. The system can construct the action path by combination of a plurality of pathway segments to satisfy one or more constraints associated with traversal by the avatar object from a start location to a destination location within the virtual environment.
Type: Application
Filed: August 7, 2023
Publication date: February 13, 2025
Inventors: Alex Duplessie, Victoria Dorn, Celeste Bean, Ritagya Meharishi, Kristie Ramirez, Tooba Ahsen
-
Publication number: 20250041720
Abstract: Systems and methods for dynamic content display generation for multiple device setups are provided. Supplemental content files associated with an interactive content title may be stored in memory in association with parameters specific to different types of user devices. A current interactive session associated with a first device may be monitored and determined to include a virtual environment presentation associated with the interactive content title. A type of a secondary device associated with the user may be identified. A supplemental content package may be generated for the secondary device based on a selected set of supplemental content and a set of the parameters associated with the type of the secondary device. The supplemental content package may be provided to the secondary device during the current interactive session, and the secondary device may generate a presentation of the selected set of supplemental content files in accordance with the set of parameters.
Type: Application
Filed: August 3, 2023
Publication date: February 6, 2025
Inventors: Ellana Fortuna, Kristie Ramirez, Elizabeth Juenger, Victoria Dorn, Celeste Bean, Bethany Tinklenberg, Mahdi Azmandian, Sarah Karp, Ritagya Meharishi, Xi Zhou
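A minimal sketch of packaging supplemental content for a secondary device, under the assumption that each device type maps to a small set of display parameters; the `DEVICE_PARAMS` table, file names, and field names are hypothetical.

```python
# Minimal sketch with hypothetical device parameters: select supplemental content
# for a secondary device and package it with the parameters stored for that
# device type, as the abstract describes at a high level.
DEVICE_PARAMS = {
    "phone":  {"resolution": (1080, 2400), "layout": "vertical"},
    "tablet": {"resolution": (2048, 1536), "layout": "split"},
}

def build_supplemental_package(content_files: list[str], device_type: str) -> dict:
    params = DEVICE_PARAMS[device_type]
    return {
        "files": content_files,   # selected set of supplemental content files
        "parameters": params,     # parameters specific to the secondary device type
    }

# During a session on the primary device, push a map overlay to the user's phone.
print(build_supplemental_package(["map_overlay.png", "stats.json"], "phone"))
```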
-
Publication number: 20250032915
Abstract: A method for generating a view of an event in a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to identify an event occurring in a scene depicted in the 2D gameplay video and identifying one or more elements involved in said event; further analyzing the 2D gameplay video to determine 3D geometry of the scene; using the 3D geometry of the scene to generate a 3D video asset of the event that occurred in the gameplay video; generating a 2D view of the 3D video asset for presentation on a display, wherein generating said 2D view includes determining a field of view (FOV) to apply for the 2D view, the FOV being configured to include the elements involved in the event.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
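The field-of-view step described above can be illustrated in simplified 2D: given a camera position and facing direction, choose an FOV wide enough that every element involved in the event falls inside the view. The geometry below is an assumption-laden sketch, not the claimed method.

```python
# Minimal sketch (simplified 2D geometry, illustrative only): compute the
# smallest horizontal FOV that keeps all event elements inside the 2D view.
import math

def required_fov(camera: tuple, facing_deg: float, elements: list[tuple],
                 margin_deg: float = 5.0) -> float:
    """Smallest horizontal FOV (degrees) that contains all element positions."""
    half_angles = []
    for ex, ey in elements:
        angle = math.degrees(math.atan2(ey - camera[1], ex - camera[0]))
        offset = abs((angle - facing_deg + 180) % 360 - 180)  # angular offset from facing
        half_angles.append(offset)
    return 2 * max(half_angles) + margin_deg

# Two characters involved in the event, camera at the origin looking along +x.
print(required_fov((0, 0), 0.0, [(10, 2), (10, -4)]))
```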
-
Publication number: 20250036102
Abstract: A method for generating a physical object is provided, including: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to identify a virtual object depicted in the 2D gameplay video; further analyzing the 2D gameplay video to determine 3D geometry of the virtual object; using the 3D geometry of the object to generate a 3D model of the virtual object; storing the 3D model to a user account; using the 3D model to generate a physical object resembling the virtual object.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
-
Publication number: 20250032914
Abstract: A method for generating a three-dimensional (3D) content moment from a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to determine 3D geometry of a scene depicted in the 2D gameplay video; using the 3D geometry of the scene to generate a 3D video asset of a moment that occurred in the gameplay video; storing the 3D video asset to a user account.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
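At a high level, the method above (like the related 2D-to-3D entries nearby) is a capture, geometry estimation, asset generation, and storage pipeline. The sketch below only stubs each stage; the geometry estimation in particular would be handled by an external model, and every function name here is hypothetical.

```python
# Minimal sketch of the high-level pipeline only; each step is a stub standing in
# for the analysis described in the abstract.
def capture_gameplay_video(session_id: str) -> list[str]:
    return [f"frame_{i}.png" for i in range(3)]        # captured 2D gameplay frames

def estimate_scene_geometry(frames: list[str]) -> dict:
    return {"vertices": [], "faces": []}               # stubbed 3D geometry of the scene

def build_3d_video_asset(geometry: dict, frames: list[str]) -> dict:
    return {"geometry": geometry, "texture_frames": frames}

def store_to_account(asset: dict, user_account: str) -> None:
    print(f"stored 3D asset with {len(asset['texture_frames'])} frames for {user_account}")

frames = capture_gameplay_video("session-42")
asset = build_3d_video_asset(estimate_scene_geometry(frames), frames)
store_to_account(asset, "player-001")
```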
-
Publication number: 20250037362
Abstract: A method for generating a three-dimensional (3D) content moment from a video game is provided, including the following operations: capturing two-dimensional (2D) gameplay video generated from a session of a video game; analyzing the 2D gameplay video to determine 3D geometry of a scene depicted in the 2D gameplay video; using the 3D geometry of the scene to generate a 3D still asset of a moment that occurred in the gameplay video; storing the 3D still asset to a user account.
Type: Application
Filed: July 28, 2023
Publication date: January 30, 2025
Inventors: Ellana Fortuna, Elizabeth Juenger, Kristie Ramirez, Geoff Norton
-
Patent number: 12168175
Abstract: A method for gaming including generating game state data during execution of a video game for a game play of a player. The method includes determining a gaming context of a current point in the game play based on the game state data. The method includes inputting the game state data and the gaming context to an artificial intelligence (AI) model trained to identify one or more levels of user immersion defining user engagement with the video game. The method includes using the AI model to determine a level of user immersion for the current point in the game play. When the level of immersion exceeds a threshold indicating that the player is highly engaged with the video game, the method automatically generates an indicator that is presented to the environment surrounding the player, providing notification that the player should not be interrupted.
Type: Grant
Filed: May 27, 2022
Date of Patent: December 17, 2024
Assignee: Sony Interactive Entertainment LLC
Inventors: Sarah Karp, Kristie Ramirez
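One way to picture the thresholding step above: a stubbed immersion score stands in for the trained AI model, and crossing the threshold flips a do-not-disturb indicator for the player's surroundings. The context labels, threshold value, and indicator strings are assumptions.

```python
# Minimal sketch (the AI model is replaced by a stub score): when the predicted
# immersion level for the current game state exceeds a threshold, emit a
# do-not-disturb indicator for the environment surrounding the player.
def predict_immersion(game_state: dict, context: str) -> float:
    """Stand-in for the trained AI model; returns an immersion level in [0, 1]."""
    return 0.9 if context == "boss_fight" else 0.3

def update_presence_indicator(game_state: dict, context: str,
                              threshold: float = 0.7) -> str:
    level = predict_immersion(game_state, context)
    if level > threshold:
        # e.g. light a "do not disturb" lamp or post a status to nearby devices
        return "DO_NOT_DISTURB"
    return "AVAILABLE"

print(update_presence_indicator({"hp": 12, "enemies": 3}, "boss_fight"))
```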
-
Patent number: 12160634
Abstract: Systems and methods for automated visual trigger profiling and detection within virtual environments are provided. A visual trigger profile may be stored in memory that includes a set of visual trigger characteristics associated with a type of visual sensitivity. Buffered frames of an audiovisual stream that have not yet been displayed may be monitored to identify when a buffered frame includes a threshold level of the visual trigger characteristics associated with the visual sensitivity. A frame modification that decreases the level of the detected visual trigger characteristics associated with the visual sensitivity may be identified and applied to the identified frames. The modified frames may thereafter be presented during the audiovisual stream in place of the original (unmodified) identified frames.
Type: Grant
Filed: May 31, 2022
Date of Patent: December 3, 2024
Assignees: Sony Interactive Entertainment LLC, Sony Interactive Entertainment Inc.
Inventors: Celeste Bean, Kristie Ramirez, Elizabeth Juenger, Steve Osman, Olga Rudi
-
Patent number: 12151167
Abstract: Methods of the present disclosure may collect data while a user plays one or more different types of games and use that data to determine whether the difficulty of a game should be changed. The collected data may be evaluated to identify whether a user gaming performance level corresponds to an expected level of performance. When the user gaming performance level does not correspond to the expected level of performance, parameters that change the difficulty of the game may be changed automatically. Parameters that relate to movement speed, delay or hesitation, character strengths, numbers of competitors, or other metrics may be changed incrementally until the current user performance level corresponds to the expectation level of the particular user currently playing the game. At that time, the user expectation level may be changed, and the process may be repeated as the skills of the user develop over time.
Type: Grant
Filed: May 31, 2022
Date of Patent: November 26, 2024
Assignees: Sony Interactive Entertainment LLC, Sony Interactive Entertainment Inc.
Inventors: Victoria Dorn, Celeste Bean, Elizabeth Juenger, Kristie Ramirez, Sepideh Karimi, Mahdi Azmandian
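The incremental adjustment described above can be sketched as a small feedback loop: measure performance, compare it to the expected level, and nudge a difficulty parameter until the two agree within a tolerance. `measure_performance`, the parameter names, and the step rule below are stand-ins for the collected gameplay data and are not the disclosed method.

```python
# Minimal sketch under stated assumptions: nudge difficulty parameters in small
# increments until measured performance lands inside a band around the expected
# level for this user.
def measure_performance(params: dict) -> float:
    """Stub: harder settings (faster enemies, more competitors) lower the score."""
    return max(0.0, 1.0 - 0.1 * params["enemy_speed"] - 0.05 * params["competitors"])

def tune_difficulty(params: dict, expected: float, tolerance: float = 0.05,
                    max_steps: int = 20) -> dict:
    for _ in range(max_steps):
        perf = measure_performance(params)
        if abs(perf - expected) <= tolerance:
            break
        step = 1 if perf > expected else -1   # performing too well -> make it harder
        params["enemy_speed"] = max(0, params["enemy_speed"] + step)
    return params

print(tune_difficulty({"enemy_speed": 2, "competitors": 4}, expected=0.5))
```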
-
Publication number: 20240269559
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the type of the indicated directional sound is generated.
Type: Application
Filed: April 17, 2024
Publication date: August 15, 2024
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Patent number: 11975264
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the type of the indicated directional sound is generated.
Type: Grant
Filed: May 31, 2022
Date of Patent: May 7, 2024
Assignees: Sony Interactive Entertainment LLC, Sony Interactive Entertainment Inc.
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Publication number: 20230388584
Abstract: Systems and methods for automated visual trigger profiling and detection within virtual environments are provided. A visual trigger profile may be stored in memory that includes a set of visual trigger characteristics associated with a type of visual sensitivity. Buffered frames of an audiovisual stream that have not yet been displayed may be monitored to identify when a buffered frame includes a threshold level of the visual trigger characteristics associated with the visual sensitivity. A frame modification that decreases the level of the detected visual trigger characteristics associated with the visual sensitivity may be identified and applied to the identified frames. The modified frames may thereafter be presented during the audiovisual stream in place of the original (unmodified) identified frames.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Celeste Bean, Kristie Ramirez, Elizabeth Juenger, Steve Osman, Olga Rudi
-
Publication number: 20230381651
Abstract: A method and system for providing gaze-based generation of virtual effects indicators correlated with directional sounds is disclosed. Gaze data is tracked via a camera associated with a client device to identify a point of focus within a three-dimensional virtual environment towards which one or both eyes of the player are focused. When the point of focus indicated by the gaze data does not move towards the source location within the three-dimensional virtual environment when the directional sound is received, a virtual effect indicator associated with the type of the indicated directional sound is generated.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Kristie Ramirez, Elizabeth Juenger, Katie Egeland, Sepideh Karimi, Lachmin Singh, Olga Rudi
-
Publication number: 20230381673
Abstract: The present disclosure generally relates to providing virtual education to a user. More specifically, the present system relates to educating and onboarding spectators of electronic sports (eSports) events. The onboarding activities are used to further engage the spectators with the eSports event in general, as well as with the game played during the eSports event. In other aspects, the eSports onboarding activity may be modified based on the type of game being played, the user's experience with the specific game or game genre, and other user preferences.
Type: Application
Filed: May 31, 2022
Publication date: November 30, 2023
Inventors: Mahdi Azmandian, Victoria Dorn, Sarah Karp, Sudha Krishnamurthy, Kristie Ramirez
-
Publication number: 20230381649
Abstract: A method for gaming including generating game state data during execution of a video game for a game play of a player. The method includes determining a gaming context of a current point in the game play based on the game state data. The method includes inputting the game state data and the gaming context to an artificial intelligence (AI) model trained to identify one or more levels of user immersion defining user engagement with the video game. The method includes using the AI model to determine a level of user immersion for the current point in the game play. When the level of immersion exceeds a threshold indicating that the player is highly engaged with the video game, the method automatically generates an indicator that is presented to the environment surrounding the player, providing notification that the player should not be interrupted.
Type: Application
Filed: May 27, 2022
Publication date: November 30, 2023
Inventors: Sarah Karp, Kristie Ramirez