Patents by Inventor Matthaeus KRENN

Matthaeus KRENN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12266061
    Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
    Type: Grant
    Filed: November 17, 2022
    Date of Patent: April 1, 2025
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Matthaeus Krenn, Jeremy Edelblut, John Nicholas Jitkoff
  • Publication number: 20250080830
    Abstract: A computer system displays virtual objects overlaid on a view of a physical environment as a virtual effect. The computer system displays respective animated movements of the virtual objects over the view of the physical environment, wherein the respective animated movements are constrained in accordance with a direction of simulated gravity associated with the view of the physical environment. If current positions of the virtual objects during the respective animated movement correspond to different surfaces at different heights detected in the view of the physical environment, the computer system constrains the respective animated movements of the virtual objects in accordance with the different surfaces detected in the view of the physical environment.
    Type: Application
    Filed: October 28, 2024
    Publication date: March 6, 2025
    Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
  • Patent number: 12131417
    Abstract: A computer system displays virtual objects overlaid on a view of a physical environment as a virtual effect. The computer system displays respective animated movements of the virtual objects over the view of the physical environment, wherein the respective animated movements are constrained in accordance with a direction of simulated gravity associated with the view of the physical environment. If current positions of the virtual objects during the respective animated movement correspond to different surfaces at different heights detected in the view of the physical environment, the computer system constrains the respective animated movements of the virtual objects in accordance with the different surfaces detected in the view of the physical environment.
    Type: Grant
    Filed: November 8, 2023
    Date of Patent: October 29, 2024
    Assignee: APPLE INC.
    Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
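    The gravity-and-surface constraint described in the two abstracts above can be illustrated with a minimal sketch. This is not the patented implementation; the `settle` function, its surface model, and all parameter values are assumptions for illustration only.

    ```python
    # A minimal sketch (not the patented implementation) of animating virtual
    # objects under simulated gravity while constraining them to detected
    # surfaces at different heights: each object falls until it reaches the
    # highest surface beneath its horizontal position.

    def settle(objects, surfaces, gravity=9.8, dt=0.1, steps=100):
        """objects: dicts with 'x' and 'y' (height in meters).
        surfaces: dicts with 'x_min', 'x_max', 'height'. Mutates and
        returns objects with their settled heights."""
        def floor_under(x):
            # Highest detected surface under this horizontal position;
            # fall back to ground level (0.0) where nothing was detected.
            heights = [s["height"] for s in surfaces
                       if s["x_min"] <= x <= s["x_max"]]
            return max(heights, default=0.0)

        for obj in objects:
            obj.setdefault("vy", 0.0)
        for _ in range(steps):
            for obj in objects:
                floor = floor_under(obj["x"])
                if obj["y"] > floor:
                    obj["vy"] -= gravity * dt            # simulated gravity
                    obj["y"] = max(obj["y"] + obj["vy"] * dt, floor)
                else:
                    obj["y"], obj["vy"] = floor, 0.0     # constrained by surface
        return objects
    ```

    Under this sketch, an object dropped above a detected table settles on the table top, while an object dropped beside the table settles on the floor, matching the different-surfaces-at-different-heights behavior the abstracts describe.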
  • Publication number: 20240265656
    Abstract: A computer implemented method for facilitating system user interface (UI) interactions in an artificial reality (XR) environment is provided. The method includes rendering the system UI in the XR environment as a 3D virtual element. The method further includes tracking a position of a hand of a user and a pre-defined stable point on the user. The method further includes identifying, based on the tracking, that the hand has grasped a portion of the system UI and, in response, rotating the position of the system UI around the grasped portion such that a line between the stable point and the surface of the system UI is moved to be perpendicular, or at a predefined angle from perpendicular, to the surface of the system UI as the user moves the system UI via the grasped portion.
    Type: Application
    Filed: February 8, 2023
    Publication date: August 8, 2024
    Inventors: Anastasia VICTOR-FAICHNEY, Matthew Alan INSLEY, Samuel Matthew LEVATICH, Ahad Habib BASRAVI, Matthew LUTHER, Andrew C. JOHNSON, Matthaeus KRENN, Difei WANG, Irvi STEFO, Norah Riley SMITH
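    The grasp-and-rotate behavior in the abstract above can be sketched in two dimensions. This is a hypothetical illustration, not the claimed method; `face_stable_point` and its angle conventions are assumptions.

    ```python
    import math

    # A 2D sketch (not the claimed method) of re-orienting a grasped planar
    # system UI so the line from a pre-defined stable point (e.g., the
    # user's head) to the panel stays perpendicular to the panel's surface
    # as the grasped point moves.

    def face_stable_point(grasp, stable, offset_deg=0.0):
        """Return the panel surface's angle in degrees so that its normal
        at the grasped point aims back at the stable point, optionally
        offset by a predefined angle from perpendicular."""
        nx, ny = stable[0] - grasp[0], stable[1] - grasp[1]
        normal_deg = math.degrees(math.atan2(ny, nx))
        # The surface lies perpendicular to its own normal.
        return (normal_deg + 90.0 + offset_deg) % 360.0
    ```

    Calling this each frame with the updated grasp position keeps the panel facing the user as it is dragged, which is the readability property the abstract's perpendicular-line constraint provides.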
  • Publication number: 20240264660
    Abstract: A computer implemented method for facilitating user interface interactions in an XR environment is provided. The method includes rendering a system UI and tracking a position of a user's hand. The method further includes signifying an interaction opportunity by generating first feedback that modifies a UI element based on the position of the user's hand being within a first threshold distance of the UI element, or by generating second feedback that accentuates an edge of the system UI based on the position of the user's hand being within a second threshold distance of the edge. Furthermore, the method includes updating the position of the user's hand. The method further includes signifying interaction with the UI element by modifying the location of the representation of the user's hand when the user's hand has interacted with the UI element, or signifying interaction with the edge by generating third feedback that accentuates the portion of the representation that grabs the edge.
    Type: Application
    Filed: February 8, 2023
    Publication date: August 8, 2024
    Inventors: Samuel Matthew LEVATICH, Matthew Alan INSLEY, Andrew C. JOHNSON, Qi XIONG, Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF, Jennifer MORROW, Brandon FURTWANGLER
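    The threshold-distance feedback selection described above can be reduced to a small dispatch function. This is a minimal sketch, not the published method; the threshold values and feedback names are illustrative assumptions.

    ```python
    # A minimal sketch (not the published method) of signifying an
    # interaction opportunity based on hand proximity: first feedback when
    # the hand is within a first threshold distance of a UI element, second
    # feedback when it is within a second threshold distance of the system
    # UI's edge.

    def hover_feedback(dist_to_element, dist_to_edge,
                       element_threshold=0.10, edge_threshold=0.05):
        """Distances in meters; returns which feedback, if any, to render."""
        if dist_to_element <= element_threshold:
            return "modify_element"      # first feedback: modify the UI element
        if dist_to_edge <= edge_threshold:
            return "accentuate_edge"     # second feedback: accentuate the edge
        return None                      # no interaction opportunity to signify
    ```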
  • Publication number: 20240160337
    Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
    Type: Application
    Filed: January 25, 2024
    Publication date: May 16, 2024
    Inventors: Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF
  • Patent number: 11928314
    Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
    Type: Grant
    Filed: November 17, 2022
    Date of Patent: March 12, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Jeremy Edelblut, Matthaeus Krenn, John Nicholas Jitkoff
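    The website-and-world tab pairing in the virtual web browser abstracts can be sketched as a small data model. This is a hypothetical model, not the patented design; the class and method names are assumptions.

    ```python
    # A hypothetical data model (not the patented design) for browser tabs
    # that pair a website with a virtual world, where selecting a tab's
    # control instantiates the world's 3D content, and where interacting
    # with an in-world object can auto-generate a tab for another world.

    class WorldTab:
        def __init__(self, url, world_id):
            self.url = url              # the website half of the pair
            self.world_id = world_id    # the virtual-world half of the pair
            self.instantiated = False

        def select(self):
            # Selecting the tab's control instantiates the world's 3D content.
            self.instantiated = True
            return self.world_id


    class WorldBrowser:
        def __init__(self):
            self.tabs = []

        def open_tab(self, url, world_id):
            tab = WorldTab(url, world_id)
            self.tabs.append(tab)
            return tab

        def tab_from_object(self, obj):
            # An interaction with an in-world object auto-generates a tab,
            # facilitating travel to the world the object corresponds to.
            return self.open_tab(obj["url"], obj["world_id"])
    ```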
  • Publication number: 20240054996
    Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
    Type: Application
    Filed: October 23, 2023
    Publication date: February 15, 2024
    Inventors: Marcos Regis VESCOVI, Eric M. G. CIRCLAEYS, Richard WARREN, Jeffrey Traer BERNSTEIN, Matthaeus KRENN
  • Patent number: 11900923
    Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
    Type: Grant
    Filed: September 7, 2021
    Date of Patent: February 13, 2024
    Assignee: Apple Inc.
    Inventors: Marcos Regis Vescovi, Eric M. G. Circlaeys, Richard Warren, Jeffrey Traer Bernstein, Matthaeus Krenn
  • Publication number: 20230418442
    Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
    Type: Application
    Filed: November 17, 2022
    Publication date: December 28, 2023
    Inventors: Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF
  • Publication number: 20230419618
    Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
    Type: Application
    Filed: November 17, 2022
    Publication date: December 28, 2023
    Inventors: Matthaeus KRENN, Jeremy EDELBLUT, John Nicholas JITKOFF
  • Publication number: 20230419617
    Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
    Type: Application
    Filed: July 19, 2022
    Publication date: December 28, 2023
    Inventors: Matthaeus KRENN, Jeremy EDELBLUT, John Nicholas JITKOFF
  • Patent number: 11854539
    Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
    Type: Grant
    Filed: August 11, 2020
    Date of Patent: December 26, 2023
    Assignee: Apple Inc.
    Inventors: Marcos Regis Vescovi, Eric M. G. Circlaeys, Richard Warren, Jeffrey Traer Bernstein, Matthaeus Krenn
  • Publication number: 20230393705
    Abstract: An electronic device displays a messaging interface that allows a participant in a message conversation to capture, send, and/or play media content. The media content includes images, video, and/or audio. The media content is captured, sent, and/or played based on the electronic device detecting one or more conditions.
    Type: Application
    Filed: August 23, 2023
    Publication date: December 7, 2023
    Inventor: Matthaeus KRENN
  • Patent number: 11818455
    Abstract: A first device sends a request to a second device to initiate a shared annotation session. In response to receiving acceptance of the request, a first prompt to move the first device toward the second device is displayed. In accordance with a determination that connection criteria for the first device and the second device are met, a representation of a field of view of the camera(s) of the first device is displayed in the shared annotation session with the second device. During the shared annotation session, one or more annotations are displayed via the first display generation component, and one or more second virtual annotations, corresponding to annotation input directed to a respective location in the physical environment by the second device, are displayed via the first display generation component, provided that the respective location is included in the field of view of the first set of cameras.
    Type: Grant
    Filed: February 8, 2023
    Date of Patent: November 14, 2023
    Assignee: APPLE INC.
    Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
  • Publication number: 20230305674
    Abstract: A computer system displays, in a first viewing mode, a simulated environment that is oriented relative to a physical environment of the computer system. In response to detecting a first change in attitude, the computer system changes an appearance of a first virtual user interface object so as to maintain a fixed spatial relationship between the first virtual user interface object and the physical environment. The computing system detects a gesture. In response to detecting a second change in attitude, in accordance with a determination that the gesture met mode change criteria, the computer system transitions from displaying the simulated environment in the first viewing mode to displaying the simulated environment in a second viewing mode. Displaying the virtual model in the simulated environment in the second viewing mode includes forgoing changing the appearance of the first virtual user interface object to maintain the fixed spatial relationship.
    Type: Application
    Filed: April 27, 2023
    Publication date: September 28, 2023
    Inventors: Mark K. Hauenstein, Joseph A. Malia, Julian K. Missig, Matthaeus Krenn, Jeffrey T. Bernstein
  • Patent number: 11755180
    Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
    Type: Grant
    Filed: August 18, 2022
    Date of Patent: September 12, 2023
    Inventors: Jeremy Edelblut, Matthaeus Krenn, John Nicholas Jitkoff
  • Patent number: 11740755
    Abstract: A computer system while displaying an augmented reality environment, concurrently displays: a representation of at least a portion of a field of view of one or more cameras that includes a physical object, and a virtual user interface object at a location in the representation of the field of view, where the location is determined based on the respective physical object in the field of view. While displaying the augmented reality environment, in response to detecting an input that changes a virtual environment setting for the augmented reality environment, the computer system adjusts an appearance of the virtual user interface object in accordance with the change made to the virtual environment setting and applies to at least a portion of the representation of the field of view a filter selected based on the change made to the virtual environment setting.
    Type: Grant
    Filed: September 28, 2021
    Date of Patent: August 29, 2023
    Assignee: APPLE INC.
    Inventors: Mark K. Hauenstein, Joseph A. Malia, Julian K. Missig, Matthaeus Krenn, Jeffrey T. Bernstein
  • Publication number: 20230252659
    Abstract: The present disclosure generally relates to displaying and editing an image with depth information. In response to an input, an object in the image having one or more elements in a first depth range is identified. The identified object is then isolated from the other elements in the image and displayed separately from them. The isolated object may then be utilized in different applications.
    Type: Application
    Filed: April 20, 2023
    Publication date: August 10, 2023
    Inventors: Matan STAUBER, Amir HOFFNUNG, Matthaeus KRENN, Jeffrey Traer BERNSTEIN, Joseph A. MALIA, Mark HAUENSTEIN
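    The depth-range isolation described in the abstract above can be illustrated with a minimal sketch. This is not Apple's implementation; the list-of-lists image model and the function name are illustrative assumptions.

    ```python
    # A minimal sketch (not Apple's implementation) of isolating an object
    # by depth: keep only the pixels whose depth values fall inside a first
    # depth range, replacing everything else, so the isolated object can be
    # displayed separately from the other elements of the image.

    def isolate_by_depth(pixels, depths, near, far, background=None):
        """pixels and depths are parallel 2D grids; keep pixels whose depth
        is within [near, far] and replace the rest with `background`."""
        return [
            [px if near <= d <= far else background
             for px, d in zip(pixel_row, depth_row)]
            for pixel_row, depth_row in zip(pixels, depths)
        ]
    ```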
  • Publication number: 20230199296
    Abstract: A first device sends a request to a second device to initiate a shared annotation session. In response to receiving acceptance of the request, a first prompt to move the first device toward the second device is displayed. In accordance with a determination that connection criteria for the first device and the second device are met, a representation of a field of view of the camera(s) of the first device is displayed in the shared annotation session with the second device. During the shared annotation session, one or more annotations are displayed via the first display generation component, and one or more second virtual annotations, corresponding to annotation input directed to a respective location in the physical environment by the second device, are displayed via the first display generation component, provided that the respective location is included in the field of view of the first set of cameras.
    Type: Application
    Filed: February 8, 2023
    Publication date: June 22, 2023
    Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn