Patents by Inventor Matthaeus KRENN
Matthaeus KRENN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12266061
Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
Type: Grant
Filed: November 17, 2022
Date of Patent: April 1, 2025
Assignee: Meta Platforms Technologies, LLC
Inventors: Matthaeus Krenn, Jeremy Edelblut, John Nicholas Jitkoff
-
Publication number: 20250080830
Abstract: A computer system displays virtual objects overlaid on a view of a physical environment as a virtual effect. The computer system displays respective animated movements of the virtual objects over the view of the physical environment, wherein the respective animated movements are constrained in accordance with a direction of simulated gravity associated with the view of the physical environment. If current positions of virtual objects during the respective animated movement of the virtual objects correspond to different surfaces at different heights detected in the view of the physical environment, the computer system constrains the respective animated movements of the virtual objects in accordance with the different surfaces detected in the view of the physical environment.
Type: Application
Filed: October 28, 2024
Publication date: March 6, 2025
Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
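The gravity-and-surfaces constraint this abstract describes can be pictured with a minimal sketch: each object accelerates along the simulated-gravity direction until it reaches the detected surface under its current position, so objects over a table settle at the table's height while objects over the floor settle lower. All names, the 1D surface-span model, and the physics constants below are illustrative assumptions, not the claimed method.

```python
def surface_height_at(x, surfaces):
    """Height of the highest detected surface under horizontal position x.
    `surfaces` is a list of (x_min, x_max, height) spans -- a stand-in for
    the planes a real scene-understanding pipeline would supply."""
    heights = [h for (lo, hi, h) in surfaces if lo <= x <= hi]
    return max(heights, default=0.0)

def step_fall(obj, surfaces, gravity=9.8, dt=1 / 60):
    """Advance one animation frame; clamp the object to its surface.
    `obj` is (x, y, vertical_velocity)."""
    x, y, vy = obj
    vy -= gravity * dt          # accelerate along simulated gravity
    y += vy * dt
    floor = surface_height_at(x, surfaces)
    if y <= floor:              # constrain movement to the detected surface
        y, vy = floor, 0.0
    return (x, y, vy)
```

Because the floor height is looked up per object, two objects dropped side by side come to rest at different heights when they land on different detected surfaces, matching the behavior the abstract describes.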
-
Patent number: 12131417
Abstract: A computer system displays virtual objects overlaid on a view of a physical environment as a virtual effect. The computer system displays respective animated movements of the virtual objects over the view of the physical environment, wherein the respective animated movements are constrained in accordance with a direction of simulated gravity associated with the view of the physical environment. If current positions of virtual objects during the respective animated movement of the virtual objects correspond to different surfaces at different heights detected in the view of the physical environment, the computer system constrains the respective animated movements of the virtual objects in accordance with the different surfaces detected in the view of the physical environment.
Type: Grant
Filed: November 8, 2023
Date of Patent: October 29, 2024
Assignee: APPLE INC.
Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
-
Publication number: 20240265656
Abstract: A computer implemented method for facilitating system user interface (UI) interactions in an artificial reality (XR) environment is provided. The method includes rendering the system UI in the XR environment as a 3D virtual element. The method further includes tracking a position of a hand of a user and a pre-defined stable point on the user. The method further includes identifying, based on the tracking, that the hand has grasped a portion of the system UI and, in response, rotating the system UI around the grasped portion such that a line between the stable point and the surface of the system UI is kept perpendicular, or at a predefined angle from perpendicular, to the surface of the system UI as the user moves the system UI via the grasped portion.
Type: Application
Filed: February 8, 2023
Publication date: August 8, 2024
Inventors: Anastasia VICTOR-FAICHNEY, Matthew Alan INSLEY, Samuel Matthew LEVATICH, Ahad Habib BASRAVI, Matthew LUTHER, Andrew C. JOHNSON, Matthaeus KRENN, Difei WANG, Irvi STEFO, Norah Riley SMITH
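The geometric idea in this abstract, keeping the line from a stable point on the user to the UI surface perpendicular to that surface as the UI is dragged, reduces in 2D to pointing the surface normal along the view line. A minimal sketch, with all names and the 2D simplification being assumptions for illustration:

```python
import math

def ui_angle_facing(stable_point, ui_center):
    """Angle (radians) the system UI's surface normal should take so that
    the line from the stable point (e.g. near the user's eyes) to the UI
    center is perpendicular to the UI surface."""
    dx = ui_center[0] - stable_point[0]
    dy = ui_center[1] - stable_point[1]
    return math.atan2(dy, dx)   # normal aligned with the view line

def reorient_while_grasped(stable_point, ui_center, offset=0.0):
    """As the user moves the grasped UI, recompute its orientation each
    frame; `offset` models the 'predefined angle from perpendicular'."""
    return ui_angle_facing(stable_point, ui_center) + offset
```

Recomputing the angle every frame while the grasp persists is what makes the panel appear to pivot around the grasped corner so it always faces the user.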
-
Publication number: 20240264660
Abstract: A computer implemented method for facilitating user interface interactions in an XR environment is provided. The method includes rendering a system UI and tracking a position of the user's hand. The method further includes signifying an interaction opportunity by generating first feedback that modifies a UI element based on the position of the user's hand being within a first threshold distance of the UI element, or by generating second feedback that accentuates an edge of the system UI based on the position of the user's hand being within a second threshold distance of the edge. Furthermore, the method includes updating the position of the user's hand. The method further includes signifying interaction with the UI element by modifying the location of the representation of the user's hand when the user's hand has interacted with the UI element, or signifying interaction with the edge by generating third feedback that accentuates the portion of the representation that grabs the edge.
Type: Application
Filed: February 8, 2023
Publication date: August 8, 2024
Inventors: Samuel Matthew LEVATICH, Matthew Alan INSLEY, Andrew C. JOHNSON, Qi XIONG, Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF, Jennifer MORROW, Brandon FURTWANGLER
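The two-threshold feedback rule in this abstract can be sketched as a per-frame selection: the hand within a first distance of a UI element triggers element feedback, and within a second distance of the system UI's edge triggers edge feedback. The function name, feedback labels, and threshold values below are assumptions for illustration only:

```python
def pick_feedback(hand_to_element, hand_to_edge,
                  element_threshold=0.05, edge_threshold=0.03):
    """Choose which interaction opportunity (if any) to signify this frame,
    given the hand's distance (meters) to the nearest UI element and to the
    system UI's edge."""
    if hand_to_element <= element_threshold:
        return "highlight-element"   # first feedback: modify the UI element
    if hand_to_edge <= edge_threshold:
        return "accentuate-edge"     # second feedback: accentuate the edge
    return None                      # no opportunity to signify
```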
-
Publication number: 20240160337
Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
Type: Application
Filed: January 25, 2024
Publication date: May 16, 2024
Inventors: Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF
-
Patent number: 11928314
Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
Type: Grant
Filed: November 17, 2022
Date of Patent: March 12, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Jeremy Edelblut, Matthaeus Krenn, John Nicholas Jitkoff
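The tab model in this virtual-browser abstract pairs a website with a virtual world, and an interaction with an in-world object can auto-generate a new tab pointing at that object's world. A minimal data-structure sketch, where every class, field, and URL is a hypothetical illustration rather than the patented design:

```python
class WorldTab:
    """One browser tab: a website/virtual-world pair."""
    def __init__(self, url, world_id, auto=False):
        self.url = url            # 2D website half of the pair
        self.world_id = world_id  # 3D world half of the pair
        self.auto = auto          # was this tab created by an interaction?

class VirtualBrowser:
    def __init__(self):
        self.tabs = []

    def open_tab(self, url, world_id, auto=False):
        tab = WorldTab(url, world_id, auto)
        self.tabs.append(tab)
        return tab

    def on_object_interaction(self, obj):
        # Interacting with an object that references another world
        # auto-generates a tab, so travel to that world is one selection.
        return self.open_tab(obj["url"], obj["world"], auto=True)
```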
-
Publication number: 20240054996
Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
Type: Application
Filed: October 23, 2023
Publication date: February 15, 2024
Inventors: Marcos Regis VESCOVI, Eric M. G. CIRCLAEYS, Richard WARREN, Jeffrey Traer BERNSTEIN, Matthaeus KRENN
-
Patent number: 11900923
Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
Type: Grant
Filed: September 7, 2021
Date of Patent: February 13, 2024
Assignee: Apple Inc.
Inventors: Marcos Regis Vescovi, Eric M. G. Circlaeys, Richard Warren, Jeffrey Traer Bernstein, Matthaeus Krenn
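The retrieval flow in this assistant abstract, speech intent resolved to parameters, parameters keying metadata in an experiential data structure, metadata keying the media items that are output together, can be sketched as a lookup pipeline. The data shapes, keys, and file names here are all invented for illustration; the patent does not specify them:

```python
# Hypothetical experiential data structure: parameters -> metadata + media.
EXPERIENCES = {
    ("trip", "2023"): {"metadata": {"place": "Rome"},
                       "media": ["photo1.jpg", "clip1.mov"]},
}

def retrieve_media(parameters):
    """Return the media items for the user experience the parameters
    reference, or an empty list if no experience matches."""
    entry = EXPERIENCES.get(parameters)
    if entry is None:
        return []
    # Metadata-driven retrieval: the stored entry identifies which media
    # items belong to the referenced experience, so they can be output
    # together as one response.
    return list(entry["media"])
```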
-
Publication number: 20230418442
Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
Type: Application
Filed: November 17, 2022
Publication date: December 28, 2023
Inventors: Jeremy EDELBLUT, Matthaeus KRENN, John Nicholas JITKOFF
-
Publication number: 20230419618
Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
Type: Application
Filed: November 17, 2022
Publication date: December 28, 2023
Inventors: Matthaeus KRENN, Jeremy EDELBLUT, John Nicholas JITKOFF
-
Publication number: 20230419617
Abstract: Methods and systems described herein are directed to a virtual personal interface (herein “personal interface”) for controlling an artificial reality (XR) environment, such as by providing user interfaces for interactions with a current XR application, providing detail views for selected items, navigating between multiple virtual worlds without having to transition in and out of a home lobby for those worlds, executing aspects of a second XR application while within a world controlled by a first XR application, and providing 3D content that is separate from the current world. While in at least one of those worlds, the personal interface can itself present content in a runtime separate from the current virtual world, corresponding to an item, action, or application for that world. XR applications can be defined for use with the personal interface to create both a 3D world portion and 2D interface portions that are displayed via the personal interface.
Type: Application
Filed: July 19, 2022
Publication date: December 28, 2023
Inventors: Matthaeus KRENN, Jeremy EDELBLUT, John Nicholas JITKOFF
-
Patent number: 11854539
Abstract: Systems and processes for operating an intelligent automated assistant are provided. In one example process, a speech input is received from a user. In response to determining that the speech input corresponds to a user intent of obtaining information associated with a user experience of the user, one or more parameters referencing a user experience of the user are identified. Metadata associated with the referenced user experience is obtained from an experiential data structure. Based on the metadata, one or more media items associated with the referenced user experience are retrieved. The one or more media items associated with the referenced user experience are output together.
Type: Grant
Filed: August 11, 2020
Date of Patent: December 26, 2023
Assignee: Apple Inc.
Inventors: Marcos Regis Vescovi, Eric M. G. Circlaeys, Richard Warren, Jeffrey Traer Bernstein, Matthaeus Krenn
-
Publication number: 20230393705
Abstract: An electronic device displays a messaging interface that allows a participant in a message conversation to capture, send, and/or play media content. The media content includes images, video, and/or audio. The media content is captured, sent, and/or played based on the electronic device detecting one or more conditions.
Type: Application
Filed: August 23, 2023
Publication date: December 7, 2023
Inventor: Matthaeus KRENN
-
Patent number: 11818455
Abstract: A first device sends a request to a second device to initiate a shared annotation session. In response to receiving acceptance of the request, a first prompt to move the first device toward the second device is displayed. In accordance with a determination that connection criteria for the first device and the second device are met, a representation of a field of view of the camera(s) of the first device is displayed in the shared annotation session with the second device. During the shared annotation session, one or more annotations are displayed via the first display generation component, and one or more second virtual annotations corresponding to annotation input directed to the respective location in the physical environment by the second device are displayed via the first display generation component, provided that the respective location is included in the field of view of the first set of cameras.
Type: Grant
Filed: February 8, 2023
Date of Patent: November 14, 2023
Assignee: APPLE INC.
Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn
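The visibility rule at the end of this abstract, a remote device's annotation anchored at a physical location is shown only while that location is inside the local cameras' field of view, can be sketched with a simple angular test. A 1D bearing check stands in for the full 3D camera-frustum test, and all names are illustrative:

```python
def visible_annotations(annotations, camera_heading, fov_deg=60.0):
    """Return labels of annotations whose world anchor falls inside the
    camera's horizontal field of view.

    annotations: list of (label, bearing_deg) pairs, where bearing_deg is
    the compass bearing of the anchor's physical location."""
    half = fov_deg / 2.0
    return [label for (label, bearing) in annotations
            # Signed angular difference in (-180, 180], robust to
            # wraparound at 0/360 degrees.
            if abs((bearing - camera_heading + 180) % 360 - 180) <= half]
```

An annotation at bearing 350 degrees is still visible to a camera pointing at 0 degrees, which is why the wraparound-safe difference matters.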
-
Publication number: 20230305674
Abstract: A computer system displays, in a first viewing mode, a simulated environment that is oriented relative to a physical environment of the computer system. In response to detecting a first change in attitude, the computer system changes an appearance of a first virtual user interface object so as to maintain a fixed spatial relationship between the first virtual user interface object and the physical environment. The computer system detects a gesture. In response to detecting a second change in attitude, in accordance with a determination that the gesture met mode change criteria, the computer system transitions from displaying the simulated environment in the first viewing mode to displaying the simulated environment in a second viewing mode. Displaying the virtual model in the simulated environment in the second viewing mode includes forgoing changing the appearance of the first virtual user interface object to maintain the fixed spatial relationship.
Type: Application
Filed: April 27, 2023
Publication date: September 28, 2023
Inventors: Mark K. Hauenstein, Joseph A. Malia, Julian K. Missig, Matthaeus Krenn, Jeffrey T. Bernstein
-
Patent number: 11755180
Abstract: Methods and systems described herein are directed to a virtual web browser for providing access to multiple virtual worlds interchangeably. Browser tabs for corresponding website and virtual world pairs can be displayed along with associated controls, the selection of such controls effecting the instantiation of 3D content for the virtual worlds. One or more of the tabs can be automatically generated as a result of interactions with objects in the virtual worlds, such that travel to a world, corresponding to an object to which an interaction was directed, is facilitated.
Type: Grant
Filed: August 18, 2022
Date of Patent: September 12, 2023
Inventors: Jeremy Edelblut, Matthaeus Krenn, John Nicholas Jitkoff
-
Patent number: 11740755
Abstract: A computer system, while displaying an augmented reality environment, concurrently displays: a representation of at least a portion of a field of view of one or more cameras that includes a physical object, and a virtual user interface object at a location in the representation of the field of view, where the location is determined based on the respective physical object in the field of view. While displaying the augmented reality environment, in response to detecting an input that changes a virtual environment setting for the augmented reality environment, the computer system adjusts an appearance of the virtual user interface object in accordance with the change made to the virtual environment setting and applies to at least a portion of the representation of the field of view a filter selected based on the change made to the virtual environment setting.
Type: Grant
Filed: September 28, 2021
Date of Patent: August 29, 2023
Assignee: APPLE INC.
Inventors: Mark K. Hauenstein, Joseph A. Malia, Julian K. Missig, Matthaeus Krenn, Jeffrey T. Bernstein
-
Publication number: 20230252659
Abstract: The present disclosure generally relates to displaying and editing an image with depth information. In response to an input, an object in the image having one or more elements in a first depth range is identified. The identified object is then isolated from other elements in the image and displayed separately from the other elements. The isolated object may then be utilized in different applications.
Type: Application
Filed: April 20, 2023
Publication date: August 10, 2023
Inventors: Matan STAUBER, Amir HOFFNUNG, Matthaeus KRENN, Jeffrey Traer BERNSTEIN, Joseph A. MALIA, Mark HAUENSTEIN
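The depth-range isolation this abstract describes can be sketched as a per-pixel mask: pixels whose depth falls inside the target range are kept as the isolated object, and everything else is masked out. Plain Python lists stand in for real image and depth buffers; names are illustrative:

```python
def isolate_by_depth(pixels, depths, near, far):
    """Keep pixels whose depth lies in [near, far]; mask the rest.

    pixels and depths are parallel sequences (one depth per pixel);
    masked pixels become None, leaving only the isolated object."""
    return [p if near <= d <= far else None
            for p, d in zip(pixels, depths)]
```

The non-None pixels form the isolated object, which can then be composited or displayed separately from the rest of the image.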
-
Publication number: 20230199296
Abstract: A first device sends a request to a second device to initiate a shared annotation session. In response to receiving acceptance of the request, a first prompt to move the first device toward the second device is displayed. In accordance with a determination that connection criteria for the first device and the second device are met, a representation of a field of view of the camera(s) of the first device is displayed in the shared annotation session with the second device. During the shared annotation session, one or more annotations are displayed via the first display generation component, and one or more second virtual annotations corresponding to annotation input directed to the respective location in the physical environment by the second device are displayed via the first display generation component, provided that the respective location is included in the field of view of the first set of cameras.
Type: Application
Filed: February 8, 2023
Publication date: June 22, 2023
Inventors: Joseph A. Malia, Mark K. Hauenstein, Praveen Sharma, Matan Stauber, Julian K. Missig, Jeffrey T. Bernstein, Lukas Robert Tom Girling, Matthaeus Krenn