Patents by Inventor Aaron M. Burns
Aaron M. Burns has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250251791
Abstract: A method includes determining a first gaze position within a content manipulation region based on eye tracking data from an eye tracker. The method includes determining a selection point associated with a physical surface, based on spatial selector data from a spatial selector tracker. The method includes displaying a computer-generated representation of a trackpad on the physical surface, based on the first gaze position and the selection point. In some implementations, a method includes displaying a user interface including a plurality of content manipulation regions, a plurality of affordances respectively associated with the plurality of content manipulation regions, and a computer-generated representation of a trackpad. The method includes detecting, via a spatial selector tracker, an input directed to a first affordance associated with a first content manipulation region.
Type: Application
Filed: April 25, 2025
Publication date: August 7, 2025
Inventors: Anette L. Freiin von Kapri, Aaron M. Burns, Benjamin R. Blachnitsky
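The placement step this abstract describes can be sketched in code: the gaze position selects which content manipulation region the trackpad will control, and the selection point on the physical surface anchors the trackpad itself. This is a minimal illustrative sketch under assumed data structures; the names, region format, and default trackpad size are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: gaze picks the target content region; the surface
# selection point becomes the center of the displayed trackpad.

def place_trackpad(gaze_point, content_regions, selection_point, trackpad_size=(0.2, 0.15)):
    """Return (target_region_name, trackpad_rect) or None if gaze misses all regions."""
    def contains(region, p):
        x, y, w, h = region["bounds"]
        return x <= p[0] <= x + w and y <= p[1] <= y + h

    # The first gaze position determines which content manipulation region
    # the computer-generated trackpad will be associated with.
    target = next((r for r in content_regions if contains(r, gaze_point)), None)
    if target is None:
        return None

    # The selection point on the physical surface anchors the trackpad:
    # here it is used as the trackpad's center.
    w, h = trackpad_size
    cx, cy = selection_point
    trackpad_rect = (cx - w / 2, cy - h / 2, w, h)
    return target["name"], trackpad_rect

regions = [{"name": "browser", "bounds": (0.0, 0.0, 1.0, 1.0)}]
print(place_trackpad((0.4, 0.6), regions, (0.0, 0.0)))
# → ('browser', (-0.1, -0.075, 0.2, 0.15))
```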
-
Patent number: 12373081
Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
Type: Grant
Filed: June 10, 2024
Date of Patent: July 29, 2025
Assignee: Apple Inc.
Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A Cazamias, Alexis H. Palangie, James J. Owen
-
Patent number: 12333662
Abstract: An electronic device such as a head-mounted device may present extended reality content such as a representation of a three-dimensional environment. The representation of the three-dimensional environment may be changed between different viewing modes having different immersion levels in response to user input. The three-dimensional environment may represent a multiuser communication session. A multiuser communication session may be saved and subsequently viewed as a replay. There may be an interactive virtual object within the replay of the multiuser communication session. The pose of the interactive virtual object may be manipulated by a user while the replay is paused. Some multiuser communication sessions may be hierarchical multiuser communication sessions with a presenter and audience members. The presenter and audience members may receive generalized feedback based on the audience members during the presentation.
Type: Grant
Filed: June 30, 2023
Date of Patent: June 17, 2025
Assignee: Apple Inc.
Inventors: Aaron M Burns, Adam G Poulos, Alexis H Palangie, Benjamin R Blachnitzky, Charilaos Papadopoulos, David M Schattel, Ezgi Demirayak, Jia Wang, Reza Abbasian, Ryan S Carlin
-
Patent number: 12321563
Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
Type: Grant
Filed: November 20, 2023
Date of Patent: June 3, 2025
Assignee: Apple Inc.
Inventors: Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
-
Publication number: 20250165070
Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
Type: Application
Filed: January 17, 2025
Publication date: May 22, 2025
Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
-
Patent number: 12299267
Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
Type: Grant
Filed: May 17, 2024
Date of Patent: May 13, 2025
Assignee: Apple Inc.
Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
-
Publication number: 20250118034
Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
Type: Application
Filed: December 17, 2024
Publication date: April 10, 2025
Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol
-
Patent number: 12254127
Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
Type: Grant
Filed: December 27, 2023
Date of Patent: March 18, 2025
Assignee: Apple Inc.
Inventors: Aaron M. Burns, Nathan Gitter, Alexis H. Palangie, Pol Pla I. Conesa, David M. Schattel
-
Patent number: 12242668
Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
Type: Grant
Filed: March 20, 2023
Date of Patent: March 4, 2025
Assignee: APPLE INC.
Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
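The fusion step this abstract describes (combining extremity tracking data with finger-wearable sensor data to determine a multi-finger gesture, then registering an engagement event) can be sketched as follows. The data shapes, gesture names, and the 10 mm threshold are assumptions for illustration; the patent does not publish an implementation.

```python
# Illustrative sketch of fusing camera-based extremity tracking with
# finger-wearable device data to recognize a multi-finger gesture.

def determine_multi_finger_gesture(fingertip_distance_mm, wearable_contact):
    """Classify a gesture from tracked fingertip distance plus wearable contact data.

    fingertip_distance_mm: tracked distance between thumb and index fingertips.
    wearable_contact: True if the finger-wearable device reports contact.
    """
    if wearable_contact and fingertip_distance_mm < 10.0:
        return "pinch"            # both modalities agree
    if fingertip_distance_mm < 10.0:
        return "candidate-pinch"  # tracking alone is ambiguous without the wearable
    return "none"

def register_engagement(gesture, target):
    # An engagement event is registered against the computer-generated object
    # only when the fused gesture is confident.
    return {"event": "engagement", "target": target} if gesture == "pinch" else None

print(register_engagement(determine_multi_finger_gesture(6.0, True), "virtual-button"))
# → {'event': 'engagement', 'target': 'virtual-button'}
```

The design point the abstract turns on is that neither input stream alone suffices: the wearable disambiguates contact that vision-based tracking cannot resolve.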
-
Patent number: 12211152
Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
Type: Grant
Filed: April 22, 2022
Date of Patent: January 28, 2025
Assignee: Apple Inc.
Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol
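The attribute-detection step in this abstract can be sketched in code: when a virtual element is placed or moved, the device classifies the physical environment at the element's location (e.g., on a labeled surface versus mid-air) and adjusts presentation accordingly. The surface representation, the 2 cm tolerance, and the anchor labels are hypothetical, invented for this sketch.

```python
# Illustrative sketch, not the patented implementation: classify the physical
# environment at a virtual element's location, then present it accordingly.

def detect_attribute(element_pos, surfaces):
    """Classify placement: a labeled surface if within tolerance, else mid-air.

    element_pos: (x, y, z) position of the virtual element.
    surfaces: horizontal planes, each with a label, origin, width, and depth.
    """
    for s in surfaces:
        x, y, z = s["origin"]
        on_plane = abs(element_pos[1] - y) < 0.02  # within 2 cm of the plane
        inside = x <= element_pos[0] <= x + s["width"] and z <= element_pos[2] <= z + s["depth"]
        if on_plane and inside:
            return s["label"]  # e.g. "table"
    return "mid-air"

def present(element, attribute):
    # Presentation depends on the detected attribute: anchored flat on a
    # surface, or free-floating when the element is mid-air.
    return {**element, "anchor": "floating" if attribute == "mid-air" else "surface"}

surfaces = [{"label": "table", "origin": (0.0, 0.7, 0.0), "width": 1.2, "depth": 0.8}]
print(detect_attribute((0.5, 0.71, 0.3), surfaces))  # → table
```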
-
Patent number: 12189853
Abstract: A method includes displaying a plurality of computer-generated objects, and obtaining finger manipulation data from a finger-wearable device via a communication interface. In some implementations, the method includes receiving an untethered input vector that includes a plurality of untethered input indicator values. Each of the plurality of untethered input indicator values is associated with one of a plurality of untethered input modalities. In some implementations, the method includes obtaining proxy object manipulation data from a physical proxy object via the communication interface. The proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object. The method includes registering an engagement event with respect to a first one of the plurality of computer-generated objects based on a combination of the finger manipulation data, the untethered input vector, and the proxy object manipulation data.
Type: Grant
Filed: March 11, 2024
Date of Patent: January 7, 2025
Assignee: APPLE INC.
Inventors: Adam G Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
-
Publication number: 20250004581
Abstract: In one implementation, a method for dynamically selecting an operation modality for a physical object is performed at a computing system. The method includes: obtaining a user input vector that includes at least one user input indicator value associated with one of a plurality of different input modalities; obtaining tracking data associated with a physical object; generating a first characterization vector for the physical object, including a pose value and a user grip value, based on the user input vector and the tracking data, wherein the pose value characterizes a spatial relationship between the physical object and a user of the computing system and the user grip value characterizes a manner in which the physical object is being held by the user; and selecting, based on the first characterization vector, a first operation modality as a current operation modality for the physical object.
Type: Application
Filed: July 1, 2022
Publication date: January 2, 2025
Inventors: Aaron M. Burns, Anette L. Freiin von Kapri, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Christopher L. Nolet, David M. Schattel, Samantha Koire
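The final selection step in this abstract (mapping a characterization vector of pose value and user grip value to an operation modality) can be sketched as a simple rule table. The pose/grip categories and modality names below are entirely invented for illustration; the patent specifies only the abstract structure, not these values.

```python
# Hypothetical rule table: a (pose, grip) characterization vector selects a
# current operation modality for the held physical object.

def select_operation_modality(pose: str, grip: str) -> str:
    """Select an operation modality from a (pose, grip) characterization vector."""
    table = {
        ("tip-toward-surface", "writing"): "stylus-input",
        ("tip-toward-user", "wand"): "pointer",
        ("horizontal", "inverted"): "eraser",
    }
    return table.get((pose, grip), "passive")  # default when no rule matches

print(select_operation_modality("tip-toward-surface", "writing"))  # → stylus-input
```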
-
Patent number: 12158988
Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, and an extremity tracker. The method includes obtaining extremity tracking data via the extremity tracker. The method includes displaying a computer-generated representation of a trackpad that is spatially associated with a physical surface. The physical surface is viewable within the display along with a content manipulation region that is separate from the computer-generated representation of the trackpad. The method includes identifying a first location within the computer-generated representation of the trackpad based on the extremity tracking data. The method includes mapping the first location to a corresponding location within the content manipulation region. The method includes displaying an indicator indicative of the mapping. The indicator may overlap the corresponding location within the content manipulation region.
Type: Grant
Filed: February 27, 2023
Date of Patent: December 3, 2024
Assignee: APPLE INC.
Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
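The mapping step this abstract describes admits a compact sketch: a tracked location inside the trackpad region is normalized and rescaled into the separate content manipulation region, where the indicator would be drawn. This is a minimal sketch under assumed rectangle types; the `Rect` structure and function names are illustrative, not from the patent.

```python
# Hypothetical sketch of the abstract's mapping step: a point in the
# computer-generated trackpad region maps to the content manipulation region
# via normalized coordinates.

from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # region origin
    y: float
    width: float
    height: float

def map_trackpad_to_content(point, trackpad: Rect, content: Rect):
    """Map an extremity-tracked trackpad location to the content region."""
    # Normalize the tracked point to [0, 1] within the trackpad bounds...
    u = (point[0] - trackpad.x) / trackpad.width
    v = (point[1] - trackpad.y) / trackpad.height
    # ...then scale into the content manipulation region, where an indicator
    # is displayed at the mapped (corresponding) location.
    return (content.x + u * content.width, content.y + v * content.height)

# A touch at the trackpad's center lands at the content region's center.
print(map_trackpad_to_content((0.5, 0.5), Rect(0, 0, 1, 1), Rect(100, 50, 400, 300)))
# → (300.0, 200.0)
```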
-
Patent number: 12118182
Abstract: In accordance with some embodiments, an exemplary process for controlling the generation and display of suggested additional content based on a context of a workspace is described.
Type: Grant
Filed: January 11, 2023
Date of Patent: October 15, 2024
Assignee: Apple Inc.
Inventors: Aaron M. Burns, Scott M. Andrus, David M. Schattel
-
Patent number: 12039142
Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
Type: Grant
Filed: December 25, 2022
Date of Patent: July 16, 2024
Assignee: Apple Inc.
Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
-
Publication number: 20240231486
Abstract: A method includes determining a first gaze position within a content manipulation region based on eye tracking data from an eye tracker. The method includes determining a selection point associated with a physical surface, based on spatial selector data from a spatial selector tracker. The method includes displaying a computer-generated representation of a trackpad on the physical surface, based on the first gaze position and the selection point. In some implementations, a method includes displaying a user interface including a plurality of content manipulation regions, a plurality of affordances respectively associated with the plurality of content manipulation regions, and a computer-generated representation of a trackpad. The method includes detecting, via a spatial selector tracker, an input directed to a first affordance associated with a first content manipulation region.
Type: Application
Filed: March 22, 2024
Publication date: July 11, 2024
Inventors: Anette L. Freiin von Kapri, Aaron M. Burns, Benjamin R. Blachnitsky
-
Publication number: 20240221301
Abstract: Various implementations disclosed herein provide augmentations in extended reality (XR) using sensor data from a user-worn device. The sensor data may be used to determine that a user's state is associated with providing user assistance; e.g., a user's appearance or behavior, or an understanding of the environment, may be used to recognize a need or desire for user assistance. The augmentations may assist the user by enhancing or supplementing the user's abilities, e.g., providing guidance or other information about an environment to a disabled or impaired person.
Type: Application
Filed: December 28, 2023
Publication date: July 4, 2024
Inventors: Aaron M. Burns, Benjamin R. Blachnitzky, Laura Sugden, Charilaos Papadopoulos, James T. Turner
-
Publication number: 20240211044
Abstract: A method includes displaying a plurality of computer-generated objects, and obtaining finger manipulation data from a finger-wearable device via a communication interface. In some implementations, the method includes receiving an untethered input vector that includes a plurality of untethered input indicator values. Each of the plurality of untethered input indicator values is associated with one of a plurality of untethered input modalities. In some implementations, the method includes obtaining proxy object manipulation data from a physical proxy object via the communication interface. The proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object. The method includes registering an engagement event with respect to a first one of the plurality of computer-generated objects based on a combination of the finger manipulation data, the untethered input vector, and the proxy object manipulation data.
Type: Application
Filed: March 11, 2024
Publication date: June 27, 2024
Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
-
Publication number: 20240203276
Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
Type: Application
Filed: April 6, 2022
Publication date: June 20, 2024
Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
-
Publication number: 20240193858
Abstract: In one implementation, a method of assisting in the rehearsal of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a difficulty level for a rehearsal of a presentation. The method includes displaying, on the display, one or more slides of the presentation. The method includes displaying, on the display in association with a volumetric environment, one or more virtual objects based on the difficulty level.
Type: Application
Filed: April 11, 2022
Publication date: June 13, 2024
Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette