Patents by Inventor Aaron M. Burns

Aaron M. Burns has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250251791
    Abstract: A method includes determining a first gaze position within a content manipulation region based on eye tracking data from an eye tracker. The method includes determining a selection point associated with a physical surface, based on spatial selector data from a spatial selector tracker. The method includes displaying a computer-generated representation of a trackpad on the physical surface, based on the first gaze position and the selection point. In some implementations, a method includes displaying a user interface including a plurality of content manipulation regions, a plurality of affordances respectively associated with the plurality of content manipulation regions, and a computer-generated representation of a trackpad. The method includes detecting, via a spatial selector tracker, an input directed to a first affordance associated with a first content manipulation region.
    Type: Application
    Filed: April 25, 2025
    Publication date: August 7, 2025
    Inventors: Anette L. Freiin von Kapri, Aaron M. Burns, Benjamin R. Blachnitsky
  • Patent number: 12373081
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: June 10, 2024
    Date of Patent: July 29, 2025
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Patent number: 12333662
    Abstract: An electronic device such as a head-mounted device may present extended reality content such as a representation of a three-dimensional environment. The representation of the three-dimensional environment may be changed between different viewing modes having different immersion levels in response to user input. The three-dimensional environment may represent a multiuser communication session. A multiuser communication session may be saved and subsequently viewed as a replay. There may be an interactive virtual object within the replay of the multiuser communication session. The pose of the interactive virtual object may be manipulated by a user while the replay is paused. Some multiuser communication sessions may be hierarchical multiuser communication sessions with a presenter and audience members. The presenter may receive generalized feedback based on the audience members during the presentation.
    Type: Grant
    Filed: June 30, 2023
    Date of Patent: June 17, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Adam G. Poulos, Alexis H. Palangie, Benjamin R. Blachnitzky, Charilaos Papadopoulos, David M. Schattel, Ezgi Demirayak, Jia Wang, Reza Abbasian, Ryan S. Carlin
  • Patent number: 12321563
    Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
    Type: Grant
    Filed: November 20, 2023
    Date of Patent: June 3, 2025
    Assignee: Apple Inc.
    Inventors: Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
  • Publication number: 20250165070
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
    Type: Application
    Filed: January 17, 2025
    Publication date: May 22, 2025
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
  • Patent number: 12299267
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Grant
    Filed: May 17, 2024
    Date of Patent: May 13, 2025
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
  • Publication number: 20250118034
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
    Type: Application
    Filed: December 17, 2024
    Publication date: April 10, 2025
    Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol
  • Patent number: 12254127
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on user contexts that each correspond to a different location in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: December 27, 2023
    Date of Patent: March 18, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Nathan Gitter, Alexis H. Palangie, Pol Pla I. Conesa, David M. Schattel
  • Patent number: 12242668
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
    Type: Grant
    Filed: March 20, 2023
    Date of Patent: March 4, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
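The multi-finger gesture pipeline this abstract describes can be sketched roughly as follows. This is a hypothetical illustration only: the class names, gesture labels, and thresholds are invented for this sketch and do not reflect Apple's claimed implementation.

```python
# Illustrative sketch: combine extremity tracking data with finger-wearable
# sensor data to classify a multi-finger gesture, then register an engagement
# event against a computer-generated object. All names are assumptions.
from dataclasses import dataclass

@dataclass
class FingerSample:
    finger_id: int        # 0 = thumb, 1 = index, ...
    is_contacting: bool   # contact reported by the finger-wearable device
    position: tuple       # (x, y, z) from the extremity tracking system

def determine_multi_finger_gesture(samples):
    """Return a gesture label based on how many tracked fingers are in contact."""
    contacting = [s for s in samples if s.is_contacting]
    if len(contacting) >= 2:
        return "pinch" if len(contacting) == 2 else "grab"
    return None

def register_engagement(gesture, obj):
    """Register an engagement event with respect to a computer-generated object."""
    if gesture is not None:
        return {"object": obj, "event": gesture}
    return None

samples = [
    FingerSample(0, True, (0.10, 0.20, 0.30)),
    FingerSample(1, True, (0.11, 0.21, 0.30)),
    FingerSample(2, False, (0.30, 0.25, 0.28)),
]
event = register_engagement(determine_multi_finger_gesture(samples), "cube")
# two contacting fingers -> a "pinch" engagement event on "cube"
```

The key idea from the abstract is that the gesture decision fuses two independent data streams (extremity tracking plus finger-wearable manipulation data) before an engagement event is registered.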
  • Patent number: 12211152
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable a device to provide a view of virtual elements and a physical environment where the presentation of the virtual elements is based on positioning relative to the physical environment. In one example, a device is configured to detect a change in positioning of a virtual element, for example, when a virtual element is added, moved, or the physical environment around the virtual element is changed. The location of the virtual element in the physical environment is used to detect an attribute of the physical environment upon which the presentation of the virtual element depends. Thus, the device is further configured to detect an attribute (e.g., surface, table, mid-air, etc.) of the physical environment based on the placement of the virtual element and present the virtual element based on the detected attribute.
    Type: Grant
    Filed: April 22, 2022
    Date of Patent: January 28, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Bruno M. Sommer, Timothy R. Oriol
  • Patent number: 12189853
    Abstract: A method includes displaying a plurality of computer-generated objects, and obtaining finger manipulation data from a finger-wearable device via a communication interface. In some implementations, the method includes receiving an untethered input vector that includes a plurality of untethered input indicator values. Each of the plurality of untethered input indicator values is associated with one of a plurality of untethered input modalities. In some implementations, the method includes obtaining proxy object manipulation data from a physical proxy object via the communication interface. The proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object. The method includes registering an engagement event with respect to a first one of the plurality of computer-generated objects based on a combination of the finger manipulation data, the untethered input vector, and the proxy object manipulation data.
    Type: Grant
    Filed: March 11, 2024
    Date of Patent: January 7, 2025
    Assignee: Apple Inc.
    Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
  • Publication number: 20250004581
    Abstract: In one implementation, a method dynamically selects an operation modality for a physical object. The method includes: obtaining a user input vector that includes at least one user input indicator value associated with one of a plurality of different input modalities; obtaining tracking data associated with a physical object; generating a first characterization vector for the physical object, including a pose value and a user grip value, based on the user input vector and the tracking data, wherein the pose value characterizes a spatial relationship between the physical object and a user of the computing system and the user grip value characterizes a manner in which the physical object is being held by the user; and selecting, based on the first characterization vector, a first operation modality as a current operation modality for the physical object.
    Type: Application
    Filed: July 1, 2022
    Publication date: January 2, 2025
    Inventors: Aaron M. Burns, Anette L. Freiin von Kapri, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Christopher L. Nolet, David M. Schattel, Samantha Koire
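The characterization-vector step in this abstract can be sketched as a small lookup: build a (pose, grip) vector from tracking and input data, then map it to a modality. The pose labels, grip labels, pressure threshold, and modality table below are all invented for illustration, not taken from the patent.

```python
# Illustrative sketch: select an operation modality for a physical object
# from a characterization vector of (pose value, user grip value).
def characterize(user_input, tracking):
    """Build a (pose, grip) characterization vector for a physical object."""
    pose = "pointed_away" if not tracking["tip_toward_user"] else "pointed_toward"
    grip = "writing" if user_input["pressure"] > 0.5 else "wand"
    return (pose, grip)

def select_modality(char_vector):
    """Map a characterization vector to a current operation modality."""
    table = {
        ("pointed_away", "writing"): "drawing",
        ("pointed_away", "wand"):    "pointing",
        ("pointed_toward", "wand"):  "erasing",
    }
    return table.get(char_vector, "idle")

# A firm grip with the tip pointed away from the user selects "drawing".
vec = characterize({"pressure": 0.8}, {"tip_toward_user": False})
mode = select_modality(vec)  # -> "drawing"
```

The point the abstract makes is that the same physical object can drive different modalities depending on how it is held and oriented relative to the user.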
  • Patent number: 12158988
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, and an extremity tracker. The method includes obtaining extremity tracking data via the extremity tracker. The method includes displaying a computer-generated representation of a trackpad that is spatially associated with a physical surface. The physical surface is viewable within the display along with a content manipulation region that is separate from the computer-generated representation of the trackpad. The method includes identifying a first location within the computer-generated representation of the trackpad based on the extremity tracking data. The method includes mapping the first location to a corresponding location within the content manipulation region. The method includes displaying an indicator indicative of the mapping. The indicator may overlap the corresponding location within the content manipulation region.
    Type: Grant
    Filed: February 27, 2023
    Date of Patent: December 3, 2024
    Assignee: Apple Inc.
    Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
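The mapping step in this abstract, a point inside a computer-generated trackpad mapped to a corresponding point in a separate content manipulation region, amounts to a proportional coordinate transform. The rectangle representation below is a stand-in assumption for whatever surface geometry the device actually tracks.

```python
# Illustrative sketch: map a location inside a surface-aligned virtual
# trackpad to the corresponding location in a content manipulation region,
# where an indicator would then be displayed.
def map_trackpad_to_region(point, trackpad, region):
    """Map (x, y) inside `trackpad` to the corresponding (x, y) in `region`.

    Each rectangle is (origin_x, origin_y, width, height).
    """
    tx, ty, tw, th = trackpad
    rx, ry, rw, rh = region
    # Normalize the point within the trackpad, then scale into the region.
    u = (point[0] - tx) / tw
    v = (point[1] - ty) / th
    return (rx + u * rw, ry + v * rh)

trackpad = (0.0, 0.0, 0.2, 0.1)   # small trackpad on a physical surface
region = (1.0, 1.0, 2.0, 1.0)     # larger, separate content region
indicator = map_trackpad_to_region((0.1, 0.05), trackpad, region)
# the trackpad's centre maps to the region's centre: (2.0, 1.5)
```

Because the two rectangles are decoupled, the trackpad can sit on any convenient physical surface while the indicator tracks the corresponding spot in the content region.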
  • Patent number: 12118182
    Abstract: In accordance with some embodiments, an exemplary process for controlling the generation and display of suggested additional content based on a context of a workspace is described.
    Type: Grant
    Filed: January 11, 2023
    Date of Patent: October 15, 2024
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Scott M. Andrus, David M. Schattel
  • Patent number: 12039142
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: December 25, 2022
    Date of Patent: July 16, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
  • Publication number: 20240231486
    Abstract: A method includes determining a first gaze position within a content manipulation region based on eye tracking data from an eye tracker. The method includes determining a selection point associated with a physical surface, based on spatial selector data from a spatial selector tracker. The method includes displaying a computer-generated representation of a trackpad on the physical surface, based on the first gaze position and the selection point. In some implementations, a method includes displaying a user interface including a plurality of content manipulation regions, a plurality of affordances respectively associated with the plurality of content manipulation regions, and a computer-generated representation of a trackpad. The method includes detecting, via a spatial selector tracker, an input directed to a first affordance associated with a first content manipulation region.
    Type: Application
    Filed: March 22, 2024
    Publication date: July 11, 2024
    Inventors: Anette L. Freiin von Kapri, Aaron M. Burns, Benjamin R. Blachnitsky
  • Publication number: 20240221301
    Abstract: Various implementations disclosed herein provide augmentations in extended reality (XR) using sensor data from a user-worn device. The sensor data may be used to understand when a user's state is associated with providing user assistance, e.g., a user's appearance or behavior, or an understanding of the environment, may be used to recognize a need or desire for user assistance. The augmentations may assist the user by enhancing or supplementing the user's abilities, e.g., providing guidance or other information about an environment to a disabled or impaired person.
    Type: Application
    Filed: December 28, 2023
    Publication date: July 4, 2024
    Inventors: Aaron M. Burns, Benjamin R. Blachnitzky, Laura Sugden, Charilaos Papadopoulos, James T. Turner
  • Publication number: 20240211044
    Abstract: A method includes displaying a plurality of computer-generated objects, and obtaining finger manipulation data from a finger-wearable device via a communication interface. In some implementations, the method includes receiving an untethered input vector that includes a plurality of untethered input indicator values. Each of the plurality of untethered input indicator values is associated with one of a plurality of untethered input modalities. In some implementations, the method includes obtaining proxy object manipulation data from a physical proxy object via the communication interface. The proxy object manipulation data corresponds to sensor data associated with one or more sensors integrated in the physical proxy object. The method includes registering an engagement event with respect to a first one of the plurality of computer-generated objects based on a combination of the finger manipulation data, the untethered input vector, and the proxy object manipulation data.
    Type: Application
    Filed: March 11, 2024
    Publication date: June 27, 2024
    Inventors: Adam G. Poulos, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
  • Publication number: 20240203276
    Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
    Type: Application
    Filed: April 6, 2022
    Publication date: June 20, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
  • Publication number: 20240193858
    Abstract: In one implementation, a method of assisting in the rehearsal of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a difficulty level for a rehearsal of a presentation. The method includes displaying, on the display, one or more slides of the presentation. The method includes displaying, on the display in association with a volumetric environment, one or more virtual objects based on the difficulty level.
    Type: Application
    Filed: April 11, 2022
    Publication date: June 13, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette