Patents by Inventor Alexis H. PALANGIE

Alexis H. PALANGIE has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11995301
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Grant
    Filed: March 10, 2023
    Date of Patent: May 28, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
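The abstract above describes choosing a user interface's immersion level from its location or distance from the user. As a minimal illustrative sketch only (the level names and distance thresholds below are assumptions, not taken from the patent):

```python
def immersion_level(distance_m: float) -> str:
    """Map a UI element's distance from the user to an immersion level.

    Hypothetical thresholds: nearby interfaces become more immersive,
    distant ones stay unobtrusive.
    """
    if distance_m < 0.5:
        return "full"
    elif distance_m < 2.0:
        return "partial"
    else:
        return "minimal"
```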
  • Publication number: 20240126362
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: December 27, 2023
    Publication date: April 18, 2024
    Inventors: Aaron M. BURNS, Nathan GITTER, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20240112649
    Abstract: Exemplary processes are described, including processes to move and/or resize user interface elements in a computer-generated reality environment.
    Type: Application
    Filed: December 14, 2023
    Publication date: April 4, 2024
    Inventors: Aaron Mackay BURNS, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20240103636
    Abstract: In some embodiments, a computer system performs virtual object manipulation operations using respective portions of the user's body and/or input device(s). In some embodiments, a computer system manipulates a virtual object based on input from a hand of a user and/or a handheld device. In some embodiments, a computer system manipulates a virtual object directly or indirectly.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: William D. LINDMEIER, Tony KOBAYASHI, Alexis H. PALANGIE, Carmine ELVEZIO, Matthew J. SUNDSTROM
  • Publication number: 20240086032
    Abstract: Methods for displaying and manipulating user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, a user interface that is a member of a container can be manipulated. In some embodiments, manipulating a user interface that is a member of a container can cause the other user interfaces in the same container to be manipulated. In some embodiments, manipulating user interfaces in a container can cause the user interfaces to change orientation and/or rotate about one or more axes.
    Type: Application
    Filed: November 20, 2023
    Publication date: March 14, 2024
    Inventors: Alexis H. PALANGIE, Aaron M. BURNS
  • Publication number: 20240086031
    Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
    Type: Application
    Filed: November 20, 2023
    Publication date: March 14, 2024
    Inventors: Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
  • Publication number: 20240056492
    Abstract: An electronic device such as a head-mounted device may present extended reality content such as a representation of a three-dimensional environment. The representation of the three-dimensional environment may be changed between different viewing modes having different immersion levels in response to user input. The three-dimensional environment may represent a multiuser communication session. A multiuser communication session may be saved and subsequently viewed as a replay. There may be an interactive virtual object within the replay of the multiuser communication session. The pose of the interactive virtual object may be manipulated by a user while the replay is paused. Some multiuser communication sessions may be hierarchical multiuser communication sessions with a presenter and audience members. The presenter and audience members may receive generalized feedback based on the audience members during the presentation.
    Type: Application
    Filed: June 30, 2023
    Publication date: February 15, 2024
    Inventors: Aaron M. Burns, Adam G. Poulos, Alexis H. Palangie, Benjamin R. Blachnitzky, Charilaos Papadopoulos, David M. Schattel, Ezgi Demirayak, Jia Wang, Reza Abbasian, Ryan S. Carlin
  • Publication number: 20240054746
    Abstract: An electronic device such as a head-mounted device may present extended reality content such as a representation of a three-dimensional environment. The representation of the three-dimensional environment may be changed between different viewing modes having different immersion levels in response to user input. The three-dimensional environment may represent a multiuser communication session. A multiuser communication session may be saved and subsequently viewed as a replay. There may be an interactive virtual object within the replay of the multiuser communication session. The pose of the interactive virtual object may be manipulated by a user while the replay is paused. Some multiuser communication sessions may be hierarchical multiuser communication sessions with a presenter and audience members. The presenter and audience members may receive generalized feedback based on the audience members during the presentation.
    Type: Application
    Filed: June 30, 2023
    Publication date: February 15, 2024
    Inventors: Aaron M. Burns, Adam G. Poulos, Alexis H. Palangie, Benjamin R. Blachnitzky, Charilaos Papadopoulos, David M. Schattel, Ezgi Demirayak, Jia Wang, Reza Abbasian, Ryan S. Carlin
  • Publication number: 20240054736
    Abstract: An electronic device such as a head-mounted device may present extended reality content such as a representation of a three-dimensional environment. The representation of the three-dimensional environment may be changed between different viewing modes having different immersion levels in response to user input. The three-dimensional environment may represent a multiuser communication session. A multiuser communication session may be saved and subsequently viewed as a replay. There may be an interactive virtual object within the replay of the multiuser communication session. The pose of the interactive virtual object may be manipulated by a user while the replay is paused. Some multiuser communication sessions may be hierarchical multiuser communication sessions with a presenter and audience members. The presenter and audience members may receive generalized feedback based on the audience members during the presentation.
    Type: Application
    Filed: June 30, 2023
    Publication date: February 15, 2024
    Inventors: Aaron M. Burns, Adam G. Poulos, Alexis H. Palangie, Benjamin R. Blachnitzky, Charilaos Papadopoulos, David M. Schattel, Ezgi Demirayak, Jia Wang, Reza Abbasian, Ryan S. Carlin
  • Publication number: 20240045579
    Abstract: A representation displayed in a three-dimensional environment may be selected with different types of selection inputs. When a representation displayed in the three-dimensional environment is selected, an application corresponding to the representation may be launched in the three-dimensional environment in accordance with the type of selection received. In such embodiments, when the representation is selected with a first type of selection, the application corresponding to the selected representation may be launched in a first manner, and when the representation is selected with a second type of selection, the application corresponding to the selected representation may be launched in a second manner. In some embodiments, the manner in which the application is launched may determine whether one or more previously launched applications continue to be displayed or cease to be displayed in the three-dimensional environment.
    Type: Application
    Filed: December 21, 2021
    Publication date: February 8, 2024
    Inventors: Nathan GITTER, Aaron M. BURNS, Benjamin HYLAK, Jonathan R. DASCOLA, Alexis H. PALANGIE
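The abstract above describes launching an application in a first or second manner depending on the type of selection input, which in turn determines whether previously launched applications continue to be displayed. A minimal sketch of that branching, with illustrative names not taken from the patent:

```python
def launch_app(name: str, selection_type: str, open_apps: list[str]) -> list[str]:
    """Return the set of displayed apps after launching `name`.

    Hypothetical mapping: a "primary" selection launches the app in a
    manner that ceases display of prior apps; a "secondary" selection
    launches it alongside them.
    """
    if selection_type == "primary":
        return [name]                  # prior apps cease to be displayed
    return open_apps + [name]          # prior apps continue to be displayed
```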
  • Patent number: 11893964
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on a moving of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: January 20, 2023
    Date of Patent: February 6, 2024
    Assignee: Apple Inc.
    Inventors: Aaron Mackay Burns, Alexis H. Palangie, Pol Pla I Conesa, David M. Schattel
  • Publication number: 20240029371
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments including visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display. The view includes visual representations of a plurality of applications. The method further includes determining to provide a first application with access to a control parameter. The control parameter is configured to modify at least a portion of the view of the environment with virtual content, and the portion of the view includes at least a portion of content outside of a view of a visual representation associated with the first application. The method further includes restricting access to the control parameter by other applications, which prevents the other applications from modifying at least the portion of the view of the environment via the control parameter.
    Type: Application
    Filed: September 29, 2023
    Publication date: January 25, 2024
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I Conesa
  • Patent number: 11861056
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: August 6, 2021
    Date of Patent: January 2, 2024
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Nathan Gitter, Alexis H. Palangie, Pol Pla I Conesa, David M. Schattel
  • Publication number: 20230350537
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: December 25, 2022
    Publication date: November 2, 2023
    Inventors: Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, James J. OWEN
  • Publication number: 20230334793
    Abstract: In accordance with some embodiments, an exemplary process for dynamically controlling the size of a display based on a moving of a visual object meeting a criterion in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: January 20, 2023
    Publication date: October 19, 2023
    Inventors: Aaron Mackay BURNS, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20230333645
    Abstract: In one implementation, a method of processing input for multiple devices is performed at a first electronic device with one or more processors and non-transitory memory. The method includes determining a gaze direction. The method includes selecting a target electronic device based on determining that the gaze direction is directed to the target electronic device. The method includes receiving, via an input device, one or more inputs. The method includes processing the one or more inputs based on the target electronic device.
    Type: Application
    Filed: May 12, 2023
    Publication date: October 19, 2023
    Inventors: Alexis H. Palangie, Aaron M. Burns, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky
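The abstract above describes selecting a target device based on whether the gaze direction is directed toward it. One plausible way to sketch that determination is as a nearest-angle test with a tolerance; the tolerance value and device layout below are illustrative assumptions, not from the patent:

```python
import math

def select_target(gaze_dir, devices, tolerance_deg=10.0):
    """Return the name of the device whose direction vector is closest
    to the gaze direction, if within an angular tolerance; else None.

    `devices` maps device names to unit-ish direction vectors from the user.
    """
    def angle_deg(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        cos = max(-1.0, min(1.0, dot / (na * nb)))
        return math.degrees(math.acos(cos))

    name, direction = min(devices.items(),
                          key=lambda kv: angle_deg(gaze_dir, kv[1]))
    return name if angle_deg(gaze_dir, direction) <= tolerance_deg else None
```

Inputs received afterward would then be routed to the returned device, per the abstract.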
  • Publication number: 20230325140
    Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, a placement point for a selected object is designated at a first position based on a gaze position. In response to a user input, the placement point is moved to a second position that is not based on the gaze position, and the object is placed at the second position.
    Type: Application
    Filed: June 14, 2023
    Publication date: October 12, 2023
    Inventors: Avi BAR-ZEEV, Ryan S. BURGOYNE, Devin W. CHALMERS, Luis R. DELIZ CENTENO, Rahul NAIR, Timothy R. ORIOL, Alexis H. PALANGIE
  • Publication number: 20230325003
    Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with the determination that one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
  • Publication number: 20230325004
    Abstract: Methods for interacting with objects and user interface elements in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, a user can directly or indirectly interact with objects. In some embodiments, while performing an indirect manipulation, manipulations of virtual objects are scaled. In some embodiments, while performing a direct manipulation, manipulations of virtual objects are not scaled. In some embodiments, an object can be reconfigured from an indirect manipulation mode into a direct manipulation mode by moving the object to a respective position in the three-dimensional environment in response to a respective gesture.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Aaron M. BURNS, Alexis H. PALANGIE, Nathan GITTER, Nicolai GEORG, Benjamin R. BLACHNITZKY, Arun Rakesh YOGANANDAN, Benjamin HYLAK, Adam G. POULOS
  • Publication number: 20230315270
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 5, 2023
    Inventors: Benjamin HYLAK, Alexis H. PALANGIE, Jordan A. CAZAMIAS, Nathan GITTER, Aaron M. BURNS