Patents by Inventor Nathan Gitter

Nathan Gitter has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250110551
    Abstract: Some embodiments described in this disclosure are directed to one or more computer systems that display virtual environments associated with a presentation application. In some embodiments, the computer system displays a virtual environment selected by a user of the computer system that simulates a real-world setting in which a presentation would be delivered.
    Type: Application
    Filed: May 31, 2024
    Publication date: April 3, 2025
    Inventors: Amy W. HUNG, Peter G. BERGER, Zachariah N. PAINE, Daniel H. MAI, Nathan GITTER, Ryan M. OLSHAVSKY, James M. DESSERO, Alan C. DYE, Jonathan P. IVE, Stephen O. LEMAY, William A. SORRENTINO, III, Peter D. ANTON
  • Publication number: 20250093964
    Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with a determination that one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
    Type: Application
    Filed: December 5, 2024
    Publication date: March 20, 2025
    Inventors: Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
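The hand-orientation criterion in this abstract can be pictured as a simple geometric check. The Swift sketch below is purely illustrative: the types, the threshold angle, and the pinch-based activation are assumptions, not details taken from the filing.

```swift
import Foundation

// Hypothetical sketch of the hand-orientation criterion: selectable options
// appear only when the palm faces the device within a tolerance angle.
struct Vector3 {
    var x, y, z: Double
    func dot(_ other: Vector3) -> Double { x * other.x + y * other.y + z * other.z }
    var length: Double { (x * x + y * y + z * z).squareRoot() }
    var normalized: Vector3 {
        let l = length
        return Vector3(x: x / l, y: y / l, z: z / l)
    }
}

struct HandPose {
    var palmNormal: Vector3     // direction the palm is facing
    var isPinching: Bool        // pinch state used for one-handed actuation
}

/// Returns true when the palm faces the device closely enough that the
/// selectable options should be displayed (threshold is an assumption).
func shouldShowSelectableOptions(hand: HandPose,
                                 towardDevice: Vector3,
                                 maxAngleDegrees: Double = 30) -> Bool {
    let cosine = hand.palmNormal.normalized.dot(towardDevice.normalized)
    let angle = acos(max(-1, min(1, cosine))) * 180 / .pi
    return angle <= maxAngleDegrees
}

/// One-handed activation: the option is actuated only while the orientation
/// criterion holds and the same hand performs a pinch.
func shouldActivateOption(hand: HandPose, towardDevice: Vector3) -> Bool {
    shouldShowSelectableOptions(hand: hand, towardDevice: towardDevice) && hand.isPinching
}
```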
  • Patent number: 12254127
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: December 27, 2023
    Date of Patent: March 18, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Nathan Gitter, Alexis H. Palangie, Pol Pla I. Conesa, David M. Schattel
  • Publication number: 20250013344
    Abstract: Some examples of the disclosure are directed to methods for application-based spatial refinement in a multi-user communication session including a first electronic device and a second electronic device. While the first electronic device is presenting a three-dimensional environment, the first electronic device receives an input corresponding to a request to move a shared object in the three-dimensional environment. In accordance with a determination that the shared object is an object of a first type, the first electronic device moves the shared object and an avatar of a user in the three-dimensional environment in accordance with the input. In accordance with a determination that the shared object is an object of a second type, different from the first type, and the input is a first type of input, the first electronic device moves the shared object in the three-dimensional environment in accordance with the input, without moving the avatar.
    Type: Application
    Filed: September 25, 2024
    Publication date: January 9, 2025
    Inventors: Connor A. SMITH, Christopher D. MCKENZIE, Nathan GITTER
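The branching described in this abstract (and in the granted patent 12112011 below) can be summarized as: a first-type shared object moves together with the user's avatar, while a second-type object moves on its own for a first type of input. The following Swift sketch is a minimal illustration under assumed type names, not Apple's implementation.

```swift
// Illustrative sketch of application-based spatial refinement in a
// multi-user session; all names and categories here are assumptions.
struct Point3 { var x, y, z: Double }

enum SharedObjectKind { case firstType, secondType }
enum MoveInput { case firstType, other }   // hypothetical input classification

struct SharedObject { var kind: SharedObjectKind; var position: Point3 }
struct Avatar { var position: Point3 }

func offset(_ p: inout Point3, by d: Point3) {
    p.x += d.x; p.y += d.y; p.z += d.z
}

/// Applies a request to move a shared object in the three-dimensional environment.
func handleMoveRequest(delta: Point3, object: inout SharedObject,
                       avatar: inout Avatar, input: MoveInput) {
    switch (object.kind, input) {
    case (.firstType, _):
        // First-type objects carry the user's avatar with them.
        offset(&object.position, by: delta)
        offset(&avatar.position, by: delta)
    case (.secondType, .firstType):
        // Second-type objects move without repositioning the avatar.
        offset(&object.position, by: delta)
    default:
        break   // other combinations are outside what the abstract describes
    }
}
```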
  • Publication number: 20240420435
    Abstract: In some embodiments, a computer system facilitates movement, including rotation, of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates movement of a virtual object in a three-dimensional environment toward a movement boundary. In some embodiments, a computer system facilitates dynamic scaling of a virtual object in a three-dimensional environment based on movement of the virtual object in the three-dimensional environment. In some embodiments, a computer system facilitates inertial movement of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates converging offsets between a portion of a user and a virtual object. In some embodiments, a computer system facilitates rotation of a volumetric virtual object in a three-dimensional environment.
    Type: Application
    Filed: May 17, 2024
    Publication date: December 19, 2024
    Inventors: Nathan GITTER, Benjamin HYLAK, Jonathan RAVASZ, Christopher D. MCKENZIE, Nahckjoon KIM, Israel PASTRANA VICENTE, Zoey C. TAYLOR, Benjamin H. BOESEL
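One of the behaviors listed in this abstract, inertial movement of a virtual object, is commonly modeled as a release velocity that decays over time. The toy Swift sketch below shows that general idea only; the damping constant, time step, and stopping threshold are assumptions.

```swift
// Toy inertial-movement sketch: after release, the object's velocity decays
// each frame so it glides to a stop. Constants are illustrative assumptions.
struct Velocity3 { var x, y, z: Double }
struct Position3 { var x, y, z: Double }

func simulateInertia(from start: Position3,
                     releaseVelocity: Velocity3,
                     damping: Double = 4.0,
                     timeStep: Double = 1.0 / 90.0,
                     minSpeed: Double = 0.001) -> [Position3] {
    var position = start
    var velocity = releaseVelocity
    var frames: [Position3] = [position]
    while (velocity.x * velocity.x + velocity.y * velocity.y + velocity.z * velocity.z).squareRoot() > minSpeed {
        position.x += velocity.x * timeStep
        position.y += velocity.y * timeStep
        position.z += velocity.z * timeStep
        let decay = 1.0 - damping * timeStep   // simple per-frame decay factor
        velocity.x *= decay
        velocity.y *= decay
        velocity.z *= decay
        frames.append(position)
    }
    return frames
}
```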
  • Publication number: 20240411421
    Abstract: A computer system detects an input to invoke a home menu user interface. In response to detecting the input, the computer system displays, via one or more display generation components, the home menu user interface in a three-dimensional environment, including: if a viewpoint of a user in the three-dimensional environment had a first elevation relative to a reference plane in the three-dimensional environment, displaying the home menu user interface at a first height in the three-dimensional environment; and, if the viewpoint of the user in the three-dimensional environment had a second elevation relative to the reference plane in the three-dimensional environment, the second elevation being different from the first elevation, displaying the home menu user interface at a second height in the three-dimensional environment, the second height being different from the first height.
    Type: Application
    Filed: May 14, 2024
    Publication date: December 12, 2024
    Inventors: Israel Pastrana Vicente, Amy E. DeDonato, Marcos Alonso Ruiz, Lee S. Broughton, Richard D. Lyons, William A. Sorrentino, III, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jesse Chand, Jonathan R. Dascola, Christian Schnorr, Zoey C. Taylor, Jonathan Ravasz, Harlan B. Haskins, Vinay Chawda, Benjamin H. Boesel, Ieyuki Kawashima, Christopher D. McKenzie, Benjamin Hylak, Nathan Gitter, Nahckjoon Kim, Owen Monsma, Matan Stauber, Danielle M. Price
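The placement rule in this abstract amounts to deriving the home menu's height from the elevation of the user's viewpoint above a reference plane. The Swift sketch below shows one way to read that rule; the linear mapping and offset are assumptions for illustration only.

```swift
// Hypothetical sketch: the home menu's height tracks viewpoint elevation,
// so different elevations yield different menu heights.
struct Viewpoint { var elevation: Double }   // meters above the reference plane

/// Places the home menu at a height derived from the viewpoint elevation
/// (the vertical offset is an assumed comfort adjustment).
func homeMenuHeight(for viewpoint: Viewpoint, verticalOffset: Double = -0.1) -> Double {
    viewpoint.elevation + verticalOffset
}

// Two different elevations produce two different menu heights, as described.
print(homeMenuHeight(for: Viewpoint(elevation: 1.2)))   // ≈ 1.1 (seated)
print(homeMenuHeight(for: Viewpoint(elevation: 1.7)))   // ≈ 1.6 (standing)
```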
  • Publication number: 20240385858
    Abstract: In some embodiments, a computer system changes a visual appearance of immersive mixed reality (MR) content in a three-dimensional environment in accordance with a respective type of input. In some embodiments, a computer system facilitates display of immersive MR content in a three-dimensional environment. In some embodiments, a computer system displays content of a respective application in a first mode of operation that includes spatially distributing the content throughout an available display area of a three-dimensional environment and displays an option to cease display of the content in the first mode of operation.
    Type: Application
    Filed: May 17, 2024
    Publication date: November 21, 2024
    Inventors: Christopher D. MCKENZIE, Benjamin HYLAK, Karen EL ASMAR, Nathan GITTER, Wesley M. HOLDER, Zoey C. TAYLOR
  • Patent number: 12112011
    Abstract: Some examples of the disclosure are directed to methods for application-based spatial refinement in a multi-user communication session including a first electronic device and a second electronic device. While the first electronic device is presenting a three-dimensional environment, the first electronic device receives an input corresponding to a request to move a shared object in the three-dimensional environment. In accordance with a determination that the shared object is an object of a first type, the first electronic device moves the shared object and an avatar of a user in the three-dimensional environment in accordance with the input. In accordance with a determination that the shared object is an object of a second type, different from the first type, and the input is a first type of input, the first electronic device moves the shared object in the three-dimensional environment in accordance with the input, without moving the avatar.
    Type: Grant
    Filed: September 11, 2023
    Date of Patent: October 8, 2024
    Assignee: Apple Inc.
    Inventors: Connor A. Smith, Christopher D. McKenzie, Nathan Gitter
  • Publication number: 20240329797
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: June 10, 2024
    Publication date: October 3, 2024
    Inventors: Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, James J. OWEN
  • Publication number: 20240302948
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: May 17, 2024
    Publication date: September 12, 2024
    Inventors: Benjamin HYLAK, Alexis H. PALANGIE, Jordan A. CAZAMIAS, Nathan GITTER, Aaron M. BURNS
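The idea of immersion levels driven by distance, location, or state (also covered by the granted patent 11995301 below) can be illustrated with a small Swift sketch. The level names, distance bands, and the state override here are assumptions, not details from the filing.

```swift
// Hypothetical sketch of distance- and state-driven immersion levels.
enum ImmersionLevel { case minimal, partial, full }

struct UserInterfaceState { var isExpanded: Bool }

func immersionLevel(distanceFromUser: Double, state: UserInterfaceState) -> ImmersionLevel {
    // The UI's state can override the distance-based rule
    // (e.g. an expanded interface switches to full immersion).
    if state.isExpanded { return .full }
    switch distanceFromUser {
    case ..<1.0:  return .minimal
    case ..<3.0:  return .partial
    default:      return .full
    }
}
```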
  • Publication number: 20240273838
    Abstract: Some embodiments of the disclosure are directed to an augmented representation of a first electronic device. A three-dimensional representation of a first electronic device (e.g., an augmented device) is presented using a second electronic device in a three-dimensional environment. The three-dimensional environment includes captured portions of a real-world environment, optionally including the first electronic device. The augmented representation of the first electronic device includes a virtual user interface element representing an extension of the physical display of the first electronic device. The representation of the augmented device includes a display of the augmented user interface. The augmented device is optionally configured to display some or all of the user interfaces operating on the first electronic device. Manipulations of and/or interactions with the augmented representation of the first electronic device are possible.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Alexis H. PALANGIE, Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER
  • Publication number: 20240272722
    Abstract: Displaying and manipulating user interface elements in a computer-generated environment is disclosed. In some embodiments, a user is able to use a pinch and hold gesture to seamlessly and efficiently display and isolate a slider and then manipulate that slider without having to modify the pinch and hold gesture. In some embodiments, gaze data can be used to coarsely identify a focus element, and hand movement can then be used for fine identification of the focus element.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Nathan GITTER, Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
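The coarse/fine targeting described here (gaze coarsely selects the element, then hand movement adjusts it while the pinch is held) is sketched below in Swift. The one-dimensional layout, sensitivity, and function names are assumptions for illustration.

```swift
// Illustrative sketch: gaze picks the nearest slider, then hand movement
// while the pinch is held adjusts that slider without re-targeting.
struct Slider {
    var name: String
    var screenPosition: Double   // 1-D position for simplicity
    var value: Double            // normalized 0...1
}

/// Coarse identification: the slider whose position is closest to the gaze point.
func focusedSliderIndex(gazePosition: Double, sliders: [Slider]) -> Int? {
    sliders.indices.min { abs(sliders[$0].screenPosition - gazePosition) <
                          abs(sliders[$1].screenPosition - gazePosition) }
}

/// Fine manipulation: hand displacement maps to a value change on the
/// already-focused slider while the pinch-and-hold gesture is maintained.
func adjust(_ slider: inout Slider, handDelta: Double, sensitivity: Double = 0.5) {
    slider.value = min(1, max(0, slider.value + handDelta * sensitivity))
}
```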
  • Patent number: 12039142
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: December 25, 2022
    Date of Patent: July 16, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
  • Publication number: 20240203276
    Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
    Type: Application
    Filed: April 6, 2022
    Publication date: June 20, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
  • Publication number: 20240193858
    Abstract: In one implementation, a method of assisting in the rehearsal of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes obtaining a difficulty level for a rehearsal of a presentation. The method includes displaying, on the display, one or more slides of the presentation. The method includes displaying, on the display in association with a volumetric environment, one or more virtual objects based on the difficulty level.
    Type: Application
    Filed: April 11, 2022
    Publication date: June 13, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette
  • Patent number: 11995301
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Grant
    Filed: March 10, 2023
    Date of Patent: May 28, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
  • Publication number: 20240152256
    Abstract: A computer system concurrently displays, via a display generation component, a browser toolbar, for a browser that includes a plurality of tabs and a window including first content associated with a first tab of the plurality of tabs. The browser toolbar and the window are overlaying a view of a three-dimensional environment. While displaying the browser toolbar and the window that includes the first content overlaying the view of the three-dimensional environment, the computer system detects a first air gesture that meets first gesture criteria, the air gesture comprising a gaze input directed at a location in the view of the three-dimensional environment that is occupied by the browser toolbar and a hand movement. In response to detecting the first air gesture that meets the first gesture criteria, the computer system displays second content in the window, the second content associated with a second tab of the plurality of tabs.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Jonathan R. Dascola, Nathan Gitter, Jay Moon, Stephen O. Lemay, Joseph M.W. Luxton, Angel Suet Y. Cheung, Danielle M. Price, Hugo D. Verweij, Kristi E.S. Bauerly, Katherine W. Kolombatovich, Jordan A. Cazamias
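The gesture rule in this abstract requires both a gaze on the toolbar and a qualifying hand movement before the displayed tab changes. The Swift sketch below shows that combined criterion under assumed types; the swipe threshold and direction mapping are illustrative.

```swift
// Sketch of the combined gaze-plus-hand air gesture for switching tabs;
// either input alone does not change the displayed content.
enum GazeTarget { case browserToolbar, window, elsewhere }

struct AirGesture {
    var gazeTarget: GazeTarget
    var handSwipeDistance: Double   // lateral hand movement, in meters
}

struct Browser {
    var tabs: [String]
    var currentTab: Int
}

func handleAirGesture(_ gesture: AirGesture, browser: inout Browser,
                      minSwipe: Double = 0.05) {
    // First gesture criteria: gaze on the toolbar AND a large enough hand movement.
    guard gesture.gazeTarget == .browserToolbar,
          abs(gesture.handSwipeDistance) >= minSwipe else { return }
    let step = gesture.handSwipeDistance > 0 ? 1 : -1
    let next = browser.currentTab + step
    if browser.tabs.indices.contains(next) {
        browser.currentTab = next   // display the content of the newly selected tab
    }
}
```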
  • Publication number: 20240152244
    Abstract: While displaying an application user interface, a device detects a first input to an input device of the one or more input devices, the input device provided on a housing of the device that includes the one or more display generation components. In response to detecting the first input, the device replaces display of at least a portion of the application user interface by displaying a home menu user interface via the one or more display generation components. While displaying the home menu user interface, the device detects a second input to the input device provided on the housing of the device; and in response to detecting the second input to the input device provided on the housing of the device: the device dismisses the home menu user interface.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Vitalii Kramar, Benjamin Hylak, Sanket S. Dave, Deepak Iyer, Lauren A. Hastings, Madhur Ahuja, Natalia A. Fornshell, Christopher J. Romney, Joaquim Goncola Lobo Ferreira da Silva, Shawna M. Spain
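The behavior described here reduces to a toggle: a press of the housing-mounted input shows the home menu over the application, and a second press dismisses it. The Swift sketch below is a minimal illustration with hypothetical type names.

```swift
// Minimal toggle sketch of the home-menu behavior; names are assumptions.
enum DisplayedUI { case application, homeMenuOverApplication }

struct SystemUIState {
    var displayed: DisplayedUI = .application

    mutating func hardwareButtonPressed() {
        switch displayed {
        case .application:
            // First press: replace part of the app UI with the home menu.
            displayed = .homeMenuOverApplication
        case .homeMenuOverApplication:
            // Second press: dismiss the home menu.
            displayed = .application
        }
    }
}
```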
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
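The gaze-revealed control described in this abstract (the control appears only after gaze rests on a specific portion of the object, and a later input on that control performs the operation) is sketched below in Swift. The dwell requirement and all names are assumptions for illustration.

```swift
// Hypothetical sketch of a gaze-revealed control element.
enum ObjectPortion { case first, second }

struct GazeInput { var portion: ObjectPortion; var dwellSeconds: Double }

struct ObjectUI {
    var isControlVisible = false

    /// First criteria: gaze must rest on the first portion of the object
    /// (the dwell-time requirement is an assumption).
    mutating func update(with gaze: GazeInput, requiredDwell: Double = 0.25) {
        if gaze.portion == .first && gaze.dwellSeconds >= requiredDwell {
            isControlVisible = true
        }
    }

    /// The associated operation runs only if the control was revealed
    /// and is then targeted by a user input.
    func activateControl(performing operation: () -> Void) {
        guard isControlVisible else { return }
        operation()
    }
}
```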
  • Patent number: D1068807
    Type: Grant
    Filed: June 4, 2023
    Date of Patent: April 1, 2025
    Assignee: Apple Inc.
    Inventors: Jesse Chand, Jonathan R. Dascola, Nathan Gitter, Stephen O. Lemay, Richard D. Lyons, Israel Pastrana Vicente, Lorena S. Pazmino, William A. Sorrentino, Matan Stauber