Patents by Inventor Nathan Gitter

Nathan Gitter has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240152244
    Abstract: While displaying an application user interface, a device detects a first input to an input device of the one or more input devices, the input device provided on a housing of the device that includes the one or more display generation components. In response to detecting the first input, the device replaces display of at least a portion of the application user interface by displaying a home menu user interface via the one or more display generation components. While displaying the home menu user interface, the device detects a second input to the input device provided on the housing of the device; in response to detecting the second input, the device dismisses the home menu user interface.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Vitalii Kramar, Benjamin Hylak, Sanket S. Dave, Deepak Iyer, Lauren A. Hastings, Madhur Ahuja, Natalia A. Fornshell, Christopher J. Romney, Joaquim Goncola Lobo Ferreira da Silva, Shawna M. Spain
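
The abstract above (publication 20240152244) describes a simple two-state flow: one press of a housing-mounted input device brings up a home menu over the application, and a second press dismisses it. A minimal Swift sketch of that flow follows; the type and function names are assumptions for illustration, not Apple's implementation.

```swift
// Hypothetical sketch only: toggling a home menu overlay in response to
// presses of an input device on the device housing.

enum ForegroundUI {
    case application   // only the application user interface is shown
    case homeMenu      // home menu replaces part of the application UI
}

struct SystemShell {
    private(set) var foreground: ForegroundUI = .application

    // Called whenever the housing-mounted input device is actuated.
    mutating func handleHousingInput() {
        switch foreground {
        case .application:
            foreground = .homeMenu      // first input: display the home menu
        case .homeMenu:
            foreground = .application   // second input: dismiss the home menu
        }
    }
}

var shell = SystemShell()
shell.handleHousingInput()   // home menu appears
shell.handleHousingInput()   // home menu is dismissed
print(shell.foreground)      // prints "application"
```
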
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
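
A minimal Swift sketch of the gaze-gated control described in the abstract above (publication 20240152245): the control element stays hidden until gaze lands on a specific portion of the object, and a later input on the revealed control performs the object's operation. The names and the pinch example are assumptions, not the actual implementation.

```swift
// Hypothetical sketch only: a control element revealed by gaze and then
// activated by a separate user input.

enum Portion { case first, second }
struct GazeInput { let objectID: String; let portion: Portion }

final class ObjectControls {
    let objectID: String
    private(set) var controlVisible = false

    init(objectID: String) { self.objectID = objectID }

    // First criteria: the gaze input is directed to the first portion of this object.
    func handle(gaze: GazeInput) {
        if gaze.objectID == objectID && gaze.portion == .first {
            controlVisible = true                 // display the control element
        }
    }

    // A user input (for example, a pinch) directed at the revealed control.
    func handleInputOnControl(perform operation: () -> Void) {
        guard controlVisible else { return }      // control not shown yet: ignore
        operation()                               // perform the first operation
    }
}

let controls = ObjectControls(objectID: "window-1")
controls.handle(gaze: GazeInput(objectID: "window-1", portion: .first))
controls.handleInputOnControl { print("closing window-1") }
```
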
  • Publication number: 20240152256
    Abstract: A computer system concurrently displays, via a display generation component, a browser toolbar, for a browser that includes a plurality of tabs and a window including first content associated with a first tab of the plurality of tabs. The browser toolbar and the window are overlaying a view of a three-dimensional environment. While displaying the browser toolbar and the window that includes the first content overlaying the view of the three-dimensional environment, the computer system detects a first air gesture that meets first gesture criteria, the air gesture comprising a gaze input directed at a location in the view of the three-dimensional environment that is occupied by the browser toolbar and a hand movement. In response to detecting the first air gesture that meets the first gesture criteria, the computer system displays second content in the window, the second content associated with a second tab of the plurality of tabs.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Jonathan R. Dascola, Nathan Gitter, Jay Moon, Stephen O. Lemay, Joseph M.W. Luxton, Angel Suet Y. Cheung, Danielle M. Price, Hugo D. Verweij, Kristi E.S. Bauerly, Katherine W. Kolombatovich, Jordan A. Cazamias
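
A minimal Swift sketch of the tab-switching gesture in the abstract above (publication 20240152256): an air gesture qualifies only when gaze is on the toolbar and a hand movement is detected, and it swaps the window's content to another tab. The names and the left/right convention are illustrative assumptions.

```swift
// Hypothetical sketch only: switching browser tabs with a gaze-plus-hand
// air gesture while the toolbar and window overlay a 3D environment.

struct Tab { let title: String; let content: String }

final class BrowserWindow {
    private let tabs: [Tab]
    private(set) var activeIndex = 0
    var visibleContent: String { tabs[activeIndex].content }

    init(tabs: [Tab]) { self.tabs = tabs }

    // First gesture criteria: gaze on the toolbar combined with a hand movement.
    func handleAirGesture(gazeOnToolbar: Bool, handMovement: Int) {
        guard gazeOnToolbar, handMovement != 0 else { return }
        let step = handMovement > 0 ? 1 : -1
        activeIndex = (activeIndex + step + tabs.count) % tabs.count
        // The window now displays content associated with the other tab.
    }
}

let browser = BrowserWindow(tabs: [Tab(title: "A", content: "first page"),
                                   Tab(title: "B", content: "second page")])
browser.handleAirGesture(gazeOnToolbar: true, handMovement: 1)
print(browser.visibleContent)   // prints "second page"
```
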
  • Publication number: 20240126362
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: December 27, 2023
    Publication date: April 18, 2024
    Inventors: Aaron M. BURNS, Nathan GITTER, Alexis H. PALANGIE, Pol PLA I. CONESA, David M. SCHATTEL
  • Publication number: 20240104861
    Abstract: While displaying an application user interface, in response to detecting a first input to an input device, a computer system, in accordance with a determination that the application user interface is in a first mode of display, wherein the first mode of display includes an immersive mode in which only content of the application user interface is displayed, displays via the display generation component the application user interface in a second mode of display, wherein the second mode of display includes a non-immersive mode in which respective content of the application user interface and other content are concurrently displayed, and in accordance with a determination that the application user interface is in the second mode of display, the computer system replaces display of at least a portion of the application user interface by displaying a home menu user interface via the display generation component.
    Type: Application
    Filed: September 18, 2023
    Publication date: March 28, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Stephen O. Lemay, Zoey C. Taylor
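
A minimal Swift sketch of the two-stage behavior in the abstract above (publication 20240104861): the same input first drops an immersive application into a non-immersive mode where other content is visible, and only from the non-immersive mode does it bring up the home menu. Type and property names are assumptions.

```swift
// Hypothetical sketch only: one input device, two behaviors depending on
// whether the application is currently immersive.

enum DisplayMode {
    case immersive      // only the application's content is displayed
    case nonImmersive   // the application is shown alongside other content
}

struct AppSession {
    var mode: DisplayMode = .immersive
    var homeMenuShown = false

    mutating func handleInputDevicePress() {
        switch mode {
        case .immersive:
            mode = .nonImmersive   // leave immersion; other content becomes visible
        case .nonImmersive:
            homeMenuShown = true   // replace part of the app UI with the home menu
        }
    }
}

var session = AppSession()
session.handleInputDevicePress()             // immersive -> non-immersive
session.handleInputDevicePress()             // home menu appears
print(session.mode, session.homeMenuShown)   // prints "nonImmersive true"
```
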
  • Publication number: 20240094863
    Abstract: Some examples of the disclosure are directed to methods for application-based spatial refinement in a multi-user communication session including a first electronic device and a second electronic device. While the first electronic device is presenting a three-dimensional environment, the first electronic device receives an input corresponding to a request to move a shared object in the three-dimensional environment. In accordance with a determination that the shared object is an object of a first type, the first electronic device moves the shared object and an avatar of a user in the three-dimensional environment in accordance with the input. In accordance with a determination that the shared object is an object of a second type, different from the first type, and the input is a first type of input, the first electronic device moves the shared object in the three-dimensional environment in accordance with the input, without moving the avatar.
    Type: Application
    Filed: September 11, 2023
    Publication date: March 21, 2024
    Inventors: Connor A. SMITH, Christopher D. MCKENZIE, Nathan GITTER
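
A minimal Swift sketch of the spatial-refinement rule in the abstract above (publication 20240094863): moving a shared object of the first type carries the user's avatar along with it, while moving the second type moves only the object. The 2D coordinates and type names are simplifying assumptions.

```swift
// Hypothetical sketch only: whether the avatar moves with a shared object
// depends on the object's type.

struct Offset { var x: Double; var z: Double }
enum SharedObjectKind { case first, second }

struct SharedScene {
    var objectPosition: Offset
    var avatarPosition: Offset

    mutating func move(objectOf kind: SharedObjectKind, by delta: Offset) {
        objectPosition.x += delta.x
        objectPosition.z += delta.z
        if kind == .first {
            // First type: the avatar is moved together with the shared object.
            avatarPosition.x += delta.x
            avatarPosition.z += delta.z
        }
        // Second type (with the first type of input): only the object moves.
    }
}

var scene = SharedScene(objectPosition: Offset(x: 0, z: 0),
                        avatarPosition: Offset(x: 0, z: 1))
scene.move(objectOf: .first, by: Offset(x: 0.5, z: 0))
print(scene.avatarPosition.x)   // prints 0.5: the avatar moved too
```
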
  • Publication number: 20240045579
    Abstract: A representation displayed in a three-dimensional environment may be selected with different types of selection inputs. When a representation displayed in the three-dimensional environment is selected, an application corresponding to the representation may be launched in the three-dimensional environment in accordance with the type of selection received. In such embodiments, when the representation is selected with a first type of selection, the application corresponding to the selected representation may be launched in a first manner, and when the representation is selected with a second type of selection, the application corresponding to the selected representation may be launched in a second manner. In some embodiments, the manner in which the application is launched may determine whether one or more previously launched applications continue to be displayed or cease to be displayed in the three-dimensional environment.
    Type: Application
    Filed: December 21, 2021
    Publication date: February 8, 2024
    Inventors: Nathan GITTER, Aaron M. BURNS, Benjamin HYLAK, Jonathan R. DASCOLA, Alexis H. PALANGIE
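
A minimal Swift sketch of the launch behavior in the abstract above (publication 20240045579): the type of selection decides whether the newly launched application replaces what was displayed or joins it. The app names and selection labels are placeholders.

```swift
// Hypothetical sketch only: the manner of launching depends on the type of
// selection input, and it determines whether previously launched apps
// continue to be displayed.

enum SelectionType { case first, second }

struct Environment3D {
    private(set) var displayedApps: [String] = ["App A"]

    mutating func launch(_ app: String, selection: SelectionType) {
        switch selection {
        case .first:
            displayedApps = [app]        // first manner: previous apps cease to be displayed
        case .second:
            displayedApps.append(app)    // second manner: previous apps remain displayed
        }
    }
}

var environment = Environment3D()
environment.launch("App B", selection: .second)
print(environment.displayedApps)   // prints ["App A", "App B"]
```
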
  • Publication number: 20240029371
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments including visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display. The view includes visual representations of a plurality of applications. The method further includes determining to provide a first application with access to a control parameter. The control parameter is configured to modify at least a portion of the view of the environment with virtual content, and the portion of the view includes at least a portion of content outside of a view of a visual representation associated with the first application. The method further includes restricting access to the control parameter by other applications, which prevents the other applications from modifying the at least the portion of the view of the environment via the control parameter.
    Type: Application
    Filed: September 29, 2023
    Publication date: January 25, 2024
    Inventors: Aaron M Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa
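
A minimal Swift sketch of the access model in the abstract above (publication 20240029371): one application is granted an environment-wide control parameter, modeled here as a single ambience value, and requests from any other application are refused. The class and method names are assumptions.

```swift
// Hypothetical sketch only: granting one application access to a control
// parameter and restricting all others.

final class EnvironmentControl {
    private var grantedApp: String?
    private(set) var ambience: Double = 1.0   // 1.0 = fully lit (illustrative)

    func grantControlParameter(to app: String) {
        grantedApp = app                      // every other app is now restricted
    }

    // Returns false when the caller is not the granted application.
    @discardableResult
    func setAmbience(_ value: Double, from app: String) -> Bool {
        guard app == grantedApp else { return false }
        ambience = value
        return true
    }
}

let control = EnvironmentControl()
control.grantControlParameter(to: "Theater")
control.setAmbience(0.1, from: "Theater")    // succeeds: dims the environment
control.setAmbience(1.0, from: "Calendar")   // refused: not the granted app
print(control.ambience)                      // prints 0.1
```
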
  • Patent number: 11861056
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Grant
    Filed: August 6, 2021
    Date of Patent: January 2, 2024
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Nathan Gitter, Alexis H. Palangie, Pol Pla I Conesa, David M. Schattel
  • Publication number: 20230350537
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: December 25, 2022
    Publication date: November 2, 2023
    Inventors: Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, James J. OWEN
  • Publication number: 20230325003
    Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with the determination that the one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
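
A minimal Swift sketch of the display criteria in the abstract above (publication 20230325003): selectable options appear only while the hand is oriented in a predetermined way, and a one-handed gesture on a visible option actuates it. The palm-facing and pinch checks are illustrative assumptions.

```swift
// Hypothetical sketch only: hand orientation gates the display of options,
// and a one-handed gesture actuates them.

struct HandPose {
    var palmFacingDevice: Bool
    var isPinching: Bool
}

struct QuickOptions {
    private(set) var optionsVisible = false

    // Criterion: the hand is oriented in the predetermined manner.
    mutating func update(for pose: HandPose) {
        optionsVisible = pose.palmFacingDevice
    }

    // One-handed actuation: the options are visible and an activation gesture occurred.
    func actuateFirstOption(with pose: HandPose, action: () -> Void) {
        guard optionsVisible, pose.isPinching else { return }
        action()
    }
}

var options = QuickOptions()
let pose = HandPose(palmFacingDevice: true, isPinching: true)
options.update(for: pose)
options.actuateFirstOption(with: pose) { print("option selected") }
```
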
  • Publication number: 20230325004
    Abstract: Methods for interacting with objects and user interface elements in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, a user can directly or indirectly interact with objects. In some embodiments, while performing an indirect manipulation, manipulations of virtual objects are scaled. In some embodiments, while performing a direct manipulation, manipulations of virtual objects are not scaled. In some embodiments, an object can be reconfigured from an indirect manipulation mode into a direct manipulation mode by moving the object to a respective position in the three-dimensional environment in response to a respective gesture.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Aaron M. BURNS, Alexis H. PALANGIE, Nathan GITTER, Nicolai GEORG, Benjamin R. BLACHNITZKY, Arun Rakesh YOGANANDAN, Benjamin HYLAK, Adam G. POULOS
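
A minimal Swift sketch of the scaling rule in the abstract above (publication 20230325004): indirect manipulation scales the hand motion, direct manipulation applies it one-to-one. The distance-based factor is an illustrative assumption, not the actual scaling function.

```swift
// Hypothetical sketch only: manipulation is scaled when indirect and
// unscaled when direct.

enum ManipulationMode { case direct, indirect }

func appliedTranslation(handDelta: Double,
                        mode: ManipulationMode,
                        distanceToObject: Double) -> Double {
    switch mode {
    case .direct:
        return handDelta                              // no scaling
    case .indirect:
        return handDelta * max(1.0, distanceToObject) // scale with distance
    }
}

print(appliedTranslation(handDelta: 0.05, mode: .direct,   distanceToObject: 4.0)) // 0.05
print(appliedTranslation(handDelta: 0.05, mode: .indirect, distanceToObject: 4.0)) // 0.2
```
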
  • Publication number: 20230315270
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 5, 2023
    Inventors: Benjamin HYLAK, Alexis H. PALANGIE, Jordan A. CAZAMIAS, Nathan GITTER, Aaron M. BURNS
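
A minimal Swift sketch of one idea in the abstract above (publication 20230315270): the immersion level of a user interface follows its distance from the user. The thresholds are placeholders, not values from the patent.

```swift
// Hypothetical sketch only: immersion level chosen from distance to the user.

enum ImmersionLevel { case minimal, partial, full }

func immersionLevel(forDistance meters: Double) -> ImmersionLevel {
    if meters < 1.0 { return .minimal }   // nearby: keep the surroundings visible
    if meters < 3.0 { return .partial }
    return .full                          // far away: occupy more of the view
}

print(immersionLevel(forDistance: 0.6))   // prints "minimal"
print(immersionLevel(forDistance: 4.0))   // prints "full"
```
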
  • Publication number: 20230316634
    Abstract: In some embodiments, a computer system selectively recenters virtual content to a viewpoint of a user, in the presence of physical or virtual obstacles, and/or automatically recenters one or more virtual objects in response to the display generation component changing state, selectively recenters content associated with a communication session between multiple users in response to detected user input, changes the visual prominence of content included in virtual objects based on viewpoint and/or based on detected user attention, modifies visual prominence of one or more virtual objects to resolve apparent obscuring of the one or more virtual objects, modifies visual prominence based on user viewpoint relative to virtual objects, concurrently modifies visual prominence based on various types of user interaction, and/or changes an amount of visual impact of an environmental effect in response to detected user input.
    Type: Application
    Filed: January 19, 2023
    Publication date: October 5, 2023
    Inventors: Shih-Sang CHIU, Benjamin H. BOESEL, Jonathan PERRON, Stephen O. LEMAY, Christopher D. MCKENZIE, Dorian D. DARGAN, Jonathan RAVASZ, Nathan GITTER
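
A minimal Swift sketch of one behavior from the abstract above (publication 20230316634): recentering a virtual object to a point in front of the user's viewpoint, nudging it sideways when an obstacle occupies that spot. The geometry and the nudge strategy are simplifying assumptions.

```swift
// Hypothetical sketch only: recenter to the viewpoint, avoiding obstacles.

struct Point { var x: Double; var z: Double }

func recenteredPosition(viewpoint: Point,
                        forwardOffset: Double,
                        obstacles: [Point],
                        clearance: Double) -> Point {
    var target = Point(x: viewpoint.x, z: viewpoint.z + forwardOffset)
    // While an obstacle sits too close to the target, shift the target sideways.
    while obstacles.contains(where: { abs($0.x - target.x) < clearance &&
                                      abs($0.z - target.z) < clearance }) {
        target.x += clearance
    }
    return target
}

let spot = recenteredPosition(viewpoint: Point(x: 0, z: 0),
                              forwardOffset: 1.0,
                              obstacles: [Point(x: 0, z: 1.0)],
                              clearance: 0.5)
print(spot.x, spot.z)   // prints "0.5 1.0": shifted off the obstacle
```
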
  • Patent number: 11776225
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments including visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display. The view includes visual representations of a plurality of applications. The method further includes determining to provide a first application with access to a control parameter. The control parameter is configured to modify at least a portion of the view of the environment with virtual content, and the portion of the view includes at least a portion of content outside of a view of a visual representation associated with the first application. The method further includes restricting access to the control parameter by other applications, which prevents the other applications from modifying the at least the portion of the view of the environment via the control parameter.
    Type: Grant
    Filed: April 29, 2022
    Date of Patent: October 3, 2023
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa
  • Publication number: 20230152935
    Abstract: In some embodiments, an electronic device updates the spatial arrangement of one or more virtual objects in a three-dimensional environment. In some embodiments, an electronic device updates the positions of multiple virtual objects together. In some embodiments, an electronic device displays objects in a three-dimensional environment based on an estimated location of a floor in the three-dimensional environment. In some embodiments, an electronic device moves (e.g., repositions) objects in a three-dimensional environment.
    Type: Application
    Filed: September 16, 2022
    Publication date: May 18, 2023
    Inventors: Christopher D. MCKENZIE, Nathan GITTER, Alexis H. PALANGIE, Shih-Sang CHIU, Benjamin H. BOESEL, Dorian D. DARGAN, Zoey C. Taylor
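
A minimal Swift sketch of the floor-based placement in the abstract above (publication 20230152935): objects keep their intended height above an estimated floor plane, so a new floor estimate repositions them all consistently. Names and the single-axis model are assumptions.

```swift
// Hypothetical sketch only: object heights derived from an estimated floor.

struct PlacedObject { var name: String; var heightAboveFloor: Double }

func worldHeights(for objects: [PlacedObject],
                  estimatedFloorY: Double) -> [String: Double] {
    var heights: [String: Double] = [:]
    for object in objects {
        // Each object sits at its intended height above the estimated floor.
        heights[object.name] = estimatedFloorY + object.heightAboveFloor
    }
    return heights
}

let objects = [PlacedObject(name: "window", heightAboveFloor: 1.5)]
print(worldHeights(for: objects, estimatedFloorY: -0.1))   // prints ["window": 1.4]
```
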
  • Publication number: 20220254127
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments including visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display. The view includes visual representations of a plurality of applications. The method further includes determining to provide a first application with access to a control parameter. The control parameter is configured to modify at least a portion of the view of the environment with virtual content, and the portion of the view includes at least a portion of content outside of a view of a visual representation associated with the first application. The method further includes restricting access to the control parameter by other applications, which prevents the other applications from modifying the at least the portion of the view of the environment via the control parameter.
    Type: Application
    Filed: April 29, 2022
    Publication date: August 11, 2022
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa
  • Patent number: 11354867
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments comprising visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display of the electronic device. The environment comprises visual representations corresponding to a plurality of applications. A first application among the plurality of applications is designated as an elevated application. The elevated application is provided with access to a control parameter configured to modify an ambience of the environment. Other applications of the plurality of applications are restricted from accessing the control parameter while the first application is designated as the elevated application.
    Type: Grant
    Filed: February 17, 2021
    Date of Patent: June 7, 2022
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa
  • Publication number: 20210365108
    Abstract: In accordance with some embodiments, an exemplary process for controlling representations of virtual objects based on respective user contexts that each correspond to different respective locations in a computer-generated reality (CGR) environment is described.
    Type: Application
    Filed: August 6, 2021
    Publication date: November 25, 2021
    Inventors: Aaron Mackay BURNS, Nathan GITTER, Alexis H. PALANGIE, Pol PLA I CONESA, David M. SCHATTEL
  • Publication number: 20210279966
    Abstract: Various implementations disclosed herein include devices, systems, and methods that enable presenting environments comprising visual representations of multiple applications. In one implementation, a method includes presenting a view of an environment at an electronic device on a display of the electronic device. The environment comprises visual representations corresponding to a plurality of applications. A first application among the plurality of applications is designated as an elevated application. The elevated application is provided with access to a control parameter configured to modify an ambience of the environment. Other applications of the plurality of applications are restricted from accessing the control parameter while the first application is designated as the elevated application.
    Type: Application
    Filed: February 17, 2021
    Publication date: September 9, 2021
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Pol Pla I. Conesa