Patents by Inventor Benjamin Hylak

Benjamin Hylak has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12373081
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: June 10, 2024
    Date of Patent: July 29, 2025
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Patent number: 12321563
    Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
    Type: Grant
    Filed: November 20, 2023
    Date of Patent: June 3, 2025
    Assignee: Apple Inc.
    Inventors: Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
  • Publication number: 20250165070
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
    Type: Application
    Filed: January 17, 2025
    Publication date: May 22, 2025
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
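    The method above fuses two input streams: extremity-tracking data (e.g. camera-based hand tracking) and finger manipulation data from a finger-wearable device. As a rough illustration only, and not the patented implementation, the fusion step could be sketched as follows; all class names, gesture labels, and thresholds here are hypothetical:

    ```python
    from dataclasses import dataclass

    @dataclass
    class ExtremityTrackingSample:
        """Hand-tracking output: which fingers the tracker sees as extended."""
        extended_fingers: set

    @dataclass
    class FingerWearableSample:
        """Data reported by a finger-wearable device."""
        finger: str
        contact: bool      # fingertip in contact with a surface or finger
        pressure: float    # normalized contact pressure, 0.0-1.0

    def determine_multi_finger_gesture(tracking, wearable):
        """Fuse extremity-tracking and finger-wearable data into a gesture label.

        Returns a gesture name, or None if no gesture is recognized.
        """
        # A "pinch" requires firm contact reported by the wearable while the
        # tracker sees both thumb and index extended.
        if (wearable.contact and wearable.pressure > 0.2
                and {"thumb", "index"} <= tracking.extended_fingers):
            return "pinch"
        # A lighter contact with index and middle extended reads as a tap.
        if wearable.contact and {"index", "middle"} <= tracking.extended_fingers:
            return "two_finger_tap"
        return None
    ```

    A recognized gesture would then trigger an engagement event on the displayed computer-generated object.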
  • Patent number: 12299267
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Grant
    Filed: May 17, 2024
    Date of Patent: May 13, 2025
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
  • Publication number: 20250118038
    Abstract: In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
    Type: Application
    Filed: December 19, 2024
    Publication date: April 10, 2025
    Inventors: William A. Sorrentino, III, Benjamin Hylak, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Miquel Estany Rodriguez, James J. Owen
  • Publication number: 20250093964
    Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with a determination that one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
    Type: Application
    Filed: December 5, 2024
    Publication date: March 20, 2025
    Inventors: Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
  • Publication number: 20250078420
    Abstract: In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
    Type: Application
    Filed: June 4, 2024
    Publication date: March 6, 2025
    Inventors: James M. Dessero, Benjamin Hylak, Christopher D. McKenzie, Jeffrey S. Allen, William A. Sorrentino, III, Katherine W. Kolombatovich, Matthew G. Plec
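    The abstract above describes changing a virtual object's visual prominence once overlap with another object crosses a threshold. As a minimal sketch of that idea only (the rectangles, threshold value, and opacity factor are hypothetical, not values from the disclosure):

    ```python
    def overlap_fraction(a, b):
        """Fraction of rectangle a's area covered by rectangle b.

        Rectangles are (x, y, width, height) tuples in a shared plane.
        """
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # intersection width
        iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # intersection height
        return (ix * iy) / (aw * ah)

    def adjusted_opacity(base_opacity, a, b, threshold=0.25, factor=0.4):
        """Reduce a's opacity when b overlaps it past the threshold amount."""
        if overlap_fraction(a, b) >= threshold:
            return base_opacity * factor
        return base_opacity
    ```

    For example, a window that becomes a quarter covered by a moving window would drop to reduced opacity, and return to full opacity once the overlap falls below the threshold again.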
  • Patent number: 12242668
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
    Type: Grant
    Filed: March 20, 2023
    Date of Patent: March 4, 2025
    Assignee: Apple Inc.
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
  • Publication number: 20250028423
    Abstract: In some embodiments, an electronic device changes the immersion level of a virtual environment and/or spatial effect in a three-dimensional environment based on the geometry of the physical environment around the device. In some embodiments, an electronic device modifies the virtual environment and/or spatial effect in response to detecting a movement of the device. In some embodiments, an electronic device moves a user interface of an application into and/or out of a virtual environment. In some embodiments, an electronic device selectively changes the display of a simulated environment and/or atmospheric effect in a three-dimensional environment based on movement of an object associated with a viewpoint of a user. In some embodiments, an electronic device provides feedback to a user in response to a user moving a virtual object to and/or into a simulated environment.
    Type: Application
    Filed: October 7, 2024
    Publication date: January 23, 2025
    Inventors: James M. Dessero, Benjamin Hylak, William A. Sorrentino, III, Stephen O. Lemay, Katherine W. Kolombatovich
  • Publication number: 20240420435
    Abstract: In some embodiments, a computer system facilitates movement, including rotation, of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates movement of a virtual object in a three-dimensional environment toward a movement boundary. In some embodiments, a computer system facilitates dynamic scaling of a virtual object in a three-dimensional environment based on movement of the virtual object in the three-dimensional environment. In some embodiments, a computer system facilitates inertial movement of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates converging offsets between a portion of a user and a virtual object. In some embodiments, a computer system facilitates rotation of a volumetric virtual object in a three-dimensional environment.
    Type: Application
    Filed: May 17, 2024
    Publication date: December 19, 2024
    Inventors: Nathan Gitter, Benjamin Hylak, Jonathan Ravasz, Christopher D. McKenzie, Nahckjoon Kim, Israel Pastrana Vicente, Zoey C. Taylor, Benjamin H. Boesel
  • Publication number: 20240411421
    Abstract: A computer system detects an input to invoke a home menu user interface. In response to detecting the input, the computer system displays, via one or more display generation components, the home menu user interface in a three-dimensional environment, including: if a viewpoint of a user in the three-dimensional environment had a first elevation relative to a reference plane in the three-dimensional environment, displaying the home menu user interface at a first height in the three-dimensional environment; and, if the viewpoint of the user in the three-dimensional environment had a second elevation relative to the reference plane in the three-dimensional environment, the second elevation being different from the first elevation, displaying the home menu user interface at a second height in the three-dimensional environment, the second height being different from the first height.
    Type: Application
    Filed: May 14, 2024
    Publication date: December 12, 2024
    Inventors: Israel Pastrana Vicente, Amy E. DeDonato, Marcos Alonso Ruiz, Lee S. Broughton, Richard D. Lyons, William A. Sorrentino, III, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jesse Chand, Jonathan R. Dascola, Christian Schnorr, Zoey C. Taylor, Jonathan Ravasz, Harlan B. Haskins, Vinay Chawda, Benjamin H. Boesel, Ieyuki Kawashima, Christopher D. McKenzie, Benjamin Hylak, Nathan Gitter, Nahckjoon Kim, Owen Monsma, Matan Stauber, Danielle M. Price
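    The abstract above places the home menu at a height derived from the viewer's elevation relative to a reference plane, so different viewpoint elevations yield different menu heights. A minimal sketch of that mapping, with an entirely hypothetical offset value:

    ```python
    def home_menu_height(viewpoint_elevation, vertical_offset=-0.15):
        """Height (in meters above the reference plane, e.g. the floor) at
        which to display the home menu for a given viewpoint elevation.

        Tying menu height to viewpoint elevation keeps the menu at a
        comfortable position whether the user is seated or standing.
        """
        return viewpoint_elevation + vertical_offset
    ```

    A seated viewpoint at 1.2 m and a standing viewpoint at 1.7 m would thus produce two different menu heights, matching the two-elevation behavior the abstract describes.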
  • Publication number: 20240385858
    Abstract: In some embodiments, a computer system changes a visual appearance of immersive mixed reality (MR) content in a three-dimensional environment in accordance with a respective type of input. In some embodiments, a computer system facilitates display of immersive MR content in a three-dimensional environment. In some embodiments, a computer system displays content of a respective application in a first mode of operation that includes spatially distributing the content throughout an available display area of a three-dimensional environment and displays an option to cease display of the content in the first mode of operation.
    Type: Application
    Filed: May 17, 2024
    Publication date: November 21, 2024
    Inventors: Christopher D. McKenzie, Benjamin Hylak, Karen El Asmar, Nathan Gitter, Wesley M. Holder, Zoey C. Taylor
  • Publication number: 20240361835
    Abstract: In some embodiments, a computer system selectively recenters virtual content to a viewpoint of a user. In some embodiments, a computer system selectively recenters virtual content and/or gathers the virtual content. In some embodiments, a computer system presents virtual content with a first spatial arrangement when an input corresponds to a recentering operation and presents the virtual content with a second spatial arrangement when the input corresponds to a gathering operation.
    Type: Application
    Filed: April 24, 2024
    Publication date: October 31, 2024
    Inventors: Benjamin Hylak, Benjamin H. Boesel, Danielle M. Price, Stephen O. Lemay, Zoey C. Taylor
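    The recentering operation described above moves virtual content back to the user's viewpoint while preserving its layout. As an illustrative sketch only (the distance parameter and vector representation are assumptions, not details from the disclosure), recentering can be modeled as translating the content's centroid to a point in front of the viewpoint:

    ```python
    def recenter(content_positions, viewpoint, forward, distance=1.0):
        """Translate virtual objects so their centroid sits `distance` meters
        in front of the user's viewpoint, preserving relative layout.

        Positions are (x, y, z) tuples; `forward` is a unit direction vector
        for the user's gaze.
        """
        n = len(content_positions)
        # Centroid of the current content layout.
        cx = sum(p[0] for p in content_positions) / n
        cy = sum(p[1] for p in content_positions) / n
        cz = sum(p[2] for p in content_positions) / n
        # Target point in front of the viewpoint.
        tx, ty, tz = (v + distance * f for v, f in zip(viewpoint, forward))
        dx, dy, dz = tx - cx, ty - cy, tz - cz
        # Apply the same translation to every object, keeping spacing intact.
        return [(x + dx, y + dy, z + dz) for x, y, z in content_positions]
    ```

    A gathering operation, by contrast, would additionally pull the objects closer together rather than only translating the group as a whole.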
  • Patent number: 12112009
    Abstract: In some embodiments, an electronic device changes the immersion level of a virtual environment and/or spatial effect in a three-dimensional environment based on the geometry of the physical environment around the device. In some embodiments, an electronic device modifies the virtual environment and/or spatial effect in response to detecting a movement of the device. In some embodiments, an electronic device moves a user interface of an application into and/or out of a virtual environment. In some embodiments, an electronic device selectively changes the display of a simulated environment and/or atmospheric effect in a three-dimensional environment based on movement of an object associated with a viewpoint of a user. In some embodiments, an electronic device provides feedback to a user in response to a user moving a virtual object to and/or into a simulated environment.
    Type: Grant
    Filed: April 13, 2022
    Date of Patent: October 8, 2024
    Assignee: Apple Inc.
    Inventors: James M. Dessero, Benjamin Hylak, William A. Sorrentino, III, Stephen O. Lemay, Ieyuki Kawashima, Katherine W. Kolombatovich, Jeffrey S. Allen
  • Publication number: 20240329797
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: June 10, 2024
    Publication date: October 3, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Publication number: 20240302948
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: May 17, 2024
    Publication date: September 12, 2024
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
  • Publication number: 20240273838
    Abstract: Some embodiments of the disclosure are directed to an augmented representation of a first electronic device. A three-dimensional representation of a first electronic device (e.g., an augmented device) is presented using a second electronic device in a three-dimensional environment. The three-dimensional environment includes captured portions of a real-world environment, optionally including the first electronic device. The augmented representation of the first electronic device includes a virtual user interface element representing an extension of the physical display of the first electronic device. The representation of the augmented device includes a display of the augmented user interface. The augmented device is optionally configured to display some or all of the user interfaces operating on the first electronic device. Manipulations of and/or interactions with the augmented representation of the first electronic device are possible.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Alexis H. Palangie, Benjamin Hylak, Aaron M. Burns, Nathan Gitter
  • Publication number: 20240272722
    Abstract: Displaying and manipulating user interface elements in a computer-generated environment is disclosed. In some embodiments, a user is able to use a pinch and hold gesture to seamlessly and efficiently display and isolate a slider and then manipulate that slider without having to modify the pinch and hold gesture. In some embodiments, gaze data can be used to coarsely identify a focus element, and hand movement can then be used for fine identification of the focus element.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Nathan Gitter, Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
  • Patent number: 12039142
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: December 25, 2022
    Date of Patent: July 16, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
  • Publication number: 20240203276
    Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
    Type: Application
    Filed: April 6, 2022
    Publication date: June 20, 2024
    Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette