Patents by Inventor Benjamin Hylak
Benjamin Hylak has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 12373081
  Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
  Type: Grant
  Filed: June 10, 2024
  Date of Patent: July 29, 2025
  Assignee: Apple Inc.
  Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
- Patent number: 12321563
  Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
  Type: Grant
  Filed: November 20, 2023
  Date of Patent: June 3, 2025
  Assignee: Apple Inc.
  Inventors: Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
- Publication number: 20250165070
  Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
  Type: Application
  Filed: January 17, 2025
  Publication date: May 22, 2025
  Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
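The abstract above describes fusing two data sources — extremity-tracking data and finger-wearable sensor data — to determine a multi-finger gesture. As a rough illustration only (all type and function names below are hypothetical, not Apple's implementation), the fusion step might look like:

```python
# Illustrative sketch: combine extremity-tracking data (coarse hand pose)
# with finger-wearable data (confirmed contact) to classify a gesture.
from dataclasses import dataclass

@dataclass
class ExtremitySample:
    # Normalized thumb-to-finger distances from the tracking system (meters).
    index_thumb_distance: float
    middle_thumb_distance: float

@dataclass
class FingerDeviceSample:
    # Contact state and pressure reported by the finger-wearable device.
    contact: bool
    pressure: float

def determine_multi_finger_gesture(tracking: ExtremitySample,
                                   wearable: FingerDeviceSample) -> str:
    """Tracking supplies which fingers are pinched; the wearable
    confirms that contact actually occurred."""
    pinch_index = tracking.index_thumb_distance < 0.02
    pinch_middle = tracking.middle_thumb_distance < 0.02
    if wearable.contact and pinch_index and pinch_middle:
        return "two-finger-pinch"
    if wearable.contact and pinch_index:
        return "single-pinch"
    return "none"
```

The point of combining the sources is that camera-based tracking alone can report near-touches as touches; the wearable's contact signal disambiguates.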
- Patent number: 12299267
  Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
  Type: Grant
  Filed: May 17, 2024
  Date of Patent: May 13, 2025
  Assignee: Apple Inc.
  Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
- Publication number: 20250118038
  Abstract: In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
  Type: Application
  Filed: December 19, 2024
  Publication date: April 10, 2025
  Inventors: William A. Sorrentino, III, Benjamin Hylak, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Miquel Estany Rodriguez, James J. Owen
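A recurring mechanism in this abstract is reducing a virtual object's visual prominence once its overlap with another object crosses a threshold. As a hedged sketch (the rectangles, threshold value, and fade curve below are illustrative assumptions, not the patented method), the idea can be expressed as:

```python
# Illustrative sketch: measure how much of one object another covers,
# and fade opacity only after a threshold amount of overlap.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def overlap_fraction(a: Rect, b: Rect) -> float:
    """Fraction of rect a's area covered by rect b."""
    ix = max(0.0, min(a.x + a.w, b.x + b.w) - max(a.x, b.x))
    iy = max(0.0, min(a.y + a.h, b.y + b.h) - max(a.y, b.y))
    return (ix * iy) / (a.w * a.h)

def adjusted_opacity(base: float, frac: float, threshold: float = 0.25) -> float:
    """Leave opacity unchanged below the threshold; fade linearly above it."""
    if frac < threshold:
        return base
    return base * (1.0 - (frac - threshold) / (1.0 - threshold))
```

Gating the fade behind a threshold avoids flicker when objects merely graze each other during movement.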
- Publication number: 20250093964
  Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with the determination that one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
  Type: Application
  Filed: December 5, 2024
  Publication date: March 20, 2025
  Inventors: Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
- Publication number: 20250078420
  Abstract: In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
  Type: Application
  Filed: June 4, 2024
  Publication date: March 6, 2025
  Inventors: James M. Dessero, Benjamin Hylak, Christopher D. McKenzie, Jeffrey S. Allen, William A. Sorrentino, III, Katherine W. Kolombatovich, Matthew G. Plec
- Patent number: 12242668
  Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
  Type: Grant
  Filed: March 20, 2023
  Date of Patent: March 4, 2025
  Assignee: Apple Inc.
  Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
- Publication number: 20250028423
  Abstract: In some embodiments, an electronic device changes the immersion level of a virtual environment and/or spatial effect in a three-dimensional environment based on the geometry of the physical environment around the device. In some embodiments, an electronic device modifies the virtual environment and/or spatial effect in response to detecting a movement of the device. In some embodiments, an electronic device moves a user interface of an application into and/or out of a virtual environment. In some embodiments, an electronic device selectively changes the display of a simulated environment and/or atmospheric effect in a three-dimensional environment based on movement of an object associated with a viewpoint of a user. In some embodiments, an electronic device provides feedback to a user in response to a user moving a virtual object to and/or into a simulated environment.
  Type: Application
  Filed: October 7, 2024
  Publication date: January 23, 2025
  Inventors: James M. Dessero, Benjamin Hylak, William A. Sorrentino, III, Stephen O. Lemay, Katherine W. Kolombatovich
- Publication number: 20240420435
  Abstract: In some embodiments, a computer system facilitates movement, including rotation, of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates movement of a virtual object in a three-dimensional environment toward a movement boundary. In some embodiments, a computer system facilitates dynamic scaling of a virtual object in a three-dimensional environment based on movement of the virtual object in the three-dimensional environment. In some embodiments, a computer system facilitates inertial movement of a virtual object in a three-dimensional environment. In some embodiments, a computer system facilitates converging offsets between a portion of a user and a virtual object. In some embodiments, a computer system facilitates rotation of a volumetric virtual object in a three-dimensional environment.
  Type: Application
  Filed: May 17, 2024
  Publication date: December 19, 2024
  Inventors: Nathan Gitter, Benjamin Hylak, Jonathan Ravasz, Christopher D. McKenzie, Nahckjoon Kim, Israel Pastrana Vicente, Zoey C. Taylor, Benjamin H. Boesel
- Publication number: 20240411421
  Abstract: A computer system detects an input to invoke a home menu user interface. In response to detecting the input, the computer system displays, via one or more display generation components, the home menu user interface in a three-dimensional environment, including: if a viewpoint of a user in the three-dimensional environment had a first elevation relative to a reference plane in the three-dimensional environment, displaying the home menu user interface at a first height in the three-dimensional environment; and, if the viewpoint of the user in the three-dimensional environment had a second elevation relative to the reference plane in the three-dimensional environment, the second elevation being different from the first elevation, displaying the home menu user interface at a second height in the three-dimensional environment, the second height being different from the first height.
  Type: Application
  Filed: May 14, 2024
  Publication date: December 12, 2024
  Inventors: Israel Pastrana Vicente, Amy E. DeDonato, Marcos Alonso Ruiz, Lee S. Broughton, Richard D. Lyons, William A. Sorrentino, III, Stephen O. Lemay, James J. Owen, Miquel Estany Rodriguez, Jesse Chand, Jonathan R. Dascola, Christian Schnorr, Zoey C. Taylor, Jonathan Ravasz, Harlan B. Haskins, Vinay Chawda, Benjamin H. Boesel, Ieyuki Kawashima, Christopher D. McKenzie, Benjamin Hylak, Nathan Gitter, Nahckjoon Kim, Owen Monsma, Matan Stauber, Danielle M. Price
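The core relationship in this abstract — menu height as a function of the viewpoint's elevation above a reference plane — can be sketched in a few lines. The function name and the fixed offset below are purely illustrative assumptions; the claim only requires that different elevations yield different heights:

```python
# Illustrative sketch: derive home-menu placement height from the user's
# viewpoint elevation above a reference plane (e.g., the floor).
def home_menu_height(viewpoint_elevation: float, offset: float = -0.15) -> float:
    """A seated user (lower elevation) gets a lower menu than a standing
    user; here the menu sits a fixed offset below eye level."""
    return viewpoint_elevation + offset
```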
- Publication number: 20240385858
  Abstract: In some embodiments, a computer system changes a visual appearance of immersive mixed reality (MR) content in a three-dimensional environment in accordance with a respective type of input. In some embodiments, a computer system facilitates display of immersive MR content in a three-dimensional environment. In some embodiments, a computer system displays content of a respective application in a first mode of operation that includes spatially distributing the content throughout an available display area of a three-dimensional environment and displays an option to cease display of the content in the first mode of operation.
  Type: Application
  Filed: May 17, 2024
  Publication date: November 21, 2024
  Inventors: Christopher D. McKenzie, Benjamin Hylak, Karen El Asmar, Nathan Gitter, Wesley M. Holder, Zoey C. Taylor
- Publication number: 20240361835
  Abstract: In some embodiments, a computer system selectively recenters virtual content to a viewpoint of a user. In some embodiments, a computer system selectively recenters virtual content and/or gathers the virtual content. In some embodiments, a computer system presents virtual content with a first spatial arrangement when an input corresponds to a recentering operation and presents the virtual content with a second spatial arrangement when the input corresponds to a gathering operation.
  Type: Application
  Filed: April 24, 2024
  Publication date: October 31, 2024
  Inventors: Benjamin Hylak, Benjamin H. Boesel, Danielle M. Price, Stephen O. Lemay, Zoey C. Taylor
- Patent number: 12112009
  Abstract: In some embodiments, an electronic device changes the immersion level of a virtual environment and/or spatial effect in a three-dimensional environment based on the geometry of the physical environment around the device. In some embodiments, an electronic device modifies the virtual environment and/or spatial effect in response to detecting a movement of the device. In some embodiments, an electronic device moves a user interface of an application into and/or out of a virtual environment. In some embodiments, an electronic device selectively changes the display of a simulated environment and/or atmospheric effect in a three-dimensional environment based on movement of an object associated with a viewpoint of a user. In some embodiments, an electronic device provides feedback to a user in response to a user moving a virtual object to and/or into a simulated environment.
  Type: Grant
  Filed: April 13, 2022
  Date of Patent: October 8, 2024
  Assignee: Apple Inc.
  Inventors: James M. Dessero, Benjamin Hylak, William A. Sorrentino, III, Stephen O. Lemay, Ieyuki Kawashima, Katherine W. Kolombatovich, Jeffrey S. Allen
- Publication number: 20240329797
  Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
  Type: Application
  Filed: June 10, 2024
  Publication date: October 3, 2024
  Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
- Publication number: 20240302948
  Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
  Type: Application
  Filed: May 17, 2024
  Publication date: September 12, 2024
  Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
- Publication number: 20240273838
  Abstract: Some embodiments of the disclosure are directed to an augmented representation of a first electronic device. A three-dimensional representation of a first electronic device (e.g., an augmented device) is presented using a second electronic device in a three-dimensional environment. The three-dimensional environment includes captured portions of a real-world environment, optionally including the first electronic device. The augmented representation of the first electronic device includes a virtual user interface element representing an extension of the physical display of the first electronic device. The representation of the augmented device includes a display of the augmented user interface. The augmented device is optionally configured to display some or all of the user interfaces operating on the first electronic device. Manipulations of and/or interactions with the augmented representation of the first electronic device are possible.
  Type: Application
  Filed: February 26, 2024
  Publication date: August 15, 2024
  Inventors: Alexis H. Palangie, Benjamin Hylak, Aaron M. Burns, Nathan Gitter
- Publication number: 20240272722
  Abstract: Displaying and manipulating user interface elements in a computer-generated environment is disclosed. In some embodiments, a user is able to use a pinch and hold gesture to seamlessly and efficiently display and isolate a slider and then manipulate that slider without having to modify the pinch and hold gesture. In some embodiments, gaze data can be used to coarsely identify a focus element, and hand movement can then be used for fine identification of the focus element.
  Type: Application
  Filed: February 26, 2024
  Publication date: August 15, 2024
  Inventors: Nathan Gitter, Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
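This abstract describes a two-stage targeting scheme: gaze for coarse selection, then hand movement for fine control while the pinch is held. As an illustrative sketch only (the element representation, sensitivity constant, and function names are assumptions, not the patented implementation):

```python
# Illustrative sketch: gaze coarsely picks the focus element, then
# horizontal hand movement fine-tunes the isolated slider's value.
def pick_focus(gaze_point, elements):
    """Coarse stage: element whose center is nearest the gaze point."""
    return min(elements, key=lambda e: (e["cx"] - gaze_point[0]) ** 2 +
                                       (e["cy"] - gaze_point[1]) ** 2)

def update_slider(value, hand_dx, sensitivity=0.5):
    """Fine stage: map hand displacement to a clamped slider value."""
    return max(0.0, min(1.0, value + hand_dx * sensitivity))
```

Splitting coarse and fine input this way plays to each modality's strength: gaze is fast but jittery, while small hand motions are precise.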
- Patent number: 12039142
  Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
  Type: Grant
  Filed: December 25, 2022
  Date of Patent: July 16, 2024
  Assignee: Apple Inc.
  Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
- Publication number: 20240203276
  Abstract: In one implementation, a method of providing audience feedback during a performance of a presentation is performed at a device including a display, one or more processors, and non-transitory memory. The method includes displaying, on the display in association with an environment including a plurality of audience members, one or more slides of a presentation. The method includes, while displaying the one or more slides of the presentation, obtaining data regarding the plurality of audience members. The method includes displaying, on the display in association with the environment, one or more virtual objects based on the data regarding the plurality of audience members.
  Type: Application
  Filed: April 6, 2022
  Publication date: June 20, 2024
  Inventors: Benjamin Hylak, Aaron M. Burns, Grant H. Mulliken, Mary A. Pyc, Nathan Gitter, Pau Sastre Miguel, Steven A. Marchette