Patents by Inventor Benjamin Hylak

Benjamin Hylak has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11995301
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Grant
    Filed: March 10, 2023
    Date of Patent: May 28, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
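    Code sketch: The abstract ties a user interface's immersion level to its distance from the user and to the interface's state. The following minimal Swift sketch illustrates that selection logic; the level names, thresholds, and state model are assumptions for illustration, not Apple's implementation.

        // Hypothetical immersion model; all names and thresholds are illustrative.
        enum ImmersionLevel: Int, Comparable {
            case minimal = 0, mixed = 1, full = 2
            static func < (a: ImmersionLevel, b: ImmersionLevel) -> Bool { a.rawValue < b.rawValue }
        }

        enum UIState { case background, focused, mediaPlayback }

        /// Pick an immersion level from distance to the user and the UI's state,
        /// mirroring the abstract's "location or distance" and "state" factors.
        func immersionLevel(distanceToUser: Double, state: UIState) -> ImmersionLevel {
            // Nearby interfaces earn deeper immersion (cutoffs assumed).
            let byDistance: ImmersionLevel = distanceToUser < 1.0 ? .full
                                           : distanceToUser < 3.0 ? .mixed
                                           : .minimal
            // Certain states (e.g. full-screen media) force full immersion.
            let byState: ImmersionLevel = (state == .mediaPlayback) ? .full : .minimal
            return max(byDistance, byState)
        }

        print(immersionLevel(distanceToUser: 2.0, state: .focused))  // mixed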
  • Publication number: 20240152244
    Abstract: While displaying an application user interface, a device detects a first input to an input device provided on a housing of the device that includes the one or more display generation components. In response to detecting the first input, the device replaces display of at least a portion of the application user interface by displaying a home menu user interface via the one or more display generation components. While displaying the home menu user interface, the device detects a second input to the same input device and, in response, dismisses the home menu user interface.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Amy E. DeDonato, Israel Pastrana Vicente, Nathan Gitter, Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Vitalii Kramar, Benjamin Hylak, Sanket S. Dave, Deepak Iyer, Lauren A. Hastings, Madhur Ahuja, Natalia A. Fornshell, Christopher J. Romney, Joaquim Goncola Lobo Ferreira da Silva, Shawna M. Spain
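    Code sketch: The claimed interaction is a simple toggle: a press of a housing-mounted input summons the home menu over the running application, and a second press dismisses it. A hedged Swift sketch, with every name a placeholder:

        final class HomeMenuController {
            private(set) var homeMenuVisible = false

            /// Handle one press of the button on the device housing.
            func handleButtonPress() {
                if homeMenuVisible {
                    homeMenuVisible = false
                    print("Dismissing home menu; restoring the application UI")
                } else {
                    homeMenuVisible = true
                    print("Replacing part of the application UI with the home menu")
                }
            }
        }

        let controller = HomeMenuController()
        controller.handleButtonPress()  // first input: show the home menu
        controller.handleButtonPress()  // second input: dismiss it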
  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
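    Code sketch: The abstract describes a control that stays hidden until gaze directed at a specific portion of an object meets a criterion, after which a separate input activates it. A Swift sketch of that gating, with the dwell threshold and identifiers assumed:

        struct GazeSample { let targetID: String; let duration: Double } // seconds

        final class GazeRevealController {
            let revealDwell = 0.4  // required gaze dwell in seconds (assumed)
            private(set) var visibleControls: Set<String> = []

            /// Reveal the control tied to `portionID` once gaze meets the first criteria.
            func process(_ gaze: GazeSample, controlFor portionID: String) {
                guard gaze.targetID == portionID, gaze.duration >= revealDwell else { return }
                visibleControls.insert("control-for-\(portionID)")
            }

            /// A later user input directed at a revealed control performs its operation.
            func activate(_ controlID: String) {
                guard visibleControls.contains(controlID) else { return }
                print("Performing operation for \(controlID)")
            }
        }

        let gazeUI = GazeRevealController()
        gazeUI.process(GazeSample(targetID: "title-bar", duration: 0.5), controlFor: "title-bar")
        gazeUI.activate("control-for-title-bar")  // Performing operation for control-for-title-bar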
  • Patent number: 11960657
    Abstract: A method includes, while displaying a computer-generated object at a first position within an environment, obtaining extremity tracking data from an extremity tracker. The first position is outside of a drop region that is viewable using the display. The method includes moving the computer-generated object from the first position to a second position within the environment based on the extremity tracking data. The method includes, in response to determining that the second position satisfies a proximity threshold with respect to the drop region, detecting an input that is associated with a spatial region of the environment. The method includes moving the computer-generated object from the second position to a third position that is within the drop region, based on determining that the spatial region satisfies a focus criterion associated with the drop region.
    Type: Grant
    Filed: March 21, 2023
    Date of Patent: April 16, 2024
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Jordan A. Cazamias, Nicolai Georg
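    Code sketch: The claim combines two conditions before an object snaps into a drop region: the dragged object must satisfy a proximity threshold, and a detected input (such as gaze) must satisfy the region's focus criterion. A self-contained Swift sketch; the geometry types and thresholds are invented for illustration:

        struct Point { var x, y, z: Double }

        func distance(_ a: Point, _ b: Point) -> Double {
            let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
            return (dx*dx + dy*dy + dz*dz).squareRoot()
        }

        struct DropRegion {
            var center: Point
            var proximityThreshold = 0.25  // metres (assumed)
            func focusCriterionMet(gazeTarget: Point) -> Bool {
                distance(gazeTarget, center) < 0.1  // gaze near the region (assumed)
            }
        }

        /// Resolve the object's final position when the drag ends.
        func resolveDrop(object: Point, gaze: Point, region: DropRegion) -> Point {
            let closeEnough = distance(object, region.center) < region.proximityThreshold
            // Both conditions from the claim: proximity AND focus.
            return (closeEnough && region.focusCriterionMet(gazeTarget: gaze))
                ? region.center  // third position: inside the drop region
                : object         // otherwise remain at the second position
        }

        let region = DropRegion(center: Point(x: 0, y: 1, z: -1))
        print(resolveDrop(object: Point(x: 0.1, y: 1, z: -1),
                          gaze: Point(x: 0.05, y: 1, z: -1),
                          region: region))  // snapped to the region's center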
  • Publication number: 20240103682
    Abstract: A computer system displays a first application user interface at a first location in a three-dimensional environment. While displaying the first application user interface at the first location, the computer system detects, at a first time, a first input corresponding to a request to close the first application user interface. In response to detecting the first input, the computer system closes the first application user interface, including ceasing to display it in the three-dimensional environment, and, in accordance with a determination that respective criteria are met, displays a home menu user interface at a respective home menu position that is determined based on the first location of the first application user interface in the three-dimensional environment.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Stephen O. Lemay, Zoey C. Taylor, Benjamin Hylak, William A. Sorrentino, III, Jonathan Ravasz, Peter D. Anton, Michael J. Rockwell
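    Code sketch: The placement rule in the abstract derives the home menu's position from the location of the window that was just closed. A minimal Swift sketch of one such rule; the offset and types are assumptions:

        struct Position { var x, y, z: Double }

        /// Place the home menu where the closed window was, nudged toward the
        /// viewer so it does not land inside other content (offset assumed).
        func homeMenuPosition(forClosedWindowAt window: Position) -> Position {
            Position(x: window.x, y: window.y, z: window.z + 0.15)
        }

        let closed = Position(x: 0.2, y: 1.4, z: -1.0)
        print(homeMenuPosition(forClosedWindowAt: closed))  // same x/y, nearer z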
  • Publication number: 20240104843
    Abstract: In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by reducing visual prominence of one or more portions of the virtual object. In some embodiments, a computer system adjusts the visibility of one or more virtual objects in a three-dimensional environment by applying a visual effect to the one or more virtual objects in response to detecting one or more portions of a user. In some embodiments, a computer system modifies visual prominence in accordance with a level of engagement with a virtual object.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Christopher D. McKenzie, Benjamin Hylak, Conner J. Brooks, Adrian P. Lindberg, Bryce L. Schmidtchen
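    Code sketch: The first embodiment reduces the visual prominence of only the conflicting portions of a virtual object. A hedged Swift sketch that maps penetration depth into physical geometry to an opacity; the falloff and floor are invented values:

        struct ObjectPortion { let name: String; let penetrationDepth: Double } // metres

        /// Fade a portion in proportion to how far it penetrates physical geometry.
        func opacity(for portion: ObjectPortion) -> Double {
            guard portion.penetrationDepth > 0 else { return 1.0 }  // no depth conflict
            // Linear falloff with a visibility floor (both assumed).
            return max(0.2, 1.0 - portion.penetrationDepth / 0.1)
        }

        let corner = ObjectPortion(name: "lower-left corner", penetrationDepth: 0.04)
        print(opacity(for: corner))  // 0.6: the conflicted corner is drawn less prominently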
  • Publication number: 20240104860
    Abstract: In response to detecting a first input on a rotatable input mechanism, in accordance with a determination that the first input is a first type of input, a computer system changes an immersion level associated with display of an extended reality (XR) environment generated by the display generation component to a first immersion level in which display of the XR environment concurrently includes virtual content from an application and a passthrough portion of a physical environment of the computer system. In accordance with a determination that the first input is a second type of input, the computer system performs an operation different from changing the immersion level associated with display of the XR environment.
    Type: Application
    Filed: September 18, 2023
    Publication date: March 28, 2024
    Inventors: Christopher D. McKenzie, Stephen O. Lemay, Zoey C. Taylor, Vitalii Kramar, Benjamin Hylak
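    Code sketch: The abstract dispatches on the type of input received at the rotatable mechanism: one type adjusts immersion, the other performs an unrelated operation. A Swift sketch of that dispatch, with names and scaling invented:

        enum CrownInput { case rotation(degrees: Double), press }

        final class ImmersionController {
            private(set) var immersion = 0.5  // 0 = full passthrough, 1 = fully virtual

            func handle(_ input: CrownInput) {
                switch input {
                case .rotation(let degrees):
                    // First input type: blend virtual content with passthrough.
                    immersion = min(1, max(0, immersion + degrees / 360))
                    print("Immersion now \(immersion)")
                case .press:
                    // Second input type: an operation other than changing immersion.
                    print("Performing a different operation (e.g. recentering)")
                }
            }
        }

        let crown = ImmersionController()
        crown.handle(.rotation(degrees: 90))  // Immersion now 0.75
        crown.handle(.press)                  // different operation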
  • Publication number: 20240094862
    Abstract: A computer system displays, in a three-dimensional environment, a computer-generated object; detects that a user's attention is directed to the object; and in response, displays a virtual shadow for the object with a first appearance, including displaying the shadow with a first value for a first visual property, while maintaining a pose of the object relative to the three-dimensional environment. While continuing to display the object in the three-dimensional environment, the computer system detects that the user's attention has ceased to be directed to the object; and in response, displays the shadow for the object with a second appearance that is different from the first appearance, while maintaining the pose of the object relative to the three-dimensional environment. Displaying the shadow for the object with the second appearance includes displaying the shadow with a second value for the first visual property. The second value is different from the first value.
    Type: Application
    Filed: September 22, 2022
    Publication date: March 21, 2024
    Inventors: James M. Dessero, Miquel Rodriguez Estany, Benjamin Hylak
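    Code sketch: The shadow's appearance switches between two values of one visual property as the user's attention arrives at and leaves the object, while the object's pose never changes. A minimal Swift sketch; the property and its two values are assumptions:

        struct Shadow { var opacity: Double }

        /// First appearance while attended, second appearance otherwise;
        /// the owning object's pose is untouched. Values are assumed.
        func updateShadow(_ shadow: inout Shadow, attentionOnObject: Bool) {
            shadow.opacity = attentionOnObject ? 0.6 : 0.25
        }

        var shadow = Shadow(opacity: 0.0)
        updateShadow(&shadow, attentionOnObject: true)   // 0.6 while attended
        updateShadow(&shadow, attentionOnObject: false)  // 0.25 after attention leaves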
  • Publication number: 20240086031
    Abstract: Methods for displaying and organizing user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can be grouped together into a container. In some embodiments, user interfaces can be added to a container, removed from a container, or moved from one location in the container to another location. In some embodiments, a visual indication is displayed before a user interface is added to a container. In some embodiments, a user interface can replace an existing user interface in a container. In some embodiments, when moving a user interface in a computer-generated environment, the transparency of a user interface that is obscured can be modified.
    Type: Application
    Filed: November 20, 2023
    Publication date: March 14, 2024
    Inventors: Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
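    Code sketch: The container described here supports adding, removing, reordering, and replacing member user interfaces. A hedged Swift sketch of such a container; the API surface is invented:

        struct UIContainer {
            private(set) var windows: [String] = []

            mutating func add(_ window: String, at index: Int? = nil) {
                windows.insert(window, at: index ?? windows.count)
            }
            mutating func remove(_ window: String) {
                windows.removeAll { $0 == window }
            }
            /// Replace an existing member, as when a new UI takes its slot.
            mutating func replace(_ old: String, with new: String) {
                guard let i = windows.firstIndex(of: old) else { return }
                windows[i] = new
            }
        }

        var container = UIContainer()
        container.add("Safari"); container.add("Notes")
        container.replace("Notes", with: "Music")
        print(container.windows)  // ["Safari", "Music"]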
  • Publication number: 20240087256
    Abstract: In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by changing visual properties of one or more portions of the virtual object. In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by displaying the virtual object in a virtual environment within the three-dimensional environment. In some embodiments, a computer system facilitates depth conflict mitigation for a virtual object that is in contact with one or more physical objects in a three-dimensional environment by variably changing visual properties of one or more portions of the virtual object and/or by variably displaying the virtual object in a virtual environment based on one or more characteristics of the depth conflict.
    Type: Application
    Filed: September 14, 2023
    Publication date: March 14, 2024
    Inventors: Benjamin Hylak, William A. Sorrentino, III, Christopher D. McKenzie, James J. Owen, Zoey C. Taylor, Miquel Estany Rodriguez
  • Publication number: 20240045579
    Abstract: A representation displayed in a three-dimensional environment may be selected with different types of selection inputs. When a representation displayed in the three-dimensional environment is selected, an application corresponding to the representation may be launched in the three-dimensional environment in accordance with the type of selection received. In such embodiments, when the representation is selected with a first type of selection, the application corresponding to the selected representation may be launched in a first manner, and when the representation is selected with a second type of selection, the application corresponding to the selected representation may be launched in a second manner. In some embodiments, the manner in which the application is launched may determine whether one or more previously launched applications continue to be displayed or cease to be displayed in the three-dimensional environment.
    Type: Application
    Filed: December 21, 2021
    Publication date: February 8, 2024
    Inventors: Nathan Gitter, Aaron M. Burns, Benjamin Hylak, Jonathan R. Dascola, Alexis H. Palangie
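    Code sketch: The selection type decides the launch manner, and the launch manner decides whether previously launched applications remain displayed. A Swift sketch of the branch; gesture names are placeholders:

        enum SelectionType { case tap, longPinch }

        func launch(app: String, selection: SelectionType, openApps: inout [String]) {
            switch selection {
            case .tap:
                // First manner: open alongside the existing applications.
                openApps.append(app)
            case .longPinch:
                // Second manner: previously launched apps cease to be displayed.
                openApps = [app]
            }
        }

        var open = ["Mail"]
        launch(app: "Photos", selection: .tap, openApps: &open)         // ["Mail", "Photos"]
        launch(app: "Freeform", selection: .longPinch, openApps: &open) // ["Freeform"]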
  • Publication number: 20230350537
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: December 25, 2022
    Publication date: November 2, 2023
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, James J. Owen
  • Publication number: 20230333651
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, an extremity tracking system, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying a computer-generated object on the display. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining a multi-finger gesture based on extremity tracking data from the extremity tracking system and the finger manipulation data. The method includes registering an engagement event with respect to the computer-generated object according to the multi-finger gesture.
    Type: Application
    Filed: March 20, 2023
    Publication date: October 19, 2023
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Nicolai Georg
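    Code sketch: The claim fuses two independent signals, camera-based extremity tracking and finger-wearable sensor data, to recognize a multi-finger gesture. A minimal Swift sketch where neither source alone suffices; the data shapes are invented:

        struct ExtremityData { let extendedFingerCount: Int }    // from cameras
        struct FingerDeviceData { let ringFingerPinched: Bool }  // from the wearable

        enum Gesture { case twoFingerPinch, none }

        /// Combine both sources, as the claim requires, before registering
        /// an engagement event with the computer-generated object.
        func recognizeGesture(tracking: ExtremityData, ring: FingerDeviceData) -> Gesture {
            (tracking.extendedFingerCount >= 2 && ring.ringFingerPinched)
                ? .twoFingerPinch : .none
        }

        let gesture = recognizeGesture(
            tracking: ExtremityData(extendedFingerCount: 2),
            ring: FingerDeviceData(ringFingerPinched: true))
        print(gesture)  // twoFingerPinch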
  • Publication number: 20230333650
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, and a communication interface provided to communicate with a finger-wearable device. The method includes displaying first instructional content that is associated with a first gesture. The first instructional content includes a first object. The method includes determining an engagement score that characterizes a level of user engagement with respect to the first object. The method includes obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes determining that the finger-wearable device performs the first gesture based on a function of the finger manipulation data.
    Type: Application
    Filed: February 27, 2023
    Publication date: October 19, 2023
    Inventors: Benjamin Hylak, Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin R. Blachnitzky, Nicolai Georg
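    Code sketch: The method computes an engagement score for the instructional object and only then evaluates the wearable's data for the taught gesture. A Swift sketch of one plausible gate; the signals, weights, and threshold are entirely assumed:

        struct EngagementSignals { let gazeOnObject: Double; let headFacingObject: Double } // 0...1

        func engagementScore(_ s: EngagementSignals) -> Double {
            0.7 * s.gazeOnObject + 0.3 * s.headFacingObject  // weighting assumed
        }

        /// Evaluate finger-wearable data for the first gesture only while engaged.
        func shouldEvaluateGesture(signals: EngagementSignals) -> Bool {
            engagementScore(signals) >= 0.5  // threshold assumed
        }

        let signals = EngagementSignals(gazeOnObject: 0.8, headFacingObject: 0.4)
        print(shouldEvaluateGesture(signals: signals))  // true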
  • Publication number: 20230325047
    Abstract: A method includes displaying a plurality of computer-generated objects, including a first computer-generated object at a first position within an environment and a second computer-generated object at a second position within the environment. The first computer-generated object corresponds to a first user interface element that includes a first set of controls for modifying a content item. The method includes, while displaying the plurality of computer-generated objects, obtaining extremity tracking data. The method includes moving the first computer-generated object from the first position to a third position within the environment based on the extremity tracking data. The method includes, in accordance with a determination that the third position satisfies a proximity threshold with respect to the second position, merging the first computer-generated object with the second computer-generated object in order to generate a third computer-generated object for modifying the content item.
    Type: Application
    Filed: March 15, 2023
    Publication date: October 12, 2023
    Inventors: Nicolai Georg, Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky
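    Code sketch: Dragging one control object within a proximity threshold of another merges them into a combined third object. A hedged Swift sketch in one dimension; the palette model and threshold are invented:

        struct ToolPalette { var name: String; var controls: [String]; var x: Double }

        /// Merge when the dragged palette lands within the threshold of another.
        func mergeIfClose(_ dragged: ToolPalette, onto other: ToolPalette,
                          threshold: Double = 0.2) -> ToolPalette? {
            guard abs(dragged.x - other.x) <= threshold else { return nil }
            return ToolPalette(name: "\(other.name)+\(dragged.name)",
                               controls: other.controls + dragged.controls,
                               x: other.x)
        }

        let brushes = ToolPalette(name: "Brushes", controls: ["size", "shape"], x: 0.5)
        let colors  = ToolPalette(name: "Colors",  controls: ["hue"], x: 0.45)
        print(mergeIfClose(colors, onto: brushes)?.controls ?? [])  // ["size", "shape", "hue"]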
  • Publication number: 20230325004
    Abstract: Methods for interacting with objects and user interface elements in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, a user can directly or indirectly interact with objects. In some embodiments, while performing an indirect manipulation, manipulations of virtual objects are scaled. In some embodiments, while performing a direct manipulation, manipulations of virtual objects are not scaled. In some embodiments, an object can be reconfigured from an indirect manipulation mode into a direct manipulation mode by moving the object to a respective position in the three-dimensional environment in response to a respective gesture.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Aaron M. Burns, Alexis H. Palangie, Nathan Gitter, Nicolai Georg, Benjamin R. Blachnitzky, Arun Rakesh Yoganandan, Benjamin Hylak, Adam G. Poulos
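    Code sketch: The distinction claimed is that indirect manipulation is scaled while direct manipulation maps hand motion 1:1. A minimal Swift sketch; the distance-based scale factor is an assumption:

        enum ManipulationMode { case direct, indirect }

        /// Translate a hand movement into an object movement.
        func objectDelta(handDelta: Double, mode: ManipulationMode,
                         objectDistance: Double) -> Double {
            switch mode {
            case .direct:
                return handDelta                           // unscaled, 1:1
            case .indirect:
                return handDelta * max(1, objectDistance)  // scaled with range (assumed)
            }
        }

        print(objectDelta(handDelta: 0.25, mode: .direct, objectDistance: 4))    // 0.25
        print(objectDelta(handDelta: 0.25, mode: .indirect, objectDistance: 4))  // 1.0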
  • Publication number: 20230325003
    Abstract: Methods for displaying selectable options in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, one or more selectable options are displayed in a three-dimensional computer-generated environment in accordance with a determination that one or more criteria have been satisfied, including a criterion that a hand of the user is oriented in a predetermined manner with respect to an electronic device. In some embodiments, a user is able to perform one-handed actuation of a selectable option with a plurality of user inputs and/or gestures that satisfy one or more activation criteria.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 12, 2023
    Inventors: Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie, Aaron M. Burns, Benjamin Hylak
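    Code sketch: The gating criterion is that the user's hand is oriented in a predetermined way with respect to the device. A Swift sketch of one plausible test, a palm-facing check; the dot-product representation and threshold are assumed:

        struct HandPose { let palmDot: Double } // dot(palm normal, toward-device); 1 = facing

        /// Show the selectable options only while the orientation criterion holds.
        func shouldShowOptions(pose: HandPose, threshold: Double = 0.8) -> Bool {
            pose.palmDot >= threshold
        }

        print(shouldShowOptions(pose: HandPose(palmDot: 0.9)))  // true: display options
        print(shouldShowOptions(pose: HandPose(palmDot: 0.1)))  // false: hide them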
  • Publication number: 20230315270
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: March 10, 2023
    Publication date: October 5, 2023
    Inventors: Benjamin Hylak, Alexis H. Palangie, Jordan A. Cazamias, Nathan Gitter, Aaron M. Burns
  • Publication number: 20230297168
    Abstract: A method is performed at an electronic device with one or more processors, a non-transitory memory, a display, and a communication interface provided to communicate with a finger-wearable device. The method includes, while displaying a plurality of content items on the display, obtaining finger manipulation data from the finger-wearable device via the communication interface. The method includes selecting a first one of the plurality of content items based on a first portion of the finger manipulation data in combination with satisfaction of a proximity threshold. The first one of the plurality of content items is associated with a first dimensional representation. The method includes changing the first one of the plurality of content items from the first dimensional representation to a second dimensional representation based on a second portion of the finger manipulation data.
    Type: Application
    Filed: March 15, 2023
    Publication date: September 21, 2023
    Inventors: Nicolai Georg, Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky
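    Code sketch: Two portions of the wearable's data drive two stages: the first, combined with a proximity threshold, selects an item; the second changes its dimensional representation (for example, 2D preview to 3D model). A hedged Swift sketch with invented names and thresholds:

        struct ContentItem { let title: String; var is3D = false }

        final class ShelfController {
            var items: [ContentItem]
            private(set) var selectedIndex: Int?
            init(items: [ContentItem]) { self.items = items }

            /// First portion of the finger data plus the proximity threshold.
            func select(index: Int, pinchDetected: Bool, fingerDistance: Double) {
                guard pinchDetected, fingerDistance < 0.05 else { return }  // 5 cm assumed
                selectedIndex = index
            }
            /// Second portion of the data changes the dimensional representation.
            func promoteSelection(liftDetected: Bool) {
                guard liftDetected, let i = selectedIndex else { return }
                items[i].is3D = true
            }
        }

        let shelf = ShelfController(items: [ContentItem(title: "Globe")])
        shelf.select(index: 0, pinchDetected: true, fingerDistance: 0.02)
        shelf.promoteSelection(liftDetected: true)
        print(shelf.items[0].is3D)  // true: now shown as a 3D representation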
  • Publication number: 20230297172
    Abstract: A method includes, while displaying a computer-generated object at a first position within an environment, obtaining extremity tracking data from an extremity tracker. The first position is outside of a drop region that is viewable using the display. The method includes moving the computer-generated object from the first position to a second position within the environment based on the extremity tracking data. The method includes, in response to determining that the second position satisfies a proximity threshold with respect to the drop region, detecting an input that is associated with a spatial region of the environment. The method includes moving the computer-generated object from the second position to a third position that is within the drop region, based on determining that the spatial region satisfies a focus criterion associated with the drop region.
    Type: Application
    Filed: March 21, 2023
    Publication date: September 21, 2023
    Inventors: Aaron M. Burns, Adam G. Poulos, Arun Rakesh Yoganandan, Benjamin Hylak, Benjamin R. Blachnitzky, Jordan A. Cazamias, Nicolai Georg