Patents by Inventor Alexis M. Burns

Alexis M. Burns is named on the following patent filings. The listing includes patent applications still pending before the United States Patent and Trademark Office (USPTO) as well as patents that have already been granted.

  • Patent number: 12164739
    Abstract: In some embodiments, an electronic device enhances interactions with virtual objects in a three-dimensional environment. In some embodiments, an electronic device enhances interactions with selectable user interface elements. In some embodiments, an electronic device enhances interactions with slider user interface elements. In some embodiments, an electronic device moves virtual objects in a three-dimensional environment and facilitates accessing actions associated with virtual objects.
    Type: Grant
    Filed: September 25, 2021
    Date of Patent: December 10, 2024
    Assignee: Apple Inc.
    Inventors: Israel Pastrana Vicente, Jonathan R. Dascola, Wesley M. Holder, Pol Pla I Conesa, Alexis Henri Palangie, Aaron Mackay Burns, William A. Sorrentino, III, Stephen O. Lemay, Christopher D. McKenzie, Shih-Sang Chiu, Benjamin Hunter Boesel, Jonathan Ravasz
  • Publication number: 20240329797
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Application
    Filed: June 10, 2024
    Publication date: October 3, 2024
    Inventors: Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER, Jordan A. CAZAMIAS, Alexis H. PALANGIE, James J. OWEN
  • Publication number: 20240312073
    Abstract: In one implementation, a method of resolving focal conflict in a computer-generated reality (CGR) environment is performed by a device including a processor, non-transitory memory, an image sensor, and a display. The method includes capturing, using the image sensor, an image of a scene including a real object in a particular direction at a first distance from the device. The method includes displaying, on the display, a CGR environment including a virtual object in the particular direction at a second distance from the device. In accordance with a determination that the second distance is less than the first distance, the CGR environment includes the virtual object overlaid on the scene. In accordance with a determination that the second distance is greater than the first distance, the CGR environment includes the virtual object with an obfuscation area that obfuscates at least a portion of the real object within the obfuscation area.
    Type: Application
    Filed: May 23, 2024
    Publication date: September 19, 2024
    Inventors: Alexis Henri Palangie, Shih Sang Chiu, Bruno M. Sommer, Connor Alexander Smith, Aaron Mackay Burns
  • Publication number: 20240302948
    Abstract: Methods for displaying user interfaces in a computer-generated environment provide for an efficient and intuitive user experience. In some embodiments, user interfaces can have different immersion levels. In some embodiments, a user interface can have a respective immersion level based on its location in the three-dimensional environment or distance from the user. In some embodiments, a user interface can have a respective immersion level based on the state of the user interface. In some embodiments, a user interface can switch from one immersion level to another in response to the user's interaction with the user interface.
    Type: Application
    Filed: May 17, 2024
    Publication date: September 12, 2024
    Inventors: Benjamin HYLAK, Alexis H. PALANGIE, Jordan A. CAZAMIAS, Nathan GITTER, Aaron M. BURNS
  • Publication number: 20240295882
    Abstract: The present disclosure provides methods, apparatuses, systems, and computer-readable mediums for classifying a region and an intensity of a contact. A method includes obtaining, from a plurality of acoustic sensors provided on an inner surface of a bumper of the apparatus, a combined acoustic signal, the combined acoustic signal being based on an input signal provided to the plurality of acoustic sensors, determining, using a trained machine learning model that has been trained with acoustic signals and corresponding position and intensity information of impacts on a plurality of regions of an outer surface of the bumper, the region of the contact on the bumper with respect to the plurality of regions and the intensity of the contact, based on the combined acoustic signal, and determining a motion of the apparatus based on the region of the contact, the intensity of the contact, and an operating mode of the apparatus.
    Type: Application
    Filed: December 15, 2023
    Publication date: September 5, 2024
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Adarsh K. KOSTA, Alexis M. Burns, Caleb Escobedo, Siddharth Rupavatharam, Richard E. Howard, Lawrence Jackel, Daewon Lee, Ibrahim Volken Isler
  • Publication number: 20240273838
    Abstract: Some embodiments of the disclosure are directed to an augmented representation of a first electronic device. A three-dimensional representation of a first electronic device (e.g., an augmented device) is presented using a second electronic device in a three-dimensional environment. The three-dimensional environment includes captured portions of a real-world environment, optionally including the first electronic device. The augmented representation of the first electronic device includes a virtual user interface element representing an extension of the physical display of the first electronic device. The representation of the augmented device includes a display of the augmented user interface. The augmented device is optionally configured to display some or all of the user interfaces operating on the first electronic device. Manipulations of and/or interactions with the augmented representation of the first electronic device are possible.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Alexis H. PALANGIE, Benjamin HYLAK, Aaron M. BURNS, Nathan GITTER
  • Publication number: 20240272722
    Abstract: Displaying and manipulating user interface elements in a computer-generated environment is disclosed. In some embodiments, a user is able to use a pinch and hold gesture to seamlessly and efficiently display and isolate a slider and then manipulate that slider without having to modify the pinch and hold gesture. In some embodiments, gaze data can be used to coarsely identify a focus element, and hand movement can then be used for fine identification of the focus element.
    Type: Application
    Filed: February 26, 2024
    Publication date: August 15, 2024
    Inventors: Nathan GITTER, Alexis H. PALANGIE, Aaron M. BURNS, Benjamin HYLAK
  • Patent number: 12039142
    Abstract: Devices, methods, and graphical interfaces for content applications displayed in an XR environment provide for an efficient and intuitive user experience. In some embodiments, a content application is displayed in a three-dimensional computer-generated environment. In some embodiments, different viewing modes and user interfaces are available for a content application in a three-dimensional computer-generated environment. In some embodiments, different interactions are available with content items displayed in the XR environment.
    Type: Grant
    Filed: December 25, 2022
    Date of Patent: July 16, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin Hylak, Aaron M. Burns, Nathan Gitter, Jordan A. Cazamias, Alexis H. Palangie
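The focal-conflict resolution described in publication 20240312073 reduces to a simple distance comparison along a shared viewing direction. The sketch below is a hypothetical illustration of that rule only, not code from the patent; the function and return labels are invented for clarity.

```python
def resolve_focal_conflict(real_distance: float, virtual_distance: float) -> str:
    """Choose a rendering strategy for a virtual object that lies in the
    same direction as a real object (illustrative names, not from the patent).

    real_distance    -- distance from the device to the real object
    virtual_distance -- distance from the device to the virtual object
    """
    if virtual_distance < real_distance:
        # Virtual object is nearer than the real one: overlay it on the
        # captured scene with no further adjustment.
        return "overlay"
    # Virtual object is farther than the real one: obfuscate at least a
    # portion of the real object within an obfuscation area, so the real
    # object does not visually compete with the more distant virtual one.
    return "obfuscate_real_object"
```

For example, a virtual window placed 3 m away behind a real wall 1 m away would trigger the obfuscation branch, while a virtual button floating 0.5 m in front of that wall would simply be overlaid.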
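Publication 20240272722 describes a two-stage targeting scheme: gaze data coarsely identifies candidate focus elements, then hand movement finely selects among them. A minimal sketch of that idea, under the assumption of 2D element positions and a fixed gaze radius (all names and parameters are illustrative, not from the patent):

```python
import math

def pick_focus_element(gaze_point, hand_offset, elements, gaze_radius=0.15):
    """Two-stage selection: coarse gaze filter, then fine hand refinement.

    gaze_point  -- (x, y) point the user is looking at
    hand_offset -- (dx, dy) displacement accumulated from hand movement
    elements    -- list of (name, (x, y)) candidate UI elements
    Returns the selected element's name, or None if gaze hits nothing.
    """
    # Coarse pass: gaze keeps only elements within the gaze radius.
    near = [(name, pos) for name, pos in elements
            if math.dist(pos, gaze_point) <= gaze_radius]
    if not near:
        return None
    # Fine pass: hand movement nudges a virtual cursor off the gaze point;
    # the nearby element closest to the adjusted point is selected.
    target = (gaze_point[0] + hand_offset[0], gaze_point[1] + hand_offset[1])
    return min(near, key=lambda e: math.dist(e[1], target))[0]
```

With two closely spaced buttons inside the gaze radius, a small hand movement toward one of them disambiguates the selection without requiring the user to re-aim their gaze.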