Patents by Inventor Tyler R. CALDERONE

Tyler R. CALDERONE has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250149038
    Abstract: This relates generally to intelligent automated assistants and, more specifically, to providing a hands-free notification management system. An example method includes displaying, by an electronic device, one or more notifications. In response to displaying the one or more notifications, the method includes detecting a visual interaction of a user with the one or more notifications, identifying a notification from the one or more notifications based on the visual interaction, determining a time interval of the visual interaction with the notification, determining one or more actions associated with the notification based on the time interval of the visual interaction with the notification, and performing the one or more actions.
    Type: Application
    Filed: January 8, 2025
    Publication date: May 8, 2025
    Inventors: Andrea Valentina SIMES, Daniel W. LOO, Felicia W. EDWARDS, Harry M. SIMMONDS, Tyler R. CALDERONE, Kevin D. PITOLIN, Lorena S. PAZMINO
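
    The dwell-time mapping described in this abstract can be pictured with a short sketch. The Swift below is illustrative only: the publication contains no code, and every type, threshold, and function name is an assumption made for the example.

      import Foundation

      // Hypothetical types, not from the publication or any Apple API.
      struct IncomingNotification {
          let id: UUID
          let category: String            // e.g. "message", "reminder"
      }

      enum NotificationAction {
          case expandPreview
          case markAsRead
          case openSourceApp
      }

      // Map how long the user's gaze dwelled on a notification to the action(s) to perform.
      // The thresholds are invented; the abstract does not specify any values.
      func actions(for notification: IncomingNotification, dwellSeconds: TimeInterval) -> [NotificationAction] {
          switch dwellSeconds {
          case ..<0.3:
              return []                            // a passing glance triggers nothing
          case 0.3..<1.0:
              return [.expandPreview]              // a short look expands the preview
          case 1.0..<2.5:
              return [.expandPreview, .markAsRead]
          default:
              return [.openSourceApp]              // a sustained gaze opens the source app
          }
      }

      // Example: a 1.4-second gaze expands the preview and marks the notification as read.
      let note = IncomingNotification(id: UUID(), category: "message")
      print(actions(for: note, dwellSeconds: 1.4))
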
  • Publication number: 20250138771
    Abstract: Implementations of the subject technology provide a small, portable physical object for use in an extended reality system. The system may include a device that allows a user/wearer to interact with virtual representations of content such as stored data and/or applications overlaid on the user's physical environment. The object has a unique identifier that is obtainable by the device or another device. The object may be an inactive device without any internal circuitry, a passive device with circuitry that is activated by another device, or an active device having its own processing circuitry and/or a display housed in the body of the object. The object can be associated with content displayed by the device to provide a user with a physical object that can be moved or manipulated to move, modify, transport, or store the content generated, stored, and/or displayed in an extended reality environment.
    Type: Application
    Filed: January 2, 2025
    Publication date: May 1, 2025
    Inventors: Samuel L. IGLESIAS, Michael E. BUERLI, Tyler R. CALDERONE, Andrew P. RICHARDSON
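
    As an informal illustration of the identifier-to-content association described above, the Swift sketch below binds displayed content to a token's unique identifier so the same content can be restored whenever the token is detected again. All names are assumptions made for the example, not details from the publication.

      import Foundation

      // A physical token known only by the unique identifier read from it
      // (for example, from a tag or a visual marker). Hypothetical type.
      struct PhysicalToken: Hashable {
          let identifier: String
      }

      struct VirtualContent {
          let title: String
          let payloadURL: URL
      }

      final class TokenContentRegistry {
          private var bindings: [PhysicalToken: VirtualContent] = [:]

          // Bind content to a token so that moving or handing off the token
          // can move, transport, or store the associated content.
          func associate(_ content: VirtualContent, with token: PhysicalToken) {
              bindings[token] = content
          }

          // When a device obtains the same identifier again, restore the content.
          func content(for token: PhysicalToken) -> VirtualContent? {
              bindings[token]
          }
      }

      // Example usage.
      let registry = TokenContentRegistry()
      let token = PhysicalToken(identifier: "tag-0042")
      let doc = VirtualContent(title: "Quarterly notes", payloadURL: URL(string: "https://example.com/notes")!)
      registry.associate(doc, with: token)
      print(registry.content(for: token)?.title ?? "no content")
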
  • Publication number: 20240361832
    Abstract: Some examples of the disclosure are directed to systems and methods for displaying and interacting with a representation of a user interface of an electronic device in a three-dimensional environment. While presenting a first user interface, a first electronic device presents a representation of a second user interface of a second electronic device based on display data representing the second user interface. In response to detecting a respective event corresponding to user input, if a gaze of a user of the first electronic device is directed to the representation of the second user interface, the first electronic device causes the second electronic device to perform a first operation directed to the second user interface based on the respective event. Alternatively, if the gaze is directed to the first user interface, the first electronic device performs a second operation directed to the first user interface based on the respective event.
    Type: Application
    Filed: January 31, 2024
    Publication date: October 31, 2024
    Inventors: Tyler R. CALDERONE, Sean L. SEGUIN, Lorena S. PAZMINO, Aryan SHARIFIAN
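
    The gaze-dependent routing in this abstract, sending the same input either to the local interface or to the second device, might be sketched roughly as follows. The Swift types and the forwarding link are assumptions made for illustration, not the patented implementation.

      import Foundation

      // Where the user is looking when the input event arrives. Hypothetical.
      enum GazeTarget {
          case firstUserInterface
          case representationOfSecondUI
      }

      struct InputEvent {
          let description: String   // e.g. "pinch", "tap"
      }

      // Stand-in for whatever channel causes the second device to act.
      protocol SecondDeviceLink {
          func perform(_ event: InputEvent)
      }

      struct LoggingLink: SecondDeviceLink {
          func perform(_ event: InputEvent) {
              print("forwarded to second device: \(event.description)")
          }
      }

      func route(_ event: InputEvent, gaze: GazeTarget, secondDevice: SecondDeviceLink) {
          switch gaze {
          case .representationOfSecondUI:
              // First operation: the second device handles the event.
              secondDevice.perform(event)
          case .firstUserInterface:
              // Second operation: handle the event locally.
              print("handled locally: \(event.description)")
          }
      }

      route(InputEvent(description: "pinch"), gaze: .representationOfSecondUI, secondDevice: LoggingLink())
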
  • Publication number: 20240361833
    Abstract: Some examples of the disclosure are directed to systems and methods for displaying and interacting with a representation of a user interface of an electronic device in a three-dimensional environment. While presenting a first user interface, a first electronic device presents a representation of a second user interface of a second electronic device based on display data representing the second user interface. In response to detecting a respective event corresponding to user input, if a gaze of a user of the first electronic device is directed to the representation of the second user interface, the first electronic device causes the second electronic device to perform a first operation directed to the second user interface based on the respective event. Alternatively, if the gaze is directed to the first user interface, the first electronic device performs a second operation directed to the first user interface based on the respective event.
    Type: Application
    Filed: January 31, 2024
    Publication date: October 31, 2024
    Inventors: Tyler R. CALDERONE, Sean L. SEGUIN, Lorena S. PAZMINO, Aryan SHARIFIAN
  • Publication number: 20240290052
    Abstract: Implementations of the subject technology provide virtual anchoring for extended reality (XR) display devices. A device may generate an XR environment that includes computer-generated (CG) content for display relative to various physical objects in a physical environment. In order to position the CG content, an XR application may request a physical anchor object to which the CG content can be anchored. In circumstances in which the physical anchor object is not available in the physical environment, a virtual anchor and/or a virtual anchor object corresponding to the physical anchor object can be provided to which the CG content can be anchored.
    Type: Application
    Filed: May 7, 2024
    Publication date: August 29, 2024
    Inventors: Michael E. BUERLI, Samuel L. IGLESIAS, Tyler R. CALDERONE, Mark A. EBBOLE, Andrew P. RICHARDSON
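
    One way to picture the fallback described above, substituting a virtual anchor when the requested physical anchor object is not present, is the small Swift sketch below. The anchor labels, the default pose, and the function names are invented for the example.

      import Foundation

      struct Anchor {
          enum Kind { case physical, virtual }
          let kind: Kind
          let label: String
          let position: (x: Double, y: Double, z: Double)
      }

      // Anchors detected in the physical environment, keyed by the label an
      // application would request. Hypothetical data.
      let detectedPhysicalAnchors: [String: Anchor] = [
          "table": Anchor(kind: .physical, label: "table", position: (x: 0.0, y: 0.7, z: -1.2))
      ]

      func resolveAnchor(requested label: String) -> Anchor {
          if let physical = detectedPhysicalAnchors[label] {
              return physical
          }
          // No matching physical object: fall back to a virtual anchor at a default pose.
          return Anchor(kind: .virtual, label: label, position: (x: 0.0, y: 1.0, z: -1.0))
      }

      // "wall" is not available here, so the content is anchored to a virtual stand-in.
      let anchor = resolveAnchor(requested: "wall")
      print(anchor.kind, anchor.label)
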
  • Publication number: 20230359425
    Abstract: Implementations of the subject technology provide a small, portable physical object for use in an extended reality system. The system may include a device that allows a user/wearer to interact with virtual representations of content such as stored data and/or applications overlaid on the user’s physical environment. The object has a unique identifier that is obtainable by the device or another device. The object may be an inactive device without any internal circuitry, a passive device with circuitry that is activated by another device, or an active device having its own processing circuitry and/or a display housed in the body of the object. The object can be associated with content displayed by the device to provide a user with a physical object that can be moved or manipulated to move, modify, transport, or store the content generated, stored, and/or displayed in an extended reality environment.
    Type: Application
    Filed: July 17, 2023
    Publication date: November 9, 2023
    Inventors: Samuel L. IGLESIAS, Michael E. BUERLI, Tyler R. CALDERONE, Andrew P. RICHARDSON
  • Publication number: 20230095816
    Abstract: Aspects of the subject technology provide electronic devices that operate, in part, based on enrolled user characteristics, and that can be operated by a guest user that has not been enrolled. For example, upon determining that a current user of an electronic device storing a first physical model of a primary user is a guest user different from the primary user, the electronic device may obtain initial physical characteristic data for the guest user and generate a guest physical model of the guest user based on the initial physical characteristic data. In one or more implementations, the electronic device may operate based on guest user inputs and the guest physical model of the guest user, while updating the guest physical model based on the guest user inputs.
    Type: Application
    Filed: September 16, 2022
    Publication date: March 30, 2023
    Inventors: David COHEN, Kyle C. BROGLE, Michael J. ROCKWELL, Ranjit DESAI, Joel N. KERR, Amy E. DEDONATO, Joaquim Gonçalo LOBO FERREIRA DA SILVA, Tyler R. CALDERONE, Charilaos PAPADOPOULOS
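
    A rough Swift sketch of the guest flow described above follows: a provisional model is built from the initial characteristic data and refined as the guest keeps providing input. The particular characteristics (interpupillary distance, hand span) and the running-average update are assumptions made for the example, not details from the publication.

      import Foundation

      // Hypothetical physical model of a user.
      struct PhysicalModel {
          var interPupillaryDistanceMM: Double
          var handSpanMM: Double
          var samples: Int
      }

      enum CurrentUser {
          case primary
          case guest
      }

      // Select or create the model to operate with, depending on who is using the device.
      func model(for user: CurrentUser, primaryModel: PhysicalModel, initialGuestMeasurement: (ipd: Double, span: Double)) -> PhysicalModel {
          switch user {
          case .primary:
              return primaryModel
          case .guest:
              // Start a fresh guest model from the initial characteristic data.
              return PhysicalModel(interPupillaryDistanceMM: initialGuestMeasurement.ipd, handSpanMM: initialGuestMeasurement.span, samples: 1)
          }
      }

      // Refine the guest model with each new measurement (simple running average).
      func update(_ model: inout PhysicalModel, ipd: Double, span: Double) {
          let n = Double(model.samples)
          model.interPupillaryDistanceMM = (model.interPupillaryDistanceMM * n + ipd) / (n + 1)
          model.handSpanMM = (model.handSpanMM * n + span) / (n + 1)
          model.samples += 1
      }

      var guest = model(for: .guest, primaryModel: PhysicalModel(interPupillaryDistanceMM: 63, handSpanMM: 200, samples: 500), initialGuestMeasurement: (ipd: 60, span: 185))
      update(&guest, ipd: 61, span: 186)
      print(guest)
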
  • Publication number: 20230098174
    Abstract: This relates generally to intelligent automated assistants and, more specifically, to providing a hands-free notification management system. An example method includes displaying, by an electronic device, one or more notifications. In response to displaying the one or more notifications, the method includes detecting a visual interaction of a user with the one or more notifications, identifying a notification from the one or more notifications based on the visual interaction, receiving a speech input related to the notification from the user, determining one or more actions associated with the notification based on the speech input, and performing the one or more actions.
    Type: Application
    Filed: September 13, 2022
    Publication date: March 30, 2023
    Inventors: Andrea Valentina SIMES, Daniel W. LOO, Felicia W. EDWARDS, Harry M. SIMMONDS, Tyler R. CALDERONE, Kevin D. PITOLIN, Lorena S. PAZMINO
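
    The speech-driven variant in this abstract, where gaze picks out the notification and the utterance determines what to do with it, might look roughly like the Swift sketch below. The keyword matching stands in for real speech understanding, and all names are hypothetical.

      import Foundation

      struct PendingNotification {
          let id: Int
          let summary: String
      }

      enum SpokenAction {
          case readAloud
          case reply(String)
          case dismiss
      }

      // Toy stand-in for speech understanding: map an utterance to an action.
      func parseAction(from utterance: String) -> SpokenAction? {
          let text = utterance.lowercased()
          if text.hasPrefix("reply ") {
              return .reply(String(utterance.dropFirst("reply ".count)))
          }
          if text.contains("read") { return .readAloud }
          if text.contains("dismiss") || text.contains("clear") { return .dismiss }
          return nil
      }

      // Apply the spoken action to the notification the user was looking at.
      func handle(_ utterance: String, gazedNotification: PendingNotification) {
          guard let action = parseAction(from: utterance) else {
              print("No supported action found in: \(utterance)")
              return
          }
          switch action {
          case .readAloud:
              print("Reading notification \(gazedNotification.id): \(gazedNotification.summary)")
          case .reply(let message):
              print("Replying to notification \(gazedNotification.id): \(message)")
          case .dismiss:
              print("Dismissing notification \(gazedNotification.id)")
          }
      }

      handle("Reply I'm on my way", gazedNotification: PendingNotification(id: 7, summary: "Message from Dana"))
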
  • Publication number: 20210326091
    Abstract: Implementations of the subject technology provide a small, portable physical object for use in an extended reality system. The system may include a device that allows a user/wearer to interact with virtual representations of content such as stored data and/or applications overlaid on the user's physical environment. The object has a unique identifier that is obtainable by the device or another device. The object may be an inactive device without any internal circuitry, a passive device with circuitry that is activated by another device, or an active device having its own processing circuitry and/or a display housed in the body of the object. The object can be associated with content displayed by the device to provide a user with a physical object that can be moved or manipulated to move, modify, transport, or store the content generated, stored, and/or displayed in an extended reality environment.
    Type: Application
    Filed: February 26, 2021
    Publication date: October 21, 2021
    Inventors: Samuel L. IGLESIAS, Michael E. BUERLI, Tyler R. CALDERONE, Andrew P. RICHARDSON
  • Publication number: 20210326094
    Abstract: Implementations of the subject technology provide continuous transfer of content editing and/or control between various devices in an extended reality system. The extended reality system includes at least one device that is capable of determining the locations of other devices in the system. This device can manage continuous transfer of control between other devices in the system responsive to three-dimensional location-based user inputs, and/or can manage continuous transfer of control between one or more of the other devices and the device itself.
    Type: Application
    Filed: February 26, 2021
    Publication date: October 21, 2021
    Inventors: Michael E. BUERLI, Andrew P. RICHARDSON, Samuel L. IGLESIAS, Tyler R. CALDERONE, Mark A. EBBOLE
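
    The location-based handoff described above can be pictured with a small Swift sketch that transfers control to whichever device is nearest the position of the user's input. The nearest-device rule and all names are assumptions made for the example, not the claimed method.

      import Foundation

      struct Vector3 {
          var x, y, z: Double
          func distance(to other: Vector3) -> Double {
              let dx = x - other.x, dy = y - other.y, dz = z - other.z
              return (dx * dx + dy * dy + dz * dz).squareRoot()
          }
      }

      struct Device {
          let name: String
          let position: Vector3
      }

      // Choose the device that should take control for an input at a given 3D position.
      func controller(forInputAt point: Vector3, among devices: [Device]) -> Device? {
          devices.min(by: { $0.position.distance(to: point) < $1.position.distance(to: point) })
      }

      let devices = [
          Device(name: "headset", position: Vector3(x: 0.0, y: 1.6, z: 0.0)),
          Device(name: "tablet", position: Vector3(x: 0.4, y: 0.9, z: -0.5)),
          Device(name: "laptop", position: Vector3(x: -0.8, y: 0.8, z: -0.6))
      ]

      // A pinch near the tablet hands control of the shared content to the tablet.
      if let target = controller(forInputAt: Vector3(x: 0.35, y: 0.95, z: -0.45), among: devices) {
          print("Control transfers to \(target.name)")
      }
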
  • Publication number: 20210325960
    Abstract: Aspects of the subject technology relate to gaze-based control of an electronic device. The gaze-based control can include enabling an option to provide user authorization when it is determined that the user has viewed and/or read text associated with a request for the user authorization. The gaze-based control can also include modifying a user interface or a user interface element based on what the user has viewed and/or read. The gaze-based control can be based on determining whether a user has viewed and/or read an electronic document and/or a physical document.
    Type: Application
    Filed: February 26, 2021
    Publication date: October 21, 2021
    Inventors: Samuel L. IGLESIAS, Mark A. EBBOLE, Andrew P. RICHARDSON, Tyler R. CALDERONE, Michael E. BUERLI, Devin W. CHALMERS
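
    As a loose illustration of the read-before-authorize idea above, the Swift sketch below keeps an authorization option disabled until gaze has dwelled on every line of the text associated with the request. The dwell threshold and the type names are assumptions made for the example.

      import Foundation

      struct AuthorizationPrompt {
          let lines: [String]
          private var viewedLineIndices: Set<Int> = []

          init(lines: [String]) {
              self.lines = lines
          }

          // Record that gaze dwelled on a given line long enough to count as "viewed".
          mutating func recordGaze(onLine index: Int, dwellSeconds: Double) {
              guard lines.indices.contains(index), dwellSeconds >= 0.25 else { return }
              viewedLineIndices.insert(index)
          }

          // Only enable the authorization control once every line has been viewed.
          var authorizeOptionEnabled: Bool {
              viewedLineIndices.count == lines.count
          }
      }

      var prompt = AuthorizationPrompt(lines: [
          "Allow Example App to use your saved payment method?",
          "You will be charged $4.99."
      ])
      prompt.recordGaze(onLine: 0, dwellSeconds: 0.6)
      print(prompt.authorizeOptionEnabled)   // false: the second line has not been viewed
      prompt.recordGaze(onLine: 1, dwellSeconds: 0.4)
      print(prompt.authorizeOptionEnabled)   // true
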
  • Publication number: 20210327146
    Abstract: Implementations of the subject technology provide virtual anchoring for extended reality (XR) display devices. A device may generate an XR environment that includes computer-generated (CG) content for display relative to various physical objects in a physical environment. In order to position the CG content, an XR application may request a physical anchor object to which the CG content can be anchored. In circumstances in which the physical anchor object is not available in the physical environment, a virtual anchor and/or a virtual anchor object corresponding to the physical anchor object can be provided to which the CG content can be anchored.
    Type: Application
    Filed: February 24, 2021
    Publication date: October 21, 2021
    Inventors: Michael E. BUERLI, Samuel L. IGLESIAS, Tyler R. CALDERONE, Mark A. EBBOLE, Andrew P. RICHARDSON