Patents by Inventor Giancarlo Yerkes

Giancarlo Yerkes has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240118746
    Abstract: Gaze enrollment, including displaying an enrollment progress user indicator, animating movement of user interface elements, changing the appearances of user interface elements, and/or moving a user interface element over time, enables a computer system to more accurately track the gaze of a user of the computer system.
    Type: Application
    Filed: September 21, 2023
    Publication date: April 11, 2024
    Inventors: Giancarlo YERKES, Adam L. AMADIO, Katherine W. KOLOMBATOVICH, Philipp ROCKEL, William A. SORRENTINO, III, Hana Z. WANG
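    A minimal sketch, in hypothetical Swift, of the enrollment flow the abstract above describes: a target element moves through successive positions, gaze samples are collected at each position, and a progress indicator advances as calibration data accumulates. All names and the sample threshold are assumptions, not APIs from the patent.

      struct GazeEnrollment {
          let targetPositions: [SIMD2<Float>]        // where the moving element appears
          private(set) var samples: [SIMD2<Float>: [SIMD2<Float>]] = [:]

          /// Fraction of target positions with enough samples, driving the
          /// enrollment progress indicator.
          var progress: Float {
              let done = targetPositions.filter { (samples[$0]?.count ?? 0) >= 30 }.count
              return Float(done) / Float(max(targetPositions.count, 1))
          }

          /// Record a raw gaze sample while the element is shown at `target`.
          mutating func record(gaze: SIMD2<Float>, at target: SIMD2<Float>) {
              samples[target, default: []].append(gaze)
          }
      }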
  • Publication number: 20240103616
    Abstract: Gaze enrollment, including displaying an enrollment progress user indicator, animating movement of user interface elements, changing the appearances of user interface elements, and/or moving a user interface element over time, enables a computer system to more accurately track the gaze of a user of the computer system.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Giancarlo YERKES, Adam L. AMADIO, Kaely COON, Amy E. DEDONATO, Stephen O. LEMAY, Israel PASTRANA VICENTE, William A. SORRENTINO, III, Lynn I. STREJA
  • Publication number: 20240104871
    Abstract: Electronic devices provide extended reality experiences. In some embodiments, a media capture user interface is displayed, including a capture guide. In some embodiments, gaze information is used for targeting. In some embodiments, a virtual object is manipulated.
    Type: Application
    Filed: September 19, 2023
    Publication date: March 28, 2024
    Inventors: Anna L. BREWER, Devin W. CHALMERS, Allison W. DRYER, Elena J. NATTINGER, Giancarlo YERKES
  • Publication number: 20240103678
    Abstract: The present disclosure generally relates to user interfaces for electronic devices, including user interfaces for navigating between and/or interacting with extended reality user interfaces.
    Type: Application
    Filed: September 15, 2023
    Publication date: March 28, 2024
    Inventors: Allison W. DRYER, Anshu K. CHIMALAMARRI, Giancarlo YERKES, Nahckjoon KIM, Stephen O. LEMAY, Jessica TRINH
  • Publication number: 20240103614
    Abstract: In some embodiments, the present disclosure includes techniques and user interfaces for interacting with graphical user interfaces using gaze. In some embodiments, the present disclosure includes techniques and user interfaces for repositioning virtual objects. In some embodiments, the present disclosure includes techniques and user interfaces for transitioning modes of a camera capture user interface.
    Type: Application
    Filed: September 20, 2023
    Publication date: March 28, 2024
    Inventors: Allison W. DRYER, Giancarlo YERKES, Gregory LUTTER, Brian W. TEMPLE, Devin W. CHALMERS, Luis R. DELIZ CENTENO, Elena J. NATTINGER, Anna L. BREWER
  • Publication number: 20240103617
    Abstract: Gaze enrollment, including displaying an enrollment progress user indicator, animating movement of user interface elements, changing the appearances of user interface elements, and/or moving a user interface element over time, enables a computer system to more accurately track the gaze of a user of the computer system.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Giancarlo YERKES, Adam L. AMADIO, Amy E. DEDONATO, Kirsty KEATCH, Stephen O. LEMAY, Israel PASTRANA VICENTE, Danielle M. PRICE, William A. SORRENTINO, III, Lynn I. STREJA, Hugo D. VERWEIJ, Hana Z. WANG
  • Publication number: 20240104849
    Abstract: In some embodiments, the present disclosure includes techniques and user interfaces for interacting with virtual objects in an extended reality environment. In some embodiments, the techniques and user interfaces are for interacting with virtual objects in an extended reality environment, including repositioning virtual objects relative to the environment. In some embodiments, the techniques and user interfaces are for interacting with virtual objects in an extended reality environment, including virtual objects that aid a user in navigating within the environment. In some embodiments, the techniques and user interfaces are for interacting with virtual objects in an extended reality environment, including objects displayed based on changes in a field-of-view of a user, and for repositioning virtual objects relative to the environment.
    Type: Application
    Filed: September 6, 2023
    Publication date: March 28, 2024
    Inventors: Yiqiang NIE, Giovanni AGNOLI, Devin CHALMERS, Allison W. DRYER, Thomas G. SALTER, Giancarlo YERKES
  • Publication number: 20240104859
    Abstract: The present disclosure generally relates to managing live communication sessions. A computer system optionally displays an option to invite the respective user to join the ongoing communication session. A computer system optionally displays one or more options to modify an appearance of an avatar representing the user of the computer system. A computer system optionally transitions a communication session from a spatial communication session to a non-spatial communication session. A computer system optionally displays information about a participant in a communication session.
    Type: Application
    Filed: September 12, 2023
    Publication date: March 28, 2024
    Inventors: Jesse CHAND, Kristi E. BAUERLY, Shih-Sang CHIU, Jonathan R. DASCOLA, Amy E. DEDONATO, Karen EL ASMAR, Wesley M. HOLDER, Stephen O. LEMAY, Lorena S. PAZMINO, Jason D. RICKWALD, Giancarlo YERKES
  • Publication number: 20240094819
    Abstract: In some embodiments, the present disclosure includes techniques and user interfaces for performing operations using air gestures. In some embodiments, the present disclosure includes techniques and user interfaces for audio playback adjustment using gestures. In some embodiments, the present disclosure includes techniques and user interfaces for conditionally responding to inputs.
    Type: Application
    Filed: September 6, 2023
    Publication date: March 21, 2024
    Inventors: Yiqiang NIE, Giovanni M. AGNOLI, Allison W. DRYER, Jules K. FENNIS, Charles MAALOUF, Camille MOUSSETTE, Giancarlo YERKES
  • Publication number: 20240094866
    Abstract: A computer system displays a first view of a user interface of a first application with a first size at a first position corresponding to a location of at least a portion of a palm that is currently facing a viewpoint corresponding to a view of a three-dimensional environment provided via a display generation component. While displaying the first view, the computer system detects a first input that corresponds to a request to transfer display of the first application from the palm to a first surface that is within a first proximity of the viewpoint. In response to detecting the first input, the computer system displays a second view of the user interface of the first application with a second size and an orientation that corresponds to the first surface at a second position defined by the first surface.
    Type: Application
    Filed: November 29, 2023
    Publication date: March 21, 2024
    Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E.S. Bauerly, Giancarlo Yerkes, Peter D. Anton
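    A minimal sketch, in hypothetical Swift, of the behavior the abstract above describes: an application window anchored to the user's palm is transferred to a nearby surface, with a new size and orientation, only when that surface is within a proximity threshold of the viewpoint. All names and the threshold are assumptions, not APIs from the patent.

      import simd

      enum WindowAnchor {
          case palm(position: SIMD3<Float>)                       // follows the tracked palm
          case surface(position: SIMD3<Float>, orientation: simd_quatf)
      }

      struct AppWindow {
          var anchor: WindowAnchor
          var size: SIMD2<Float>
      }

      struct DetectedSurface {
          let position: SIMD3<Float>
          let orientation: simd_quatf
          let distanceToViewpoint: Float
      }

      /// Move the window from the palm to the surface only when the surface is
      /// within the "first proximity" of the viewpoint.
      func transferIfEligible(_ window: inout AppWindow,
                              to surface: DetectedSurface,
                              proximityThreshold: Float = 1.5) {
          guard case .palm = window.anchor,
                surface.distanceToViewpoint <= proximityThreshold else { return }
          window.anchor = .surface(position: surface.position,
                                   orientation: surface.orientation)
          window.size = SIMD2(1.0, 0.7)   // second, surface-appropriate size
      }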
  • Publication number: 20240077937
    Abstract: The present disclosure generally relates to techniques and user interfaces for controlling and displaying representations of user in environments, such as during a live communication session and/or a live collaboration session.
    Type: Application
    Filed: September 1, 2023
    Publication date: March 7, 2024
    Inventors: Jason D. RICKWALD, Andrew R. BACON, Kristi E. BAUERLY, Rupert BURTON, Jordan A. CAZAMIAS, Tong CHEN, Shih-Sang CHIU, Jonathan PERRON, Giancarlo YERKES
  • Patent number: 11922584
    Abstract: An electronic device displays a first user interface with a first representation of content. In response to receiving a request to display a virtual model that corresponds to the content, the device displays the virtual model of the content concurrently with a selectable user interface object for performing an operation associated with the content in accordance with a determination that the first user interface is configured to perform the operation, and displays the virtual model of the content without displaying the selectable user interface object in accordance with a determination that the first user interface is not configured to perform the operation associated with the content.
    Type: Grant
    Filed: October 1, 2021
    Date of Patent: March 5, 2024
    Assignee: APPLE INC.
    Inventors: Grant R. Paul, Giancarlo Yerkes, Nicolas V. Scapel, David Lui
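    A minimal sketch, in hypothetical Swift, of the determination the abstract above describes: the virtual model is always shown, and the selectable object for performing the operation appears only when the originating user interface supports that operation. The types are placeholders, not the patent's implementation.

      struct VirtualModelScene {
          let model: String               // stand-in for the 3D model of the content
          let actionButton: String?       // selectable object, or nil when unsupported
      }

      func presentVirtualModel(for content: String,
                               sourceSupportsOperation: Bool) -> VirtualModelScene {
          // Show the selectable object only when the first user interface is
          // configured to perform the operation associated with the content.
          let button = sourceSupportsOperation ? "Perform operation on \(content)" : nil
          return VirtualModelScene(model: "3D model of \(content)", actionButton: button)
      }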
  • Publication number: 20240053859
    Abstract: A computer system displays, in a simulated three-dimensional space, an object with a user interface displayed at a pose corresponding to a pose of the object in the simulated space, the object's pose corresponding to a pose of an input device in a physical environment. In response to detecting a movement input via the input device: if the movement input corresponds to input device movement, relative to the physical environment, meeting pose criteria requiring that a parameter of change in the input device pose meet a set of one or more thresholds, the computer system displays the user interface away from the object; and, if the movement input corresponds to input device movement not meeting the pose criteria, the computer system updates the object's pose in the simulated space based on the input device movement, while maintaining display of the user interface at a pose corresponding to the object's pose.
    Type: Application
    Filed: October 26, 2023
    Publication date: February 15, 2024
    Inventors: Jeffrey M. Faulkner, Wesley M. Holder, Giancarlo Yerkes, Israel Pastrana Vicente, William A. Sorrentino, III, Stephen O. Lemay
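    A minimal sketch, in hypothetical Swift, of the branch the abstract above describes: a movement of the physical input device either detaches the user interface from the tracked object, when the change in the device's pose exceeds a threshold, or moves the object and its attached interface together. The rotation-angle test and threshold are assumptions.

      import simd

      struct Pose {
          var position: SIMD3<Float>
          var rotation: simd_quatf
      }

      struct TrackedObject {
          var pose: Pose
          var interfaceDetached = false
      }

      func handleDeviceMovement(_ object: inout TrackedObject,
                                newDevicePose: Pose,
                                rotationThreshold: Float = .pi / 3) {
          // Parameter of change in pose: the rotation angle between the old and
          // new device orientations.
          let delta = newDevicePose.rotation * object.pose.rotation.inverse
          if delta.angle > rotationThreshold {
              // Pose criteria met: display the user interface away from the object.
              object.interfaceDetached = true
          } else {
              // Criteria not met: keep the interface attached and follow the device.
              object.pose = newDevicePose
          }
      }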
  • Publication number: 20240056563
    Abstract: To blur the physical environment viewable through a transparent display, a system may capture images of the physical environment using a camera, warp the images of the physical environment from the perspective of the camera to the perspective of a user's eye, and display a blurred version of the warped images. The transparent display overlays a blurred version of the physical environment over the physical environment such that the physical environment appears blurred. This type of blur effect may be applied across the entire physical environment that is viewable through the transparent display or across only a subset of the physical environment that is viewable through the transparent display.
    Type: Application
    Filed: June 21, 2023
    Publication date: February 15, 2024
    Inventors: Rahul Nair, Giancarlo Yerkes
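    A minimal sketch, in hypothetical Swift using Core Image, of the pipeline the abstract above describes: capture a camera frame, reproject it from the camera's perspective toward the eye's perspective, blur it, and overlay the result, optionally limited to a region. The affine transform is a stand-in for the full camera-to-eye reprojection, which would use calibrated camera and eye poses.

      import CoreImage

      func blurredOverlay(cameraFrame: CIImage,
                          cameraToEyeTransform: CGAffineTransform,
                          blurRadius: Double = 20,
                          region: CGRect? = nil) -> CIImage {
          // 1. Warp the captured image from the camera's perspective toward the
          //    perspective of the user's eye.
          let warped = cameraFrame.transformed(by: cameraToEyeTransform)

          // 2. Blur the warped image and trim the blur's expanded extent.
          let blurred = warped
              .applyingGaussianBlur(sigma: blurRadius)
              .cropped(to: warped.extent)

          // 3. Optionally restrict the effect to a subset of the viewable environment.
          if let region { return blurred.cropped(to: region) }
          return blurred
      }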
  • Publication number: 20240045562
    Abstract: A system displays a representation of a camera field of view including a first portion of a physical environment, captures depth information indicative of a first subset of the first portion, and displays, overlaid on a portion of the representation of the field of view corresponding to the first subset, an indication of an extent of the first portion for which depth information has been captured. In response to detecting movement of the field of view, the system: updates the representation of the field of view to include a representation of a second portion of the physical environment; captures depth information indicative of a second subset of the second portion; and updates the indication to indicate an extent of the second portion for which depth information has been captured, including displaying the indication overlaid on a portion of the representation of the field of view corresponding to the second subset.
    Type: Application
    Filed: October 20, 2023
    Publication date: February 8, 2024
    Inventors: Allison W. Dryer, Giancarlo Yerkes, Grant R. Paul, Lisa K. Forssell, Joseph A. Malia
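    A minimal sketch, in hypothetical Swift, of the coverage indication the abstract above describes: as the field of view moves, the system records which parts of the environment already have depth information and overlays that extent on the live view. The grid of cells is a stand-in for real depth coverage.

      struct DepthCoverage {
          private(set) var scannedCells: Set<SIMD2<Int>> = []

          /// Record depth capture for the cells visible in the current field of view.
          mutating func captureDepth(visibleCells: [SIMD2<Int>]) {
              scannedCells.formUnion(visibleCells)
          }

          /// Cells to highlight in the overlay: the visible portion of the
          /// environment for which depth information has been captured.
          func overlayCells(visibleCells: [SIMD2<Int>]) -> [SIMD2<Int>] {
              visibleCells.filter { scannedCells.contains($0) }
          }
      }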
  • Patent number: 11875013
    Abstract: A computer system detects a wrist. In accordance with a determination that first criteria, which require an inner side of the wrist to face toward a viewpoint, are met, the computer system displays a first user interface object including a plurality of representations of different applications at a first position corresponding to a first location on the wrist. While displaying the first user interface object, the computer system detects that the wrist's position or orientation has changed so as to satisfy second criteria, which require an outer side of the wrist to face toward the viewpoint. In response, the computer system switches from displaying the first user interface object at the first position to displaying a second user interface object, including a plurality of controls for controlling functions, at a second position corresponding to a location on a back of a hand attached to the wrist.
    Type: Grant
    Filed: December 10, 2020
    Date of Patent: January 16, 2024
    Assignee: APPLE INC.
    Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Robert T. Tilton, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E. S. Bauerly, Giancarlo Yerkes, Peter D. Anton
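    A minimal sketch, in hypothetical Swift, of the orientation-dependent switch the abstract above describes: when the inner side of the wrist faces the viewpoint, application icons are shown at the wrist; when the wrist turns so its outer side faces the viewpoint, controls are shown at the back of the hand. The dot-product test and threshold are assumptions standing in for the patent's criteria.

      import simd

      enum WristUI {
          case appIcons      // first user interface object, anchored at the wrist
          case controls      // second user interface object, at the back of the hand
          case hidden
      }

      func wristUI(innerWristNormal: SIMD3<Float>,
                   towardViewpoint: SIMD3<Float>,
                   threshold: Float = 0.5) -> WristUI {
          let facing = simd_dot(simd_normalize(innerWristNormal),
                                simd_normalize(towardViewpoint))
          if facing > threshold { return .appIcons }    // inner side faces the viewpoint
          if facing < -threshold { return .controls }   // outer side faces the viewpoint
          return .hidden
      }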
  • Patent number: 11861136
    Abstract: A computer system displays a view of at least a portion of a simulated three-dimensional space, and a view of a user interface object located within the simulated three-dimensional space. The user interface object is a representation of a computing device that has a non-immersive display environment that provides access to a plurality of different applications. The user interface object includes a first user interface that corresponds to the non-immersive display environment of the computing device and is responsive to touch inputs from a user on the input device, and a pose of the user interface object in the simulated three-dimensional space corresponds to a pose of the input device in a physical space surrounding the input device. In response to a touch input that corresponds to a respective location in the first user interface, an appearance of the first user interface is updated.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: January 2, 2024
    Assignee: APPLE INC.
    Inventors: Jeffrey M. Faulkner, Wesley M. Holder, Giancarlo Yerkes, Israel Pastrana Vicente, William A. Sorrentino, III, Stephen O. Lemay
  • Patent number: D1009918
    Type: Grant
    Filed: June 6, 2022
    Date of Patent: January 2, 2024
    Assignee: Apple Inc.
    Inventors: Freddy Anzures, Gary Butcher, Joseph Chan, Imran Chaudhri, Alan C. Dye, Jonathan P. Ive, Lawrence Yang, Giancarlo Yerkes
  • Patent number: D1012963
    Type: Grant
    Filed: April 30, 2021
    Date of Patent: January 30, 2024
    Assignee: Apple Inc.
    Inventors: Freddy Anzures, Hoan K. Pham, Giancarlo Yerkes
  • Patent number: D1018588
    Type: Grant
    Filed: January 30, 2023
    Date of Patent: March 19, 2024
    Assignee: Apple Inc.
    Inventors: Allison W. Dryer, Alan C. Dye, Stephen O. Lemay, Richard D. Lyons, Grant R. Paul, Giancarlo Yerkes