Patents by Inventor Jonathan R. DASCOLA

Jonathan R. DASCOLA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240152245
    Abstract: A computer system displays a first object that includes at least a first portion of the first object and a second portion of the first object and detects a first gaze input that meets first criteria, wherein the first criteria require that the first gaze input is directed to the first portion of the first object in order for the first criteria to be met. In response, the computer system displays a first control element that corresponds to a first operation associated with the first object, wherein the first control element was not displayed prior to detecting that the first gaze input met the first criteria, and detects a first user input directed to the first control element. In response to detecting the first user input directed to the first control element, the computer system performs the first operation with respect to the first object.
    Type: Application
    Filed: September 21, 2023
    Publication date: May 9, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Miquel Estany Rodriguez, James J. Owen, Jonathan R. Dascola, Stephen O. Lemay, Christian Schnorr, Zoey C. Taylor, Jay Moon, Benjamin H. Boesel, Benjamin Hylak, Richard D. Lyons, William A. Sorrentino, III, Lynn I. Streja, Jonathan Ravasz, Nathan Gitter, Peter D. Anton, Michael J. Rockwell, Peter L. Hajas, Evgenii Krivoruchko, Mark A. Ebbole, James Magahern, Andrew J. Sawyer, Christopher D. McKenzie, Michael E. Buerli, Olivier D. R. Gutknecht
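
As an informal illustration of the gaze-revealed control described in the abstract above, here is a minimal Swift sketch. It is not Apple's implementation; `GazeInput`, `ObjectPortion`, `GazeRevealedControl`, and the 0.5 s dwell threshold are all assumptions made for illustration.

```swift
import Foundation

// Hypothetical model of the gaze-revealed control described in the abstract.
enum ObjectPortion { case first, second }

struct GazeInput {
    let targetPortion: ObjectPortion
    let dwellDuration: TimeInterval
}

final class GazeRevealedControl {
    private(set) var isControlVisible = false
    private let requiredPortion: ObjectPortion = .first
    private let requiredDwell: TimeInterval = 0.5   // assumed dwell threshold

    /// "First criteria": the gaze must be directed at the first portion long enough.
    func handle(gaze: GazeInput) {
        guard gaze.targetPortion == requiredPortion,
              gaze.dwellDuration >= requiredDwell else { return }
        isControlVisible = true   // display the first control element
    }

    /// A later user input directed at the now-visible control performs the operation.
    func handleUserInput(performOperation: () -> Void) {
        guard isControlVisible else { return }
        performOperation()
    }
}
```
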
  • Publication number: 20240152256
    Abstract: A computer system concurrently displays, via a display generation component, a browser toolbar, for a browser that includes a plurality of tabs and a window including first content associated with a first tab of the plurality of tabs. The browser toolbar and the window are overlaying a view of a three-dimensional environment. While displaying the browser toolbar and the window that includes the first content overlaying the view of the three-dimensional environment, the computer system detects a first air gesture that meets first gesture criteria, the air gesture comprising a gaze input directed at a location in the view of the three-dimensional environment that is occupied by the browser toolbar and a hand movement. In response to detecting the first air gesture that meets the first gesture criteria, the computer system displays second content in the window, the second content associated with a second tab of the plurality of tabs.
    Type: Application
    Filed: September 18, 2023
    Publication date: May 9, 2024
    Inventors: Jonathan R. Dascola, Nathan Gitter, Jay Moon, Stephen O. Lemay, Joseph M.W. Luxton, Angel Suet Y. Cheung, Danielle M. Price, Hugo D. Verweij, Kristi E.S. Bauerly, Katherine W. Kolombatovich, Jordan A. Cazamias
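
A minimal Swift sketch of the air-gesture tab switch described above, under stated assumptions: the `AirGesture` type, the `Region` enum, and the 0.05 m movement threshold are invented here and do not come from the application.

```swift
// Hypothetical composition of the air gesture described above: a gaze landing
// on the browser toolbar combined with a lateral hand movement.
enum Region { case browserToolbar, window, environment }

struct AirGesture {
    let gazeTarget: Region
    let handTranslation: Double   // signed lateral hand movement, metres (assumed unit)
}

struct TabbedBrowserWindow {
    var tabs: [String]
    var activeTabIndex = 0

    /// Assumed "first gesture criteria": gaze rests on the toolbar and the hand
    /// moves far enough to count as a deliberate tab switch.
    mutating func handle(_ gesture: AirGesture) {
        guard !tabs.isEmpty,
              gesture.gazeTarget == .browserToolbar,
              abs(gesture.handTranslation) > 0.05 else { return }
        let step = gesture.handTranslation > 0 ? 1 : -1
        activeTabIndex = (activeTabIndex + step + tabs.count) % tabs.count
        // Displaying the newly active tab's content in the window would follow here.
    }
}
```
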
  • Patent number: 11968056
    Abstract: Avatars may be displayed in a multiuser communication session using various spatial modes. One technique for presenting avatars includes presenting avatars such that an attention direction of the avatar is retargeted to match the intent of the remote user corresponding to the avatar. Another technique for presenting avatars includes a pinned mode in which a spatial relationship between one or more avatars remains displayed in a consistent spatial relationship to a local user regardless of movements of the local user. Another technique for presenting avatars includes providing user-selectable presentation modes between a room scale mode and a stationary mode for presenting a representation of a multiuser communication session.
    Type: Grant
    Filed: March 23, 2023
    Date of Patent: April 23, 2024
    Assignee: Apple Inc.
    Inventors: Connor A. Smith, Bruno M. Sommer, Jonathan R. Dascola, Nicholas W. Henderson, Timofey Grechkin
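
The three presentation techniques summarized in the abstract above can be sketched as a hypothetical mode selection; the names below are illustrative, not the patent's terminology.

```swift
// Hypothetical enumeration of the avatar presentation techniques described above.
enum AvatarPresentationMode {
    /// The avatar's attention direction is retargeted to match the remote user's intent.
    case attentionRetargeted
    /// Avatars keep a fixed spatial relationship to the local user regardless of the
    /// local user's movement.
    case pinned
    /// User-selectable modes for presenting the session representation.
    case roomScale
    case stationary
}

struct MultiuserSessionView {
    var mode: AvatarPresentationMode = .roomScale

    /// Switching between room-scale and stationary presentation is user selectable.
    mutating func select(_ newMode: AvatarPresentationMode) {
        mode = newMode
    }
}
```
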
  • Publication number: 20240126425
    Abstract: An electronic device displays a wake screen that includes a representative image in a sequence of images. While displaying the wake screen, the electronic device detects an input that includes a contact over the wake screen. In response to detecting the contact over the wake screen, the electronic device displays, in sequence, a plurality of images in the sequence of images.
    Type: Application
    Filed: November 28, 2023
    Publication date: April 18, 2024
    Inventors: Nicholas V. King, Christopher P. Foss, Jonathan R. Dascola, Daniel T. Preston, Behkish J. Manzari, Justin S. Titi, Henrique D. Penha, Graham R. Clarke
  • Patent number: 11954242
    Abstract: A computer system presents first computer-generated content. While presenting the first computer-generated content, the computer system detects first movement of a first user in a physical environment, and in response: in accordance with a determination that the first movement changes a spatial relationship between the first user and a second user in the physical environment from a first spatial relationship to a second spatial relationship and that the change in spatial relationship meets first criteria, the computer system changes one or more output properties of the first computer-generated content; and in accordance with the determination that the first movement changes the spatial relationship from the first spatial relationship to the second spatial relationship and that the change in spatial relationship does not meet the first criteria, the computer system presents the first computer-generated content without changing the one or more output properties of the first computer-generated content.
    Type: Grant
    Filed: December 28, 2021
    Date of Patent: April 9, 2024
    Assignee: APPLE INC.
    Inventors: Jonathan R. Dascola, Israel Pastrana Vicente, Peter D. Anton, Stephen O. Lemay, William A. Sorrentino, III, Kristi E. S. Bauerly, Philipp Rockel, Dorian D. Dargan
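
A hedged Swift sketch of the movement criteria described above. The patent does not specify the first criteria; here they are assumed, purely for illustration, to be the second user coming within 1.2 m of the first user.

```swift
import simd

// Hypothetical check of the "first criteria": the second user entering a
// conversational distance of the first user changes the content's output properties.
struct ContentPresenter {
    var audioVolume: Float = 1.0
    var immersionLevel: Float = 1.0
    private let proximityThreshold: Float = 1.2   // metres, assumed

    mutating func userMoved(firstUser: SIMD3<Float>, secondUser: SIMD3<Float>) {
        let distance = simd_distance(firstUser, secondUser)
        if distance < proximityThreshold {
            // Criteria met: change output properties of the computer-generated content.
            audioVolume = 0.3
            immersionLevel = 0.5
        }
        // Otherwise: keep presenting the content with unchanged output properties.
    }
}
```
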
  • Publication number: 20240103681
    Abstract: A computer system displays a first user interface object and a first control element with a first appearance that is associated with performing a first operation with respect to the first user interface object, in a first view of a three-dimensional environment. The computer system detects a first gaze input that is directed to the first control element, and in response, updates an appearance of the first control element from the first appearance to a second appearance that is different from the first appearance. While displaying the first control element with the second appearance, the computer system detects a first user input directed to the first control element, and in accordance with a determination that the first user input meets first criteria, updates the appearance of the first control element from the second appearance to a third appearance that is different from the first appearance and the second appearance.
    Type: Application
    Filed: September 21, 2023
    Publication date: March 28, 2024
    Inventors: Lee S. Broughton, Israel Pastrana Vicente, Matan Stauber, Jonathan R. Dascola, Stephen O. Lemay, Zoey C. Taylor
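
A minimal sketch of the three appearance states described above, assuming hypothetical names (`idle`, `gazeHighlighted`, `activated`) for the first, second, and third appearances.

```swift
// Hypothetical appearance states for the control element described above.
enum ControlAppearance { case idle, gazeHighlighted, activated }

struct ControlElement {
    private(set) var appearance: ControlAppearance = .idle   // first appearance

    /// A gaze input directed at the control moves it to the second appearance.
    mutating func gazeDidLand() {
        if appearance == .idle { appearance = .gazeHighlighted }
    }

    /// A user input that meets the first criteria (e.g. an air pinch, assumed)
    /// moves it from the second appearance to the third.
    mutating func userInputDidMeetFirstCriteria() {
        if appearance == .gazeHighlighted { appearance = .activated }
    }
}
```
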
  • Publication number: 20240103694
    Abstract: An electronic device with a display and a touch-sensitive surface: displays a first user interface that includes a plurality of selectable objects; while a focus selector is at a location that corresponds to a respective selectable object, detects an input that includes detecting a contact on the touch-sensitive surface; and in response to detecting the input: in accordance with a determination that the input meets input criteria, including a criterion that is met when the contact meets a respective input threshold, displays a menu that includes contact information for the respective selectable object overlaid on top of the first user interface; and in accordance with a determination that detecting the input includes detecting a liftoff of the contact without meeting the input criteria, replaces display of the first user interface with display of a second user interface.
    Type: Application
    Filed: December 1, 2023
    Publication date: March 28, 2024
    Inventors: Christopher P. Foss, Jonathan R. Dascola, Marcos Alonso Ruiz, Chanaka G. Karunamuni, Stephen O. Lemay, Gregory M. Apodaca, Wan Si Wan, Kenneth L. Kocienda, Sebastian J. Bauer, Alan C. Dye, Jonathan Ive
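
A hedged sketch of the two outcomes described above; the 0.5 force threshold and the `TouchOutcome` names are assumptions, not values from the application.

```swift
// Hypothetical resolution of the touch input described above: a press that
// reaches the input threshold shows a contact-information overlay; a lift-off
// that never reaches it replaces the first user interface with the second.
enum TouchOutcome { case showContactMenuOverlay, navigateToSecondUserInterface }

func resolve(touchForce: Double, lifted: Bool, threshold: Double = 0.5) -> TouchOutcome? {
    if touchForce >= threshold {
        return .showContactMenuOverlay          // input criteria met
    }
    if lifted {
        return .navigateToSecondUserInterface   // lift-off without meeting the criteria
    }
    return nil                                  // still ambiguous; keep tracking the contact
}
```
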
  • Publication number: 20240104877
    Abstract: In some embodiments, a computer system applies a time of day setting to a virtual environment. In some embodiments, the time of day setting is updated based on an event. In some embodiments, a computer system displays content in an expanded display mode. In some embodiments, computer systems join a communication session while maintaining display of respective environments. In some embodiments, a computer system moves a portal based on user movement. In some embodiments, computer systems share a virtual environment. Computer systems can display media with simulated lighting. Computer systems can share an environment. In some embodiments, a computer system selects a position relative to content. A computer system can present representations of communication session participants based on content. A computer system can present user interfaces to control visual appearances of an environment including media. Computer systems can change an appearance of an environment based on environmental modes.
    Type: Application
    Filed: September 24, 2023
    Publication date: March 28, 2024
    Inventors: Nicholas W. HENDERSON, James M. DESSERO, Matan STAUBER, Katherine W. KOLOMBATOVICH, Stephen O. LEMAY, William A. SORRENTINO, III, Jonathan R. DASCOLA
  • Publication number: 20240103676
    Abstract: A gaze virtual object is displayed that is selectable based on attention directed to the gaze virtual object to perform an operation associated with a selectable virtual object. An indication of attention of a user is displayed. An enlarged view of a region of a user interface is displayed. A value of a slider element is adjusted based on attention of a user. A user interface element is moved at a respective rate based on attention of a user. Text is entered into a text entry field in response to speech inputs. A value for a value selection user interface object is updated based on attention of a user. Movement of a virtual object is facilitated based on direct touch interactions. A user input is facilitated for displaying a selection refinement user interface object. A visual indicator is displayed indicating progress toward selecting a virtual object when criteria are met.
    Type: Application
    Filed: September 22, 2023
    Publication date: March 28, 2024
    Inventors: Israel PASTRANA VICENTE, Danielle M. PRICE, Evgenii KRIVORUCHKO, Jonathan R. DASCOLA, Jonathan RAVASZ, Marcos ALONSO, Hugo D. VERWEIJ, Zoey C. TAYLOR, Lorena S. PAZMINO
  • Publication number: 20240104859
    Abstract: The present disclosure generally relates to managing live communication sessions. A computer system optionally displays an option to invite the respective user to join the ongoing communication session. A computer system optionally displays one or more options to modify an appearance of an avatar representing the user of the computer system. A computer system optionally transitions a communication session from a spatial communication session to a non-spatial communication session. A computer system optionally displays information about a participant in a communication session.
    Type: Application
    Filed: September 12, 2023
    Publication date: March 28, 2024
    Inventors: Jesse CHAND, Kristi E. BAUERLY, Shih-Sang CHIU, Jonathan R. DASCOLA, Amy E. DEDONATO, Karen EL ASMAR, Wesley M. HOLDER, Stephen O. LEMAY, Lorena S. PAZMINO, Jason D. RICKWALD, Giancarlo YERKES
  • Publication number: 20240094866
    Abstract: A computer system displays a first view of a user interface of a first application with a first size at a first position corresponding to a location of at least a portion of a palm that is currently facing a viewpoint corresponding to a view of a three-dimensional environment provided via a display generation component. While displaying the first view, the computer system detects a first input that corresponds to a request to transfer display of the first application from the palm to a first surface that is within a first proximity of the viewpoint. In response to detecting the first input, the computer system displays a second view of the user interface of the first application with a second size and an orientation that corresponds to the first surface at a second position defined by the first surface.
    Type: Application
    Filed: November 29, 2023
    Publication date: March 21, 2024
    Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E.S. Bauerly, Giancarlo Yerkes, Peter D. Anton
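
A rough Swift sketch of the palm-to-surface transfer described above; the `Placement` enum, the sizes, and the use of a surface normal for orientation are assumptions made for illustration.

```swift
import simd

// Hypothetical placement states for the application view described above.
enum Placement {
    case palm(position: SIMD3<Float>)                            // first view, anchored to the palm
    case surface(position: SIMD3<Float>, normal: SIMD3<Float>)   // second view, anchored to a surface
}

struct AppWindow {
    var placement: Placement
    var size: Float   // extent in metres, assumed unit

    /// Transfer the app from the palm to a nearby surface: the second view is
    /// displayed with a larger size and an orientation taken from the surface.
    mutating func transferToSurface(at position: SIMD3<Float>, normal: SIMD3<Float>) {
        placement = .surface(position: position, normal: simd_normalize(normal))
        size = 1.0   // second, larger size (assumed value)
    }
}
```
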
  • Patent number: 11934569
    Abstract: A computer system displays a first and second user interface object in a three-dimensional environment. The first and second user interface objects have a first and second spatial relationship to a first and second anchor position corresponding to a location of a user's hand in a physical environment, respectively. While displaying the first and second user interface objects in the three-dimensional environment, the computer system detects movement of the user's hand in the physical environment, corresponding to a translational movement and a rotational movement of the user's hand relative to a viewpoint, and in response, translates the first and second user interface objects relative to the viewpoint in accordance with the translational movement of the user's hand, and rotates the first user interface object relative to the viewpoint in accordance with the rotational movement of the user's hand without rotating the second user interface object.
    Type: Grant
    Filed: September 19, 2022
    Date of Patent: March 19, 2024
    Assignee: APPLE INC.
    Inventors: Israel Pastrana Vicente, Jonathan R. Dascola, Christopher D. McKenzie, Jesse Chand, Stephen O. Lemay, Kristi E. S. Bauerly, Zoey C. Taylor
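
A minimal sketch of the asymmetric behavior described above, using simd types: both objects follow the hand's translation, but only the first follows its rotation. The struct and its fields are hypothetical.

```swift
import simd

// Hypothetical hand-anchored objects: both follow the hand's translation,
// but only the first follows the hand's rotation.
struct HandAnchoredObjects {
    var firstPosition = SIMD3<Float>(0, 0, 0)
    var firstOrientation = simd_quatf(angle: 0, axis: SIMD3<Float>(0, 1, 0))
    var secondPosition = SIMD3<Float>(0.1, 0, 0)
    var secondOrientation = simd_quatf(angle: 0, axis: SIMD3<Float>(0, 1, 0))

    mutating func handMoved(translation: SIMD3<Float>, rotation: simd_quatf) {
        // Translational movement of the hand translates both objects relative to the viewpoint.
        firstPosition += translation
        secondPosition += translation
        // Rotational movement of the hand rotates only the first object.
        firstOrientation = rotation * firstOrientation
    }
}
```
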
  • Patent number: 11921975
    Abstract: An application launching user interface that includes a plurality of application icons for launching corresponding applications is displayed. A first touch input is detected on a first application icon of the plurality of application icons. The first application icon is for launching a first application that is associated with one or more corresponding quick actions. If the first touch input meets one or more application-launch criteria which require that the first touch input has ended without having met a first input threshold, the first application is launched in response to the first touch input. If the first touch input meets one or more quick-action-display criteria which require that the first touch input meets the first input threshold, one or more quick action objects associated with the first application are concurrently displayed along with the first application icon without launching the first application, in response to the first touch input.
    Type: Grant
    Filed: November 24, 2020
    Date of Patent: March 5, 2024
    Assignee: APPLE INC.
    Inventors: Jonathan R. Dascola, Marcos Alonso Ruiz, Chanaka G. Karunamuni, Stephen O. Lemay, Gregory M. Apodaca, Nicholas V. King, Daniel T. Preston
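
A hedged sketch of the launch-versus-quick-actions branch described above; the force threshold of 0.6 and the `IconPressResult` type are invented for illustration.

```swift
// Hypothetical resolution of the two branches described above: a touch that ends
// before reaching the first input threshold launches the app, while a touch that
// reaches the threshold reveals the quick action objects without launching.
enum IconPressResult { case launchApp, showQuickActions }

struct AppIconPress {
    let maximumForce: Double
    let ended: Bool
    static let firstInputThreshold = 0.6   // assumed threshold

    var result: IconPressResult? {
        if maximumForce >= Self.firstInputThreshold { return .showQuickActions }
        if ended { return .launchApp }
        return nil   // press still in progress; no branch taken yet
    }
}
```
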
  • Publication number: 20240061567
    Abstract: The present disclosure relates to user interfaces for manipulating user interface objects. A device, including a display and a rotatable input mechanism, is described in relation to manipulating user interface objects. In some examples, the manipulation of the object is a scroll, zoom, or rotate of the object. In other examples, objects are selected in accordance with simulated magnetic properties.
    Type: Application
    Filed: May 22, 2023
    Publication date: February 22, 2024
    Inventors: Nicholas ZAMBETTI, Imran CHAUDHRI, Jonathan R. DASCOLA, Alan C. DYE, Christopher Patrick FOSS, Aurelio GUZMAN, Chanaka G. KARUNAMUNI, Duncan Robert KERR, Christopher WILSON, Eric Lance WILSON, Lawrence Y. YANG, Gary Ian BUTCHER, Nathan DE VRIES, Jonathan P. IVE
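
One way to read the "simulated magnetic properties" mentioned above is as snap-to-nearest behavior when rotation of the input mechanism stops; the sketch below assumes exactly that and is not the patented mechanism.

```swift
// Hypothetical "simulated magnetism": when crown rotation stops, the scroll
// offset settles on the nearest object position, standing in for the object
// whose simulated magnetic attraction is strongest.
struct MagneticScroller {
    var offset: Double = 0
    let objectPositions: [Double]

    mutating func crownRotated(by delta: Double) {
        offset += delta
    }

    mutating func crownStopped() {
        let current = offset
        guard let target = objectPositions.min(by: {
            abs($0 - current) < abs($1 - current)
        }) else { return }
        offset = target   // snap to the selected object
    }
}
```
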
  • Publication number: 20240045579
    Abstract: A representation displayed in a three-dimensional environment may be selected with different types of selection inputs. When a representation displayed in the three-dimensional environment is selected, an application corresponding to the representation may be launched in the three-dimensional environment in accordance with the type of selection received. In such embodiments, when the representation is selected with a first type of selection, the application corresponding to the selected representation may be launched in a first manner, and when the representation is selected with a second type of selection, the application corresponding to the selected representation may be launched in a second manner. In some embodiments, the manner in which the application is launched may determine whether one or more previously launched applications continue to be displayed or cease to be displayed in the three-dimensional environment.
    Type: Application
    Filed: December 21, 2021
    Publication date: February 8, 2024
    Inventors: Nathan GITTER, Aaron M. BURNS, Benjamin HYLAK, Jonathan R. DASCOLA, Alexis H. PALANGIE
  • Publication number: 20240036703
    Abstract: The present disclosure relates to electronic message user interfaces. A device, including a display, a touch-sensitive surface, and a rotatable input mechanism, is described in relation to accessing, composing, and manipulating electronic messages. In response to detecting the user input activating the electronic conversation object, the device displays one or more messages in an electronic conversation corresponding to the activated electronic conversation object. While displaying the electronic conversation, user input is received. If the user input is a rotation of the rotatable input mechanism, the device displays, on the display, an affordance associated with replying to the electronic conversation.
    Type: Application
    Filed: September 29, 2023
    Publication date: February 1, 2024
    Inventors: Lawrence Y. YANG, Stephen O. LEMAY, Alan C. DYE, Christopher Patrick FOSS, Christopher WILSON, Imran CHAUDHRI, Gary Ian BUTCHER, Jonathan R. DASCOLA
  • Publication number: 20240029734
    Abstract: At an electronic device with a display, a microphone, and an input device: while the display is on, receiving user input via the input device, the user input meeting a predetermined condition; in accordance with receiving the user input meeting the predetermined condition, sampling audio input received via the microphone; determining whether the audio input comprises a spoken trigger; and in accordance with a determination that the audio input comprises the spoken trigger, triggering a virtual assistant session.
    Type: Application
    Filed: September 26, 2023
    Publication date: January 25, 2024
    Inventors: Stephen O. LEMAY, Brandon J. NEWENDORP, Jonathan R. DASCOLA
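
A minimal sketch of the sampling-then-trigger flow described above; the trigger phrase and the notion of an already-transcribed string are placeholders, not the actual detection pipeline.

```swift
import Foundation

// Hypothetical flow: a qualifying user input (e.g. raising the wrist, assumed)
// starts sampling the microphone, and a detected spoken trigger phrase
// starts a virtual assistant session.
final class TriggerListener {
    private var isSampling = false
    private let triggerPhrase = "hey assistant"   // placeholder phrase

    func userInputMetPredeterminedCondition() {
        isSampling = true   // begin sampling audio received via the microphone
    }

    func received(transcribedAudio: String, startSession: () -> Void) {
        guard isSampling,
              transcribedAudio.lowercased().contains(triggerPhrase) else { return }
        startSession()       // trigger the virtual assistant session
        isSampling = false
    }
}
```
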
  • Publication number: 20240020371
    Abstract: The present disclosure generally relates to user interfaces for electronic devices, including user interfaces for user authentication and device management.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 18, 2024
    Inventors: Amy E. DEDONATO, Jonathan R. DASCOLA, Katherine W. KOLOMBATOVICH, Vitalii KRAMAR, Jay MOON
  • Publication number: 20240019999
    Abstract: An electronic device displays a control user interface that includes a plurality of control affordances. The device detects a first input directed to a location that corresponds to a first control affordance of the plurality of control affordances. In response to detecting the first input, if the first input meets control toggle criteria, the device toggles a function of a control that corresponds to the first control affordance. And if the first input meets enhanced control criteria, the device displays modification options for the control that corresponds to the first control affordance. While displaying the modification options, the device detects a second input that activates a modification option of the modification options and, accordingly, modifies the control that corresponds to the first control affordance.
    Type: Application
    Filed: July 11, 2023
    Publication date: January 18, 2024
    Inventors: Jonathan R. Dascola, Chanaka G. Karunamuni, Christopher P. Foss, Sebastian J. Bauer, Arian Behzadi, David C. Graham
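
A hedged sketch of the toggle-versus-modification branch described above; the duration and force thresholds standing in for the "enhanced control criteria" are assumptions.

```swift
// Hypothetical resolution of the two input branches described above: a quick tap
// toggles the control's function, while a long or firm press (assumed to stand in
// for the enhanced control criteria) expands it into its modification options.
enum ControlAction { case toggleFunction, showModificationOptions }

func resolveControlInput(pressDuration: Double, force: Double) -> ControlAction {
    let enhancedCriteriaMet = pressDuration > 0.4 || force > 0.6   // assumed thresholds
    return enhancedCriteriaMet ? .showModificationOptions : .toggleFunction
}
```
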
  • Patent number: 11875013
    Abstract: A computer system detects a wrist. In accordance with a determination that first criteria that require an inner side of the wrist facing toward a viewpoint are met, the computer system displays a first user interface object including a plurality of representations of different applications at a first position corresponding to a first location on the wrist. While displaying the first user interface object, the computer system detects that the wrist's position or orientation has changed so as to satisfy second criteria that require an outer side of the wrist facing toward the viewpoint. In response, the computer system switches from displaying the first user interface object at the first position to displaying a second user interface object including a plurality of controls for controlling functions at a second position corresponding to a location on a back of a hand attached to the wrist.
    Type: Grant
    Filed: December 10, 2020
    Date of Patent: January 16, 2024
    Assignee: APPLE INC.
    Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Robert T. Tilton, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E. S. Bauerly, Giancarlo Yerkes, Peter D. Anton
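
A minimal geometric sketch of the inner/outer wrist check described above, assuming the first and second criteria reduce to the sign of a dot product between the wrist's inner-side normal and the direction toward the viewpoint; the function and enum names are hypothetical.

```swift
import simd

// Hypothetical switch between the two wrist-anchored user interface objects:
// the inner side of the wrist facing the viewpoint shows app representations,
// the outer side (back of the hand) shows a controls panel.
enum WristUI { case appRepresentations, controlsPanel }

/// `wristInnerNormal` and `towardViewpoint` are assumed to be unit vectors.
func wristUserInterface(wristInnerNormal: SIMD3<Float>,
                        towardViewpoint: SIMD3<Float>) -> WristUI {
    // Positive dot product: the inner side of the wrist faces the viewpoint.
    simd_dot(wristInnerNormal, towardViewpoint) > 0 ? .appRepresentations : .controlsPanel
}
```
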