Patents by Inventor James N. JONES

James N. JONES has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250240516
    Abstract: User interfaces integrating variable hardware inputs are described, including user interfaces that perform different operations in response to presses of different intensities with and without movement, user interfaces that perform different media capture operations based on presses of different lengths with and without movement, user interfaces with controls that can be customized by swiping near a hardware control, user interfaces for media playback with dynamic speed adjustments, and user interfaces that perform adjustments before and after contact with a hardware control ends.
    Type: Application
    Filed: September 20, 2024
    Publication date: July 24, 2025
    Inventors: Adrian ZUMBRUNNEN, Johnnie B. MANZARI, James N. JONES, Jennifer A. LAPLACA, Ramon GILABERT LLOP, Craig M. FEDERIGHI, Nathan M. PACZAN, Jeffrey E. WILLIAMS, Jamie L. MYROLD, William A. SORRENTINO, III, Nathan DE VRIES
  • Publication number: 20250036267
    Abstract: Systems and processes for providing and updating user interfaces for search functions are provided. An example process includes, at a computer system including a touch sensitive display and one or more input devices: receiving, via the one or more input devices, a request to display a user interface; in response to receiving the request to display the user interface, displaying, via the touch sensitive display, the user interface including an affordance including an indication of a number of pages of the user interface; and while displaying the user interface: detecting an occurrence of a predetermined condition; and in response to detecting the occurrence of the predetermined condition, updating the affordance to include an indication of a search function.
    Type: Application
    Filed: October 10, 2024
    Publication date: January 30, 2025
    Inventors: Cameron BURGESS, Alan C. DYE, Craig M. FEDERIGHI, Ramon GILABERT LLOP, James N. JONES, Jamie L. MYROLD
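    The affordance behavior described in the abstract above can be sketched as a small state function. The names and return values here are illustrative assumptions, not taken from the patent:

    ```python
    def affordance_label(page_count: int, condition_detected: bool) -> str:
        """Return what the affordance indicates at a given moment.

        Hypothetical sketch: before the predetermined condition occurs, the
        affordance indicates the number of pages of the user interface;
        after the condition is detected, it indicates the search function.
        """
        if condition_detected:
            return "search"
        return f"{page_count} pages"
    ```

    For example, `affordance_label(3, False)` yields the page indicator, while `affordance_label(3, True)` yields the search indication.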
  • Publication number: 20240370141
    Abstract: Systems and processes for providing search user interfaces are provided. An example process includes while displaying a home screen, in response to detecting a gesture input, displaying a search user interface; while displaying the search user interface, detecting an input selecting a result; in response to detecting the input selecting the result, displaying an application user interface associated with the result; while displaying the application user interface: detecting an input; and in response to detecting the input: in accordance with a determination that the input is detected before a time threshold, displaying the search user interface with a state corresponding to the detection of the input selecting the result; in accordance with a determination that the input is detected after the time threshold or after another input is detected, displaying the home screen.
    Type: Application
    Filed: April 9, 2024
    Publication date: November 7, 2024
    Inventors: Cameron BURGESS, Nathan DE VRIES, Ramon GILABERT LLOP, James N. JONES, Shubham KEDIA, Jedidiah LEWISON, Jamie L. MYROLD, Nikhil SANCHETI
  • Publication number: 20240364645
    Abstract: The present disclosure generally relates to techniques and user interfaces for creating, editing, and using stickers. In some embodiments, a sticker can be created using a media item. In some embodiments, a sticker can be edited to apply a visual effect. In some embodiments, a sticker can be suggested for use in a messaging interface.
    Type: Application
    Filed: November 27, 2023
    Publication date: October 31, 2024
    Inventors: Adrian ZUMBRUNNEN, Christian X. DALONZO, James N. JONES, Johnnie B. MANZARI, Grant R. PAUL, Marcel VAN OS
  • Publication number: 20240319959
    Abstract: An example process includes: displaying, on a display of an electronic device, an extended reality (XR) environment corresponding to a copresence session including the electronic device and a second electronic device; while displaying the XR environment: sampling, with a microphone of the electronic device, a first audio input; determining whether the first audio input is intended for a first digital assistant operating on an external electronic device; and in accordance with a determination that the first audio input is intended for the first digital assistant: causing the first digital assistant to provide an audible response to the first audio input, where the audible response is not transmitted to the second electronic device over a shared communication channel for the copresence session.
    Type: Application
    Filed: February 23, 2022
    Publication date: September 26, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA, Stephen O. LEMAY
  • Publication number: 20240256100
    Abstract: The present disclosure generally relates to methods and user interfaces for managing visual content at a computer system. In some embodiments, methods and user interfaces for managing visual content in media are described. In some embodiments, methods and user interfaces for managing visual indicators for visual content in media are described. In some embodiments, methods and user interfaces for inserting visual content in media are described. In some embodiments, methods and user interfaces for identifying visual content in media are described. In some embodiments, methods and user interfaces for translating visual content in media are described. In some embodiments, methods and user interfaces for managing user interface objects for visual content in media are described.
    Type: Application
    Filed: March 20, 2024
    Publication date: August 1, 2024
    Inventors: Grant R. PAUL, Steven D. BAKER, Brandon J. COREY, Neil G. CRANE, Matthias DANTONE, Nathan DE VRIES, Craig M. FEDERIGHI, James N. JONES, Xishuo LIU, Johnnie B. MANZARI, Sebastien V. MARINEAU-MES, Pulah J. SHAH, Andre SOUZA DOS SANTOS, Srinivasan VENKATACHARY, Yang ZHAO
  • Publication number: 20240231558
    Abstract: An example process includes while displaying, on a display, an extended reality (XR) environment: receiving a user input; sampling, with a microphone, a user speech input; in accordance with a determination that the user input satisfies a criterion for initiating a digital assistant, initiating the digital assistant, including: displaying, within the XR environment, a digital assistant indicator at a first location of the XR environment; and while displaying the digital assistant indicator at the first location, providing, by the digital assistant, a response to the user speech input; after providing the response, ceasing to display the digital assistant indicator at the first location; and in accordance with ceasing to display the digital assistant indicator at the first location, displaying the digital assistant indicator at a second location of the XR environment, the second location corresponding to a physical location of an external electronic device implementing a second digital assistant.
    Type: Application
    Filed: February 22, 2022
    Publication date: July 11, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA, Stephen O. LEMAY, William A. SORRENTINO
  • Publication number: 20240220292
    Abstract: Systems and processes for operating an intelligent automated assistant are provided. Upon receiving a user input requesting that a digital assistant initiate a task, performance of the task is initiated, and a task response is provided. If the digital assistant determines that the task is associated with a first user interface, such as a non-digital assistant-specific user interface, the task response is provided using the first user interface.
    Type: Application
    Filed: March 12, 2024
    Publication date: July 4, 2024
    Inventors: Neal S. ELLIS, Cameron BURGESS, Pedro MARI, Michael R. SUMNER, Trungtin TRAN, James N. JONES
  • Publication number: 20240134492
    Abstract: An example process includes while displaying, on a display, an extended reality (XR) environment: receiving a user input; sampling, with a microphone, a user speech input; in accordance with a determination that the user input satisfies a criterion for initiating a digital assistant, initiating the digital assistant, including: displaying, within the XR environment, a digital assistant indicator at a first location of the XR environment; and while displaying the digital assistant indicator at the first location, providing, by the digital assistant, a response to the user speech input; after providing the response, ceasing to display the digital assistant indicator at the first location; and in accordance with ceasing to display the digital assistant indicator at the first location, displaying the digital assistant indicator at a second location of the XR environment, the second location corresponding to a physical location of an external electronic device implementing a second digital assistant.
    Type: Application
    Filed: February 22, 2022
    Publication date: April 25, 2024
    Inventors: Jessica J. PECK, James N. JONES, Ieyuki KAWASHIMA, Lynn I. STREJA
  • Publication number: 20230393872
    Abstract: Systems and processes for operating an intelligent automated assistant are provided. Upon receiving a user input requesting that a digital assistant initiate a task, performance of the task is initiated, and a task response is provided. If the digital assistant determines that the task is associated with a first user interface, such as a non-digital assistant-specific user interface, the task response is provided using the first user interface.
    Type: Application
    Filed: September 19, 2022
    Publication date: December 7, 2023
    Inventors: Neal S. ELLIS, Cameron BURGESS, Pedro MARI, Michael R. SUMNER, Trungtin TRAN, James N. JONES
  • Publication number: 20230379427
    Abstract: The present disclosure generally relates to managing media representations. In some embodiments, systems, methods, and user interfaces are provided for managing the background of a media representation, copying subjects of a media representation, converting one or more portions of a media representation, providing descriptions for one or more symbols in a media representation, and providing one or more animations for one or more detected objects and/or subjects.
    Type: Application
    Filed: August 7, 2023
    Publication date: November 23, 2023
    Inventors: Johnnie B. MANZARI, Grant R. PAUL, James N. JONES, Adrian ZUMBRUNNEN
  • Publication number: 20230367777
    Abstract: Systems and processes for providing a search interface with contextual suggestions are provided. For example, a first user input to activate a search interface is received. In response to receiving the first user input, a first context associated with the electronic device is identified. In accordance with a determination that a first confidence score associated with the first context exceeds a first threshold confidence score, a first task corresponding to the first context is displayed. In accordance with a determination that the first confidence score associated with the first context does not exceed a threshold confidence score, a second task associated with a previous interaction context is displayed.
    Type: Application
    Filed: October 21, 2022
    Publication date: November 16, 2023
    Inventors: Cameron BURGESS, James N. JONES, Jamie L. MYROLD, Matthew E. HURLEY, Sofiane TOUDJI
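    The confidence gating described in the abstract above reduces to a simple threshold check. This is a hypothetical sketch; the function and argument names are illustrative, not from the patent:

    ```python
    def pick_suggestion(context_task: str, fallback_task: str,
                        confidence: float, threshold: float) -> str:
        """Choose which task to surface in the search interface.

        Sketch of the abstract's branching: the task derived from the
        current device context is displayed only when its confidence score
        exceeds the threshold; otherwise a task associated with a previous
        interaction context is displayed instead.
        """
        if confidence > threshold:
            return context_task
        return fallback_task
    ```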
  • Publication number: 20230367458
    Abstract: Systems and processes for providing and updating user interfaces for search functions are provided. An example process includes, at a computer system including a touch sensitive display and one or more input devices: receiving, via the one or more input devices, a request to display a user interface; in response to receiving the request to display the user interface, displaying, via the touch sensitive display, the user interface including an affordance including an indication of a number of pages of the user interface; and while displaying the user interface: detecting an occurrence of a predetermined condition; and in response to detecting the occurrence of the predetermined conditions, updating the affordance to include an indication of a search function.
    Type: Application
    Filed: September 20, 2022
    Publication date: November 16, 2023
    Inventors: Cameron BURGESS, Ramon GILABERT LLOP, James N. JONES, Jamie L. MYROLD, Alan C. DYE, Eric L. WILSON
  • Publication number: 20230367795
    Abstract: Systems and processes for navigating and performing device tasks using a search interface are provided. For example, a first user input including at least one character is received. Based on the first user input, a first task among a plurality of tasks is displayed, wherein the first task is associated with a confidence value exceeding a threshold confidence value. A second input corresponding to an activation on the first task is received. In accordance with a determination that the activated task corresponds to a predetermined task type, performance of the activated task is initiated, and a respective interface corresponding to the initiated task is displayed. In accordance with a determination that the activated task does not correspond to the predetermined task type, the system causes the activated task to be performed without displaying the respective interface.
    Type: Application
    Filed: October 12, 2022
    Publication date: November 16, 2023
    Inventors: Cameron BURGESS, James N. JONES, Jamie L. MYROLD, Ari R. WEINSTEIN
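    The task-dispatch logic in the abstract above can be sketched as follows. All names are hypothetical; the abstract does not specify an API:

    ```python
    def perform_activated_task(task: str, predetermined_types: set) -> dict:
        """Dispatch a task activated from the search interface.

        Sketch of the abstract's branching: a task of the predetermined
        type is performed and a corresponding interface is displayed; any
        other task is performed without displaying that interface.
        """
        shows_interface = task in predetermined_types
        return {
            "task": task,
            "performed": True,               # the task is performed in both branches
            "interface_displayed": shows_interface,
        }
    ```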
  • Publication number: 20230352007
    Abstract: An example process includes: receiving a first natural language input; initiating, by a digital assistant operating on the electronic device, a first task based on the first natural language input; determining whether the first task is of a predetermined type; and in accordance with a determination that the first task is of the predetermined type: determining whether one or more criteria are satisfied; and providing a response to the first natural language input, where providing the response includes: in accordance with a determination that the one or more criteria are not satisfied, outputting a first sound indicative of the initiated first task and a first verbal response indicative of the initiated first task; and in accordance with a determination that the one or more criteria are satisfied, outputting the first sound without outputting the first verbal response.
    Type: Application
    Filed: September 21, 2022
    Publication date: November 2, 2023
    Inventors: Daniel A. CASTELLANI, James N. JONES, Pedro MARI, Jessica J. PECK, Hugo D. VERWEIJ, Garrett L. WEINBERG
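    The sound-versus-verbal branching in the abstract above can be sketched as a small function. Names and the empty-list behavior for non-predetermined tasks are assumptions; the abstract only specifies the predetermined-type branch:

    ```python
    def response_outputs(is_predetermined_type: bool, criteria_satisfied: bool) -> list:
        """Return which outputs accompany the assistant's response.

        Sketch of the abstract's logic: for a task of the predetermined
        type, a task-indicative sound always plays; the verbal response is
        added only when the one or more criteria are NOT satisfied.
        """
        if not is_predetermined_type:
            return []  # assumption: abstract specifies no outputs for other task types
        outputs = ["sound"]
        if not criteria_satisfied:
            outputs.append("verbal")
        return outputs
    ```

    So when the criteria are satisfied the user hears only the sound; otherwise the sound is followed by the spoken confirmation.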
  • Publication number: 20230319224
    Abstract: The present disclosure generally relates to managing media representations. In some embodiments, systems, methods, and user interfaces are provided for managing the background of a media representation, copying subjects of a media representation, converting one or more portions of a media representation, providing descriptions for one or more symbols in a media representation, and providing one or more animations for one or more detected objects and/or subjects.
    Type: Application
    Filed: February 10, 2023
    Publication date: October 5, 2023
    Inventors: Behkish J. MANZARI, Grant R. PAUL, James N. JONES, Adrian ZUMBRUNNEN
  • Publication number: 20230229279
    Abstract: The present disclosure generally relates to methods and user interfaces for managing visual content at a computer system. In some embodiments, methods and user interfaces for managing visual content in media are described. In some embodiments, methods and user interfaces for managing visual indicators for visual content in media are described. In some embodiments, methods and user interfaces for inserting visual content in media are described. In some embodiments, methods and user interfaces for identifying visual content in media are described. In some embodiments, methods and user interfaces for translating visual content in media are described. In some embodiments, methods and user interfaces for managing user interface objects for visual content in media are described.
    Type: Application
    Filed: March 22, 2023
    Publication date: July 20, 2023
    Inventors: Grant R. PAUL, Kellie L. ALBERT, Nathan DE VRIES, James N. JONES
  • Publication number: 20220334693
    Abstract: The present disclosure generally relates to methods and user interfaces for managing visual content at a computer system. In some embodiments, methods and user interfaces for managing visual content in media are described. In some embodiments, methods and user interfaces for managing visual indicators for visual content in media are described. In some embodiments, methods and user interfaces for inserting visual content in media are described. In some embodiments, methods and user interfaces for identifying visual content in media are described. In some embodiments, methods and user interfaces for translating visual content in media are described.
    Type: Application
    Filed: September 24, 2021
    Publication date: October 20, 2022
    Inventors: Nathan DE VRIES, Thomas DESELAERS, James N. JONES, Behkish J. MANZARI, Grant PAUL, Ron SANTOS, Aya SIBLINI, Xin WANG
  • Publication number: 20220334683
    Abstract: The present disclosure generally relates to methods and user interfaces for managing visual content at a computer system. In some embodiments, methods and user interfaces for managing visual content in media are described. In some embodiments, methods and user interfaces for managing visual indicators for visual content in media are described. In some embodiments, methods and user interfaces for inserting visual content in media are described. In some embodiments, methods and user interfaces for identifying visual content in media are described. In some embodiments, methods and user interfaces for translating visual content in media are described.
    Type: Application
    Filed: September 24, 2021
    Publication date: October 20, 2022
    Inventors: Grant PAUL, Guillaume BORIOS, Adam H. BRADFORD, Jenny CHEN, Thomas DESELAERS, Ryan S. DIXON, James N. JONES, Behkish J. MANZARI, Viktor MILADINOV, Aya SIBLINI, Andre SOUZA DOS SANTOS, Siyang TANG, Xin WANG, Guangyu ZHONG
  • Publication number: 20220337741
    Abstract: The present disclosure generally relates to methods and user interfaces for managing visual content at a computer system. In some embodiments, methods and user interfaces for managing visual content in media are described. In some embodiments, methods and user interfaces for managing visual indicators for visual content in media are described. In some embodiments, methods and user interfaces for inserting visual content in media are described. In some embodiments, methods and user interfaces for identifying visual content in media are described. In some embodiments, methods and user interfaces for translating visual content in media are described.
    Type: Application
    Filed: September 24, 2021
    Publication date: October 20, 2022
    Inventors: Grant PAUL, Francisco ALVARO MUNOZ, Jeffrey A. BRASKET, Brandon J. COREY, Thomas DESELAERS, Nathan DE VRIES, Ryan S. DIXON, Craig M. FEDERIGHI, Vignesh JAGADEESH, James N. JONES, Nicholas LUPINETTI, Behkish J. MANZARI, Ron SANTOS, Vinay SHARMA, Xin WANG, Xishuo LIU, Marco ZULIANI