Patents by Inventor Tim Gasser

Tim Gasser is a named inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Short illustrative code sketches of selected inventions follow the listing.

  • Patent number: 12386667
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for selecting machine-learning models and hardware environments for executing a task. In particular, in one or more embodiments, the disclosed systems select a designated machine-learning model for executing a task based on workload features of the task and task routing metrics for a plurality of machine-learning models. In addition, in one or more embodiments, the disclosed systems select a designated hardware environment for executing the task based on workload features of the task and task routing metrics for a plurality of hardware environments. In some embodiments, the disclosed systems select a fallback machine-learning model and a fallback hardware environment for executing the task if the designated machine-learning model or designated hardware environment is unavailable. Moreover, in one or more embodiments, the disclosed systems can pause and initiate tasks based on bandwidth availability.
    Type: Grant
    Filed: June 3, 2024
    Date of Patent: August 12, 2025
    Assignee: Dropbox, Inc.
    Inventors: Ashok Pancily Poothiyot, Ali Zafar, Anthony Penta, Stephen Voorhees, Tim Gasser, Tsung-Hsiang Chang, Geoff Hulten
  • Patent number: 12373506
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for generating personal responses through retrieval-augmented generation. In particular, the disclosed systems can generate a query embedding from a query generated by an entity and determine data context specific to the entity by comparing the query embedding with a plurality of vectorized segments of content items associated with the entity. The disclosed systems can provide the data context to a large language model and generate a personalized response informed by the data context. Subsequently, the disclosed systems can provide the personalized response for display on a client device associated with the entity.
    Type: Grant
    Filed: June 14, 2024
    Date of Patent: July 29, 2025
    Assignee: Dropbox, Inc.
    Inventors: Anthony Penta, Ashok Pancily Poothiyot, Geoff Hulten, Ameya Bhatawdekar, Tim Gasser, Sateesh Srinivasan, Vasanth Krishna Namasivayam
  • Publication number: 20250238264
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for selecting machine-learning models and hardware environments for executing a task. In particular, in one or more embodiments, the disclosed systems select a designated machine-learning model for executing a task based on workload features of the task and task routing metrics for a plurality of machine-learning models. In addition, in one or more embodiments, the disclosed systems select a designated hardware environment for executing the task based on workload features of the task and task routing metrics for a plurality of hardware environments. In some embodiments, the disclosed systems select a fallback machine-learning model and a fallback hardware environment for executing the task if the designated machine-learning model or designated hardware environment is unavailable. Moreover, in one or more embodiments, the disclosed systems can pause and initiate tasks based on bandwidth availability.
    Type: Application
    Filed: June 3, 2024
    Publication date: July 24, 2025
    Inventors: Ashok Pancily Poothiyot, Ali Zafar, Anthony Penta, Stephen Voorhees, Tim Gasser, Tsung-Hsiang Chang, Geoff Hulten
  • Publication number: 20250238265
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for selecting machine-learning models and hardware environments for executing a task. In particular, in one or more embodiments, the disclosed systems select a designated machine-learning model for executing a task based on workload features of the task and task routing metrics for a plurality of machine-learning models. In addition, in one or more embodiments, the disclosed systems select a designated hardware environment for executing the task based on workload features of the task and task routing metrics for a plurality of hardware environments. In some embodiments, the disclosed systems select a fallback machine-learning model and a fallback hardware environment for executing the task if the designated machine-learning model or designated hardware environment is unavailable. Moreover, in one or more embodiments, the disclosed systems can pause and initiate tasks based on bandwidth availability.
    Type: Application
    Filed: June 3, 2024
    Publication date: July 24, 2025
    Inventors: Ashok Pancily Poothiyot, Ali Zafar, Anthony Penta, Stephen Voorhees, Tim Gasser, Tsung-Hsiang Chang, Geoff Hulten
  • Publication number: 20250238470
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for generating personal responses through retrieval-augmented generation. In particular, the disclosed systems can generate a query embedding from a query generated by an entity and determine data context specific to the entity by comparing the query embedding with a plurality of vectorized segments of content items associated with the entity. The disclosed systems can provide the data context to a large language model and generate a personalized response informed by the data context. Subsequently, the disclosed systems can provide the personalized response for display on a client device associated with the entity.
    Type: Application
    Filed: June 14, 2024
    Publication date: July 24, 2025
    Inventors: Anthony Penta, Ashok Pancily Poothiyot, Geoff Hulten, Ameya Bhatawdekar, Tim Gasser, Sateesh Srinivasan, Vasanth Krishna Namasivayam
  • Publication number: 20250240220
    Abstract: The present disclosure relates to systems, non-transitory computer-readable media, and methods for selecting machine-learning models and hardware environments for executing a task. In particular, in one or more embodiments, the disclosed systems select a designated machine-learning model for executing a task based on workload features of the task and task routing metrics for a plurality of machine-learning models. In addition, in one or more embodiments, the disclosed systems select a designated hardware environment for executing the task based on workload features of the task and task routing metrics for a plurality of hardware environments. In some embodiments, the disclosed systems select a fallback machine-learning model and a fallback hardware environment for executing the task if the designated machine-learning model or designated hardware environment is unavailable. Moreover, in one or more embodiments, the disclosed systems can pause and initiate tasks based on bandwidth availability.
    Type: Application
    Filed: June 3, 2024
    Publication date: July 24, 2025
    Inventors: Ashok Pancily Poothiyot, Ali Zafar, Anthony Penta, Stephen Voorhees, Tim Gasser, Tsung-Hsiang Chang, Geoff Hulten
  • Patent number: 8354990
    Abstract: In one embodiment of the present invention, a drive circuit includes: a logic block connected between a source of a first voltage and a source of a second voltage, and a sampler including a plurality of sampling circuits. Each sampling circuit is for sampling, in use, an input data signal and outputting a voltage to a respective output. The drive circuit further includes a voltage booster having a plurality of voltage boost circuits, each voltage boost circuit being associated with a respective one of the sampling circuits and, in use, generating a boosted voltage signal and providing the boosted voltage signal to the respective sampling circuit. Each voltage boost circuit is connected between the source of the first voltage and the source of the second voltage. The logic block may be, but is not limited to, a shift register.
    Type: Grant
    Filed: January 29, 2007
    Date of Patent: January 15, 2013
    Assignee: Sharp Kabushiki Kaisha
    Inventors: Gareth John, Patrick Zebedee, Michael James Brownlow, Tim Gasser, Jeremy Lock, Graham Andrew Cairns, Jaganath Rajendra, Harry Garth Walton
  • Publication number: 20090002357
    Abstract: In one embodiment of the present invention, a drive circuit includes: a logic block connected between a source of a first voltage and a source of a second voltage, and a sampler including a plurality of sampling circuits. Each sampling circuit is for sampling, in use, an input data signal and outputting a voltage to a respective output. The drive circuit further includes a voltage booster having a plurality of voltage boost circuits, each voltage boost circuit being associated with a respective one of the sampling circuits and, in use, generating a boosted voltage signal and providing the boosted voltage signal to the respective sampling circuit. Each voltage boost circuit is connected between the source of the first voltage and the source of the second voltage. The logic block may be, but is not limited to, a shift register.
    Type: Application
    Filed: January 29, 2007
    Publication date: January 1, 2009
    Inventors: Gareth John, Patrick Zebedee, Michael James Brownlow, Tim Gasser, Jeremy Lock, Graham Andrew Cairns, Jaganath Rajendra, Harry Garth Walton
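
Illustrative sketch: task routing with fallback selection (patent 12386667; publications 20250238264, 20250238265, 20250240220). The following is a minimal Python sketch of the selection pattern the abstract describes: rank candidate machine-learning models or hardware environments by task routing metrics and workload features, designate the best available candidate, and keep the runner-up as a fallback. The Candidate class, the scoring heuristic, and the metric names are assumptions made for illustration, not details from the patent.

    # Minimal sketch of the model/hardware routing idea in patent 12386667.
    # The scoring heuristic and all names here are hypothetical illustrations.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        routing_metrics: dict          # e.g. observed latency and accuracy
        available: bool = True

    def score(candidate, workload_features):
        """Hypothetical score combining task routing metrics with workload features."""
        latency = candidate.routing_metrics.get("latency_ms", 1000.0)
        accuracy = candidate.routing_metrics.get("accuracy", 0.0)
        size_penalty = workload_features.get("input_tokens", 0) / 10_000
        return accuracy - 0.001 * latency - size_penalty

    def select_with_fallback(candidates, workload_features):
        """Pick the best available candidate; the runner-up serves as the fallback."""
        ranked = sorted(candidates, key=lambda c: score(c, workload_features), reverse=True)
        designated = next((c for c in ranked if c.available), None)
        fallback = next((c for c in ranked if c.available and c is not designated), None)
        return designated, fallback

    models = [
        Candidate("small-model", {"latency_ms": 50, "accuracy": 0.80}),
        Candidate("large-model", {"latency_ms": 400, "accuracy": 0.92}),
    ]
    hardware = [
        Candidate("gpu-pool", {"latency_ms": 30, "accuracy": 1.0}),
        Candidate("cpu-pool", {"latency_ms": 200, "accuracy": 1.0}, available=False),
    ]
    task = {"input_tokens": 2_000}
    print(select_with_fallback(models, task))    # designated model plus fallback
    print(select_with_fallback(hardware, task))  # designated hardware, no fallback left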
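
Illustrative sketch: personalized retrieval-augmented generation (patent 12373506; publication 20250238470). The abstract describes embedding a query, comparing it against vectorized segments of the entity's content items, and passing the best-matching segments to a large language model as context. The sketch below follows that flow with a toy embedding; embed() and call_llm() are hypothetical placeholders, not the patented components.

    # Minimal sketch of the retrieval-augmented generation flow in patent 12373506.
    # embed() and call_llm() are hypothetical stand-ins for a real embedding model
    # and large language model; the toy embedding is for illustration only.
    import math

    def embed(text):
        """Toy embedding: a character-frequency vector (placeholder for a real model)."""
        vec = [0.0] * 26
        for ch in text.lower():
            if "a" <= ch <= "z":
                vec[ord(ch) - ord("a")] += 1.0
        return vec

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def retrieve_context(query, segments, top_k=2):
        """Compare the query embedding with vectorized segments of the entity's content."""
        query_vec = embed(query)
        ranked = sorted(segments, key=lambda s: cosine(query_vec, embed(s)), reverse=True)
        return ranked[:top_k]

    def call_llm(prompt):
        """Placeholder for the large language model call."""
        return f"(model response to: {prompt[:60]}...)"

    segments = [
        "Q3 planning notes: budget review scheduled for Friday.",
        "Recipe collection: lemon pasta and garlic bread.",
        "Travel itinerary for the Austin offsite in October.",
    ]
    query = "When is the budget review?"
    context = retrieve_context(query, segments)
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {query}"
    print(call_llm(prompt))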
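
Illustrative sketch: sampler with per-stage voltage boosting (patent 8354990; publication 20090002357). The abstract describes a shift-register logic block that enables sampling circuits, each paired with a voltage boost circuit supplying a boosted control signal. The behavioral Python sketch below models that arrangement; it is not a circuit-level simulation, and the supply levels, boost amount, and switch threshold are illustrative assumptions, not values from the patent.

    # Behavioral sketch (not a circuit simulation) of the drive circuit in patent
    # 8354990: a shift-register logic block enables one sampling circuit per clock
    # period, and a per-stage boost circuit raises the control voltage above the
    # supply so the full input data level can be passed to the output.
    VDD, VSS = 5.0, 0.0     # first and second voltage sources (assumed values)
    V_THRESHOLD = 1.0       # assumed voltage drop across an un-boosted switch
    BOOST = 3.0             # assumed boost applied above VDD when a stage is enabled

    def shift_register(n_stages, active_stage):
        """Logic block: one-hot enable pattern, one stage active per clock period."""
        return [stage == active_stage for stage in range(n_stages)]

    def boost_circuit(enabled):
        """Voltage boost circuit: drive the enabled stage's control line above VDD."""
        return VDD + BOOST if enabled else VSS

    def sampling_circuit(data_voltage, control_voltage, previous_output):
        """Sampling circuit: pass the data voltage (limited by control - threshold)
        when enabled; otherwise hold the previous output."""
        if control_voltage <= VSS:
            return previous_output
        return min(data_voltage, control_voltage - V_THRESHOLD)

    def drive(data_stream):
        """Shift the enable through the stages, sampling the shared data signal."""
        outputs = [VSS] * len(data_stream)
        for period, data_voltage in enumerate(data_stream):
            enables = shift_register(len(data_stream), active_stage=period)
            outputs = [
                sampling_circuit(data_voltage, boost_circuit(en), out)
                for en, out in zip(enables, outputs)
            ]
        return outputs

    # With boosting, a full-swing 5 V level is passed intact because the boosted
    # control voltage minus the threshold still exceeds it; with the control line
    # at VDD only, the sampled output would droop by the threshold.
    print(drive([5.0, 2.5, 0.0]))   # -> [5.0, 2.5, 0.0]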