Patents by Inventor Jesse Cole CRESSWELL

Jesse Cole CRESSWELL has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230385694
    Abstract: Model training systems collaborate on model training without revealing their respective private data sets. For each private data set, a set of client weights is learned over a set of computer models that are also learned during training. Inference for a particular private data set is determined as a mixture of the computer model parameters according to the client weights. During training, at each iteration, the client weights are updated in one step based on how well sampled models represent the private data set. In another step, gradients are determined for each sampled model and may be weighted according to the client weight for that model, relatively increasing the gradient contribution of a private data set for model parameters that correspond more closely to that private data set.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Ka Ho Yenson Lau, Junfeng Wen, Yi Sui
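The mixture-style inference described in the abstract of publication 20230385694 can be sketched as a weighted combination. This is a minimal illustration only: the function name, the normalization step, and combining model outputs (rather than model parameters, as the abstract describes) are assumptions.

```python
import numpy as np

def mixture_inference(model_outputs, client_weights):
    """Combine the shared models' outputs for one client as a mixture
    according to that client's learned weights. Illustrative sketch:
    the patented method mixes model parameters; here we mix outputs
    for simplicity, and normalization is an added assumption."""
    w = np.asarray(client_weights, dtype=float)
    w = w / w.sum()  # normalize client weights into a distribution
    return sum(wi * out for wi, out in zip(w, model_outputs))
```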
  • Publication number: 20230386190
    Abstract: A computer model is trained to account for data samples in a high-dimensional space as lying on different manifolds, rather than a single manifold representing the data set, treating the data set as a whole as a union of manifolds. Different data samples that may be expected to belong to the same underlying manifold are determined by grouping the data. For generative modeling, a generative model may be trained that includes a sub-model for each group trained on that group's data samples, such that each sub-model can account for the manifold of that group. The overall generative model includes information describing the frequency with which to sample from each sub-model to correctly represent the data set as a whole in sampling. Multi-class classification models may also use the grouping to improve classification accuracy by weighting group data samples according to the estimated latent dimensionality of the group.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Anthony Lawrence Caterini, Gabriel Loaiza Ganem, Bradley Craig Anderson Brown
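The union-of-manifolds sampling step from publication 20230386190 can be illustrated as sampling from per-group sub-models in proportion to their frequencies in the data. The interface below (sub-models as callables, frequencies as raw counts) is an assumption for illustration, not the patented design.

```python
import numpy as np

def sample_union_of_manifolds(sub_models, group_frequencies, n, rng=None):
    """Draw n samples from a generative model built as a union of
    per-group sub-models: pick a sub-model in proportion to its
    group's frequency, then sample from it. Hypothetical interface;
    each sub_model is any zero-argument callable returning a sample."""
    rng = rng or np.random.default_rng(0)
    freqs = np.asarray(group_frequencies, dtype=float)
    freqs = freqs / freqs.sum()  # frequencies -> sampling probabilities
    choices = rng.choice(len(sub_models), size=n, p=freqs)
    return [sub_models[i]() for i in choices]
```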
  • Publication number: 20230385444
    Abstract: A model evaluation system evaluates the extent to which privacy-aware training processes affect the direction of training gradients for groups. A modified differential-privacy (“DP”) training process provides per-sample gradient adjustments with parameters that may be adaptively modified for different data batches. Per-sample gradients are modified with respect to a reference bound and a clipping bound. A scaling factor may be determined for each per-sample gradient based on the greater of the reference bound and the magnitude of the per-sample gradient. Per-sample gradients may then be adjusted based on the ratio of the clipping bound to the scaling factor. A relative privacy cost between groups may be determined as excess training risk based on the difference in group gradient direction relative to an unadjusted batch gradient and the adjusted batch gradient according to the privacy-aware training.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Applicant: THE TORONTO-DOMINION BANK
    Inventors: Jesse Cole Cresswell, Atiyeh Ashari Ghomi, Yaqiao Luo, Maria Esipova
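The per-sample gradient adjustment described in publication 20230385444 can be sketched directly from the abstract: the scaling factor is the greater of the reference bound and the gradient's magnitude, and each gradient is scaled by the ratio of the clipping bound to that factor. Names and shapes below are assumptions chosen for illustration.

```python
import numpy as np

def adjust_per_sample_gradients(grads, reference_bound, clipping_bound):
    """Adjust each per-sample gradient as the abstract describes:
    scale = max(reference_bound, ||g||), then g_adj = g * C / scale.
    Illustrative sketch of one reading of the abstract, not the
    full patented training process."""
    adjusted = []
    for g in grads:
        scale = max(reference_bound, np.linalg.norm(g))
        adjusted.append(g * (clipping_bound / scale))
    return adjusted
```

Note that with this rule, gradients whose magnitude falls below the reference bound are scaled by a fixed factor rather than left untouched, which is what distinguishes it from plain gradient clipping.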
  • Publication number: 20230385443
    Abstract: A model evaluation system evaluates the extent to which privacy-aware training processes affect the direction of training gradients for groups. A modified differential-privacy (“DP”) training process provides per-sample gradient adjustments with parameters that may be adaptively modified for different data batches. Per-sample gradients are modified with respect to a reference bound and a clipping bound. A scaling factor may be determined for each per-sample gradient based on the greater of the reference bound and the magnitude of the per-sample gradient. Per-sample gradients may then be adjusted based on the ratio of the clipping bound to the scaling factor. A relative privacy cost between groups may be determined as excess training risk based on the difference in group gradient direction relative to an unadjusted batch gradient and the adjusted batch gradient according to the privacy-aware training.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Atiyeh Ashari Ghomi, Yaqiao Luo, Maria Esipova
  • Publication number: 20230385693
    Abstract: Probability density modeling, such as for generative modeling, of data on a manifold of a high-dimensional space is performed with an implicitly-defined manifold, such that the set of points belonging to the manifold is the zero set of a manifold-defining function. An energy-based model is trained to learn an energy function that, evaluated on the manifold, describes a probability density for the manifold. As such, the relevant portions of the energy function are “filtered through” the defined manifold for training and in application. The combined energy function and manifold-defining function provide an “energy-based implicit manifold” that can more effectively model probability densities of a manifold in the high-dimensional space. As the manifold-defining function and the energy function are defined across the high-dimensional space, they may more effectively learn geometries and avoid distortions due to changes in dimension that occur for models that represent the manifold in a lower-dimensional space.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Applicant: THE TORONTO-DOMINION BANK
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Anthony Lawrence Caterini, Gabriel Loaiza Ganem
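The "energy-based implicit manifold" of publication 20230385693 pairs a manifold-defining function (whose zero set is the manifold) with an energy function that induces an unnormalized density exp(-E(x)) on that set. The sketch below is a toy evaluation of that pairing; the tolerance check and function names are assumptions.

```python
import numpy as np

def unnormalized_density_on_manifold(x, manifold_fn, energy_fn, tol=1e-6):
    """Evaluate the unnormalized density exp(-E(x)) only where x lies
    on the implicitly defined manifold (the zero set of manifold_fn).
    Points off the manifold get zero density. Illustrative sketch of
    the abstract's construction, not the trained model."""
    if abs(manifold_fn(x)) > tol:  # x is off the zero set
        return 0.0
    return float(np.exp(-energy_fn(x)))
```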
  • Publication number: 20230244917
    Abstract: To effectively learn a probability density from a data set in a high-dimensional space without manifold overfitting, a computer model first learns an autoencoder model that can transform data from a high-dimensional space to a low-dimensional space, and then learns a probability density model that may be effectively trained with maximum likelihood. Separating these components allows different types of models to be employed for each portion (e.g., manifold learning and density learning) and permits effective modeling of high-dimensional data sets that lie along a manifold representable with fewer dimensions, effectively learning both the density and the manifold and permitting effective data generation and density estimation.
    Type: Application
    Filed: December 16, 2022
    Publication date: August 3, 2023
    Inventors: Gabriel Loaiza Ganem, Brendan Leigh Ross, Jesse Cole Cresswell, Anthony Lawrence Caterini
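The two-step scheme of publication 20230244917 (first map data to a low-dimensional space, then fit a tractable maximum-likelihood density there) can be sketched as below. The diagonal-Gaussian density model is purely illustrative, and `encode`/`decode` stand in for a trained autoencoder; none of these choices are claimed by the abstract.

```python
import numpy as np

def two_step_density(data, encode, decode):
    """Two-step sketch: (1) manifold step, map data to latent space
    with an assumed pre-trained autoencoder; (2) density step, fit a
    diagonal Gaussian by maximum likelihood in that space. Returns
    the fitted parameters and a sampler that decodes latent draws."""
    z = np.array([encode(x) for x in data])    # manifold step
    mu, sigma = z.mean(axis=0), z.std(axis=0)  # density step (Gaussian MLE)
    def sample(rng=np.random.default_rng(0)):
        return decode(rng.normal(mu, sigma))   # sample latent, then decode
    return mu, sigma, sample
```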
  • Publication number: 20230153461
    Abstract: A model training system protects against leakage of private data in a federated learning environment by training a private model in conjunction with a proxy model. The proxy model is trained with protections for the private data and may be shared with other participants. Proxy models from other participants are used to train the private model, enabling the private model to benefit from parameters based on other models’ private data without privacy leakage. The proxy model may be trained with a differentially private algorithm that quantifies a privacy cost for the proxy model, enabling a participant to measure the potential exposure of private data and to drop out of training. Iterations may include training the proxy and private models and then mixing the proxy models with other participants. The mixing may include updating and applying a bias to account for the weights of other participants in the received proxy models.
    Type: Application
    Filed: November 15, 2022
    Publication date: May 18, 2023
    Inventors: Shivam Kalra, Jesse Cole Cresswell, Junfeng Wen, Maksims Volkovs, Hamid R. Tizhoosh
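The proxy-mixing round in publication 20230153461 can be illustrated as a participant averaging its proxy model's parameters with proxies received from other participants. Plain averaging is an assumption for illustration; the abstract's bias-correction step is deliberately omitted here.

```python
import numpy as np

def mix_proxy_models(own_proxy, received_proxies):
    """One simplified mixing round: average the local proxy's
    parameters with proxies received from other participants, so the
    private model can later learn from the mixture without any
    participant exposing private data. Sketch only; the patented
    mixing applies an additional bias correction not shown here."""
    stacked = np.stack([own_proxy] + list(received_proxies))
    return stacked.mean(axis=0)
```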
  • Publication number: 20230103753
    Abstract: The disclosed embodiments include computer-implemented processes that generate adaptive textual explanations of output using trained artificial intelligence processes. For example, an apparatus may generate an input dataset based on elements of first interaction data associated with a first temporal interval, and based on an application of a trained artificial intelligence process to the input dataset, generate output data representative of a predicted likelihood of an occurrence of an event during a second temporal interval. Further, and based on an application of a trained explainability process to the input dataset, the apparatus may generate an element of textual content that characterizes an outcome associated with the predicted likelihood of the occurrence of the event, where the element of textual content is associated with a feature value of the input dataset. The apparatus may also transmit a portion of the output data and the element of textual content to a computing system.
    Type: Application
    Filed: November 23, 2021
    Publication date: April 6, 2023
    Inventors: Yaqiao Luo, Jesse Cole Cresswell, Kin Kwan Leung, Kai Wang, Atiyeh Ashari Ghomi, Caitlin Messick, Lu Shu, Barum Rho, Maksims Volkovs, Paige Elyse Dickie
  • Publication number: 20230004694
    Abstract: A computer model represents high-dimensional data with a low-dimensional manifold in conjunction with a low-dimensional base probability density. A first transform (a manifold transform) may be used to transform the high-dimensional data to a low-dimensional manifold, and a second transform (a density transform) may be used to transform the low-dimensional manifold to a low-dimensional probability distribution. To enable the model to tractably learn the manifold transformation from the high-dimensional to the low-dimensional space, the manifold transformation includes conformal flows, which simplify the probabilistic volume transform and enable tractable learning of the transform. This may also allow the manifold transform to be jointly learned with the density transform.
    Type: Application
    Filed: May 3, 2022
    Publication date: January 5, 2023
    Inventors: Brendan Leigh Ross, Jesse Cole Cresswell
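The tractability that publication 20230004694 attributes to conformal flows can be seen in a toy case: a conformal map's Jacobian is a scalar times an orthogonal matrix, so its log-determinant reduces to d·log(scale) in d dimensions. The uniform-scaling layer below is an illustrative example of a conformal flow, not the patented architecture.

```python
import numpy as np

def conformal_scaling_flow(x, scale):
    """Toy conformal flow z = scale * x. Because the Jacobian is
    scale * I (a scalar times an orthogonal matrix), the volume
    change is tractable in closed form: log|det J| = d * log(scale).
    Illustrative sketch of why conformal flows keep the probabilistic
    volume transform simple."""
    x = np.asarray(x, dtype=float)
    z = scale * x
    log_det = x.shape[-1] * np.log(scale)  # d * log(scale)
    return z, log_det
```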
  • Publication number: 20220207606
    Abstract: The disclosed embodiments include computer-implemented apparatuses and processes that dynamically predict future occurrences of events using adaptively trained machine-learning or artificial-intelligence processes. For example, an apparatus may generate an input dataset based on first interaction data associated with a prior temporal interval, and may apply an adaptively trained, gradient-boosted, decision-tree process to the input dataset. Based on the application of the adaptively trained, gradient-boosted, decision-tree process to the input dataset, the apparatus may generate output data representative of a predicted likelihood of an occurrence of an event during a future temporal interval, which may be separated from the prior temporal interval by a corresponding buffer interval. The apparatus may also transmit a portion of the generated output data to a computing system, and the computing system may be configured to generate or modify second interaction data based on the portion of the output data.
    Type: Application
    Filed: February 20, 2021
    Publication date: June 30, 2022
    Inventors: Paige Elyse DICKIE, Jesse Cole CRESSWELL, Satya Krishna GORTI, Jianjin DONG, Mohammad RAZA, Christopher Patrick CAROTHERS, Tomi Johan POUTANEN, Maksims VOLKOVS
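The distinctive windowing in publication 20220207606, a prediction interval separated from the feature interval by a buffer interval, can be sketched as dataset construction. The dictionary fields and parameter names below are assumptions for illustration; the abstract's gradient-boosted decision-tree process itself is not reproduced here.

```python
from datetime import date, timedelta

def make_training_example(interactions, events, feature_end, buffer_days, target_days):
    """Assemble one training example as the abstract describes:
    features come from interactions up to the end of a prior interval,
    and the label records whether an event occurs in a future interval
    separated from the features by a buffer interval. Field names
    ("date") and day-based intervals are assumptions."""
    target_start = feature_end + timedelta(days=buffer_days)
    target_end = target_start + timedelta(days=target_days)
    features = [i for i in interactions if i["date"] <= feature_end]
    label = any(target_start < e["date"] <= target_end for e in events)
    return features, label
```

The buffer keeps the model from learning signals that would not yet be available at prediction time, since events immediately after the feature window are excluded from the label.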