Patents by Inventor Brendan Leigh Ross

Brendan Leigh Ross has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230385694
    Abstract: Model training systems collaborate on model training without revealing their respective private data sets. For each private data set, a set of client weights is learned over a set of computer models that are themselves learned during training. Inference for a particular private data set is computed as a mixture of the computer models' parameters according to that data set's client weights. During training, each iteration updates the client weights in one step based on how well the sampled models represent the private data set. In another step, gradients are determined for each sampled model and may be weighted according to the client weight for that model, relatively increasing the gradient contribution of a private data set for model parameters that correspond more closely to that private data set.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Ka Ho Yenson Lau, Junfeng Wen, Yi Sui
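The two-step update loop described in the abstract above can be sketched as follows. This is a minimal illustration with assumed details (two clients holding 1-D linear-regression data, a squared-error fit, a softmax temperature of 0.01, and plain gradient descent), not the patented method itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two clients with private data generated by different
# true slopes, and K = 2 shared linear models y = w_k * x trained jointly
# with per-client mixture weights over the models.
def make_client(slope, n=100):
    x = rng.uniform(-1, 1, n)
    return x, slope * x + 0.05 * rng.normal(size=n)

clients = [make_client(2.0), make_client(-1.0)]
K = 2
w = np.array([0.5, -0.5])                   # shared model parameters
pi = np.full((len(clients), K), 1.0 / K)    # client weights over models

for _ in range(200):
    # Step 1: update client weights from how well each model fits the
    # client's private data (softmax over negative squared error).
    for c, (x, y) in enumerate(clients):
        sq_err = np.array([np.mean((y - wk * x) ** 2) for wk in w])
        logits = -sq_err / 0.01             # temperature is an assumption
        pi[c] = np.exp(logits - logits.max())
        pi[c] /= pi[c].sum()
    # Step 2: gradient step on each model, with each client's contribution
    # weighted by that client's weight for the model.
    grad = np.zeros(K)
    for c, (x, y) in enumerate(clients):
        for k in range(K):
            grad[k] += pi[c, k] * np.mean(-2 * (y - w[k] * x) * x)
    w -= 0.1 * grad
```

After training, each client's weights concentrate on the model matching its own data, so each shared model specializes to the clients it represents, without any client revealing its raw data.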
  • Publication number: 20230385693
    Abstract: Probability density modeling, such as for generative modeling, of data on a manifold of a high-dimensional space is performed with an implicitly defined manifold, such that the set of points belonging to the manifold is the zero set of a manifold-defining function. An energy function is then trained that, evaluated on the manifold, describes a probability density over the manifold. As such, the relevant portions of the energy function are “filtered through” the defined manifold during training and in application. Together, the energy function and the manifold-defining function provide an “energy-based implicit manifold” that can more effectively model probability densities on a manifold in the high-dimensional space. Because the manifold-defining function and the energy function are defined across the entire high-dimensional space, they may more effectively learn geometries and avoid the distortions due to change of dimension that occur for models that represent the manifold in a lower-dimensional space.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Applicant: THE TORONTO-DOMINION BANK
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Anthony Lawrence Caterini, Gabriel Loaiza Ganem
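The zero-set construction in the abstract can be made concrete with a toy example. Here the manifold-defining function F vanishes on the unit circle, and a hand-picked energy E stands in for the learned energy network; both choices are assumptions for illustration only:

```python
import numpy as np

# Toy "energy-based implicit manifold" in 2-D: the manifold is the zero
# set of F (the unit circle), and the energy E, evaluated only on that
# zero set, defines an unnormalized density over the manifold.
def F(x):                       # zero exactly on the manifold
    return x[..., 0] ** 2 + x[..., 1] ** 2 - 1.0

def E(x):                       # energy: lowest near the point (1, 0)
    return (x[..., 0] - 1.0) ** 2 + x[..., 1] ** 2

# Points satisfying F = 0, parameterized by angle for the demonstration.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=-1)

# Density restricted to the manifold: exp(-E) normalized over its points.
density = np.exp(-E(pts))
density /= density.sum()
```

Because F and E are both defined on all of 2-D space, no lower-dimensional coordinate chart is needed; the energy is simply "filtered through" the zero set, and its maximum-density point on the circle is the low-energy point (1, 0).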
  • Publication number: 20230386190
    Abstract: A computer model is trained to account for data samples in a high-dimensional space as lying on different manifolds, rather than on a single manifold representing the entire data set, treating the data set as a whole as a union of manifolds. Data samples expected to belong to the same underlying manifold are determined by grouping the data. For generative modeling, a generative model may be trained that includes a sub-model for each group, trained on that group's data samples, such that each sub-model can account for the manifold of its group. The overall generative model includes information describing the frequency with which to sample from each sub-model so that sampling correctly represents the data set as a whole. Multi-class classification models may also use the grouping to improve classification accuracy by weighting each group's data samples according to the estimated latent dimensionality of the group.
    Type: Application
    Filed: May 26, 2023
    Publication date: November 30, 2023
    Inventors: Jesse Cole Cresswell, Brendan Leigh Ross, Anthony Lawrence Caterini, Gabriel Loaiza Ganem, Bradley Craig Anderson Brown
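The union-of-manifolds sampling scheme above can be sketched in a toy 1-D setting. The threshold-based grouping and the Gaussian sub-models here are simple stand-ins for the learned clustering and generative sub-models the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Data drawn from two well-separated components: 30% near -5, 70% near +5.
data = np.concatenate([rng.normal(-5, 0.5, 300), rng.normal(5, 0.5, 700)])

# Grouping step (a sign threshold stands in for learned clustering).
groups = [data[data < 0], data[data >= 0]]

# One sub-model per group (Gaussian fit stands in for a generative model).
submodels = [(g.mean(), g.std()) for g in groups]

# Sampling frequencies: each group's empirical share of the data set.
freqs = np.array([len(g) for g in groups], dtype=float)
freqs /= freqs.sum()

def sample(n):
    # Draw a sub-model per sample at the stored frequencies, then sample it.
    ks = rng.choice(len(submodels), size=n, p=freqs)
    return np.array([rng.normal(*submodels[k]) for k in ks])

draws = sample(1000)
```

A single global model would have to place density between the two components; the union-of-sub-models construction instead reproduces each component and its frequency in the data set.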
  • Publication number: 20230244917
    Abstract: To effectively learn a probability density from a data set in a high-dimensional space without manifold overfitting, a computer model first learns an autoencoder that transforms data from the high-dimensional space to a low-dimensional space, and then learns a probability density model that can be effectively trained with maximum likelihood. Separating these components allows different types of models to be employed for each portion (e.g., manifold learning and density learning) and permits effective modeling of high-dimensional data sets that lie along a manifold representable with fewer dimensions, thus learning both the density and the manifold and enabling effective data generation and density estimation.
    Type: Application
    Filed: December 16, 2022
    Publication date: August 3, 2023
    Inventors: Gabriel Loaiza Ganem, Brendan Leigh Ross, Jesse Cole Cresswell, Anthony Lawrence Caterini
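The two-stage procedure above can be sketched with a linear stand-in for each component: an SVD projection in place of the learned autoencoder, and a Gaussian maximum-likelihood fit in place of the learned density model. All modeling choices here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Data lie near a 1-D manifold (a line) embedded in 2-D space, the setting
# where fitting a full-dimensional density directly invites manifold
# overfitting.
z_true = rng.normal(0.0, 2.0, 500)
direction = np.array([0.6, 0.8])                    # unit vector
X = z_true[:, None] * direction + 0.01 * rng.normal(size=(500, 2))

# Stage 1 (manifold learning): project onto the top principal direction,
# a linear "autoencoder" from 2-D down to 1-D.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
encode = lambda x: (x - mean) @ Vt[0]               # high-dim -> 1-D code
z = encode(X)

# Stage 2 (density learning): maximum-likelihood Gaussian fit on the codes.
mu, sigma = z.mean(), z.std()
```

Because the density is fit in the 1-D code space rather than in the ambient 2-D space, it recovers the spread along the manifold (sigma near the true value of 2) instead of collapsing onto the near-degenerate ambient distribution.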
  • Publication number: 20230004694
    Abstract: A computer models high-dimensional data with a low-dimensional manifold in conjunction with a low-dimensional base probability density. A first transform (a manifold transform) may be used to map the high-dimensional data to a low-dimensional manifold, and a second transform (a density transform) may be used to map the low-dimensional manifold to a low-dimensional probability distribution. To enable the model to tractably learn the manifold transform from the high-dimensional to the low-dimensional space, the manifold transform includes conformal flows, which simplify the probabilistic volume transformation and enable tractable learning of the transform. This may also allow the manifold transform to be learned jointly with the density transform.
    Type: Application
    Filed: May 3, 2022
    Publication date: January 5, 2023
    Inventors: Brendan Leigh Ross, Jesse Cole Cresswell
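The tractability argument for conformal flows can be illustrated with a linear example. A uniform scaling composed with a rotation is conformal: it scales volume by the same factor everywhere, so the log-determinant Jacobian in the change-of-variables formula collapses to a closed form, d·log(s). The specific values below are arbitrary:

```python
import numpy as np

# Linear conformal map in 2-D: f(x) = s * R @ x, a uniform scaling s
# composed with a rotation R. Its Jacobian is the constant matrix s * R.
s, theta, d = 1.5, 0.3, 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
J = s * R

# Change-of-variables volume term, computed two ways:
logdet_exact = np.log(abs(np.linalg.det(J)))  # generic, O(d^3) in general
logdet_conformal = d * np.log(s)              # closed form for conformal maps
```

The two quantities agree, which is the simplification the abstract relies on: for conformal building blocks, the otherwise expensive volume term in the likelihood is cheap to evaluate, making the manifold transform tractable to train, including jointly with the density transform.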