Patents by Inventor Sean M. Colby

Sean M. Colby has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11587646
    Abstract: A variational autoencoder (VAE) has been developed to learn a continuous numerical, or latent, representation of molecular structure to expand reference libraries for small molecule identification. The VAE has been extended to include a chemical property decoder, trained as a multitask network, to shape the latent representation such that it assembles according to desired chemical properties. The approach is unique in its application to metabolomics and small molecule identification, its focus on properties obtained from experimental measurements (m/z, CCS), and its training paradigm, which involves a cascade of transfer learning iterations. First, molecular representation is learned from a large dataset of structures with m/z labels. Next, in silico property values are used to continue training. Finally, the network is further refined by training on the experimental data. (An illustrative code sketch of this architecture and training cascade follows the listing below.)
    Type: Grant
    Filed: December 3, 2019
    Date of Patent: February 21, 2023
    Assignee: Battelle Memorial Institute
    Inventors: Sean M. Colby, Ryan S. Renslow
  • Publication number: 20200176087
    Abstract: A variational autoencoder (VAE) has been developed to learn a continuous numerical, or latent, representation of molecular structure to expand reference libraries for small molecule identification. The VAE has been extended to include a chemical property decoder, trained as a multitask network, to shape the latent representation such that it assembles according to desired chemical properties. The approach is unique in its application to metabolomics and small molecule identification, its focus on properties obtained from experimental measurements (m/z, CCS), and its training paradigm, which involves a cascade of transfer learning iterations. First, molecular representation is learned from a large dataset of structures with m/z labels. Next, in silico property values are used to continue training. Finally, the network is further refined by training on the experimental data.
    Type: Application
    Filed: December 3, 2019
    Publication date: June 4, 2020
    Inventors: Sean M. Colby, Ryan S. Renslow
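
The abstracts above describe a variational autoencoder whose latent space is shaped by an auxiliary chemical property decoder, trained as a multitask network through a cascade of transfer learning stages. The sketch below is a minimal, illustrative outline of that kind of architecture and training cascade, not the patented implementation: the PropertyVAE class, layer sizes, loss weights, and the mz_labeled_loader / in_silico_loader / experimental_loader names are hypothetical and introduced here only for illustration.

```python
# Illustrative sketch (not the patented implementation): a VAE over a fixed-size
# molecular encoding, with an added chemical-property decoder trained as a
# multitask network, and a cascade of transfer-learning stages.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PropertyVAE(nn.Module):
    def __init__(self, input_dim=1024, latent_dim=128, n_properties=2):
        super().__init__()
        # Encoder: molecular encoding -> latent mean and log-variance.
        self.encoder = nn.Sequential(nn.Linear(input_dim, 512), nn.ReLU())
        self.fc_mu = nn.Linear(512, latent_dim)
        self.fc_logvar = nn.Linear(512, latent_dim)
        # Structure decoder: latent -> reconstructed molecular encoding.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(), nn.Linear(512, input_dim))
        # Property decoder: latent -> predicted properties (e.g., m/z, CCS).
        self.property_head = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_properties))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterization trick: sample z ~ N(mu, sigma^2).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), self.property_head(z), mu, logvar

def multitask_loss(x, x_hat, props, props_hat, mu, logvar, beta=1.0, gamma=1.0):
    recon = F.mse_loss(x_hat, x)                                   # reconstruction
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence
    prop = F.mse_loss(props_hat, props)                            # property regression
    return recon + beta * kl + gamma * prop

def train_stage(model, loader, epochs=1, lr=1e-3):
    """One stage of the transfer-learning cascade; later stages reuse the weights."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, props in loader:
            x_hat, props_hat, mu, logvar = model(x)
            loss = multitask_loss(x, x_hat, props, props_hat, mu, logvar)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

# Cascade of transfer learning stages: (1) large structure set with m/z labels,
# (2) in silico property values, (3) fine-tuning on experimental measurements.
# The three data loaders below are hypothetical placeholders.
# model = PropertyVAE()
# for loader in (mz_labeled_loader, in_silico_loader, experimental_loader):
#     train_stage(model, loader)
```

Because each stage reuses the weights from the previous one, the representation learned on the large m/z-labeled structure set is progressively refined toward the smaller experimental dataset, mirroring the transfer learning cascade described in the abstracts.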