Patents by Inventor Brian Lawrence Hill

Brian Lawrence Hill has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250259050
Abstract: Various embodiments of the present disclosure provide methods, apparatus, systems, computing devices, computing entities, and/or the like for receiving training data comprising data records with identified presence of modalities, training a multi-modal generative model based on the training data, and imputing missing modalities of input data records using the multi-modal generative model, wherein the multi-modal generative model comprises (i) a modality-agnostic latent variable encoder and (ii) one or more modality-specific latent variable encoders configured to receive output of the modality-agnostic latent variable encoder as input.
    Type: Application
    Filed: February 8, 2024
    Publication date: August 14, 2025
    Inventors: Brian Lawrence Hill, Josue Nassar, Robert Elliott Tillman, Eran Halperin, Matthew Dowling
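The (i)/(ii) encoder arrangement described in the abstract above can be sketched roughly as follows. This is a minimal illustration, not the filing's implementation: the dimensions, the use of plain linear maps, and the `encode` helper are all assumptions made for the sketch.

```python
import random

random.seed(0)

def make_linear(in_dim, out_dim):
    """A random linear map standing in for a trained encoder layer."""
    W = [[random.gauss(0, 0.1) for _ in range(out_dim)] for _ in range(in_dim)]
    def apply(x):
        return [sum(x[i] * W[i][j] for i in range(in_dim)) for j in range(out_dim)]
    return apply

# (i) Shared, modality-agnostic encoder applied to every record first.
shared_encoder = make_linear(8, 4)

# (ii) Modality-specific encoders that take the shared latent as input.
specific_encoders = {"labs": make_linear(4, 2), "notes": make_linear(4, 2)}

def encode(record, modality):
    z_shared = shared_encoder(record)             # modality-agnostic latent
    return specific_encoders[modality](z_shared)  # modality-specific latent

record = [random.gauss(0, 1) for _ in range(8)]
print(len(encode(record, "labs")))  # 2
```

A missing modality could then, in principle, be imputed by decoding from the shared latent through the absent modality's branch.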
  • Publication number: 20250200427
Abstract: Embodiments provide processing of time-series data for improved embedding and processing, specifically using dimension attention for contrastive learning. The improved embedding enables the creation of more accurate embeddings within an embedding space, including an embedding space shared between different data types, via contrastive learning and dimension attention.
    Type: Application
    Filed: December 15, 2023
    Publication date: June 19, 2025
Inventors: Brian Lawrence Hill, Eran Halperin, Gregory D. Lyng, Kimmo M. Karkkainen
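One simple reading of "dimension attention" in the abstract above is a learned reweighting of embedding dimensions before the contrastive similarity is computed. The sketch below uses fixed scores in place of learned ones and cosine similarity as the contrastive measure; both choices are illustrative assumptions, not details from the filing.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def dim_attention(embedding, scores):
    """Reweight each embedding dimension by a softmax over per-dimension
    relevance scores (normally learned; fixed here for illustration)."""
    w = softmax(scores)
    return [wi * xi for wi, xi in zip(w, embedding)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Two time-series embeddings of different data types in a shared space.
z1 = [0.9, 0.1, 0.4]
z2 = [0.8, 0.2, 0.5]
scores = [2.0, 0.0, 1.0]  # hypothetical per-dimension relevance

sim = cosine(dim_attention(z1, scores), dim_attention(z2, scores))
```

In a contrastive objective, `sim` would be pushed up for positive pairs and down for negatives, with the dimension scores trained jointly.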
  • Publication number: 20250086501
Abstract: Embodiments of the present disclosure provide for improved data imputation and use of imputed data in processing of downstream models. Some embodiments specially train a model that performs improved data imputation utilizing a specially-configured attention mechanism. Some embodiments train a model utilizing stratified masking. Some embodiments train a particular pre-processing layer of a downstream task-specific model to adaptively learn threshold values for imputing particular data. The pre-processing layer is usable to improve the accuracy of training and/or use of a downstream task-specific model based at least in part on the imputed data.
    Type: Application
    Filed: December 15, 2023
    Publication date: March 13, 2025
Inventors: Robert Elliott Tillman, Sanjit Singh Batra, Josue Nassar, Jun Han, Eran Halperin, Brian Lawrence Hill, Vijay S. Nori
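The "stratified masking" mentioned in the abstract above can be read as masking training values separately within each stratum of columns, so every stratum contributes reconstruction targets. The column groupings, mask rate, and use of `None` as the mask token below are assumptions for the sketch.

```python
import random

random.seed(1)

def stratified_mask(records, strata, mask_rate=0.3):
    """Mask a fraction of cells within each stratum of columns, returning
    the masked copy plus (row, col, value) targets for imputation training."""
    masked = [row[:] for row in records]
    targets = []
    for cols in strata:
        cells = [(r, c) for r in range(len(records)) for c in cols]
        k = max(1, int(mask_rate * len(cells)))  # at least one per stratum
        for r, c in random.sample(cells, k):
            targets.append((r, c, masked[r][c]))
            masked[r][c] = None  # None marks a value for the model to impute
    return masked, targets

data = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
masked, targets = stratified_mask(data, strata=[[0, 1], [2]])
```

An imputation model would then be trained to predict the held-out `targets` from the unmasked context.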
  • Publication number: 20250068903
Abstract: Various embodiments of the present disclosure provide methods, apparatus, systems, computing devices, computing entities, and/or the like for generating a plurality of training embeddings based on a pre-training dataset, wherein the plurality of training embeddings comprises one or more of descriptive embeddings, sequential ordering embeddings, age/time embeddings, locale embeddings, or encounter number embeddings; generating one or more initialized weights associated with respective one or more layers of a machine learning model based on the plurality of training embeddings; generating one or more fine-tuned weights for the machine learning model by updating at least a portion of the one or more initialized weights using a fine-tuning dataset associated with a target classification; and generating, using the machine learning model, one or more prediction scores for one or more prediction encounter data elements associated with the target classification, based on one or more input temporal sequences of encounter data elements.
    Type: Application
    Filed: September 28, 2023
    Publication date: February 27, 2025
    Inventors: Robert Elliott Tillman, Brian Lawrence Hill, Vijay S. Nori, Aldo Cordova Palomera, Eran Halperin, Melikasadat Emami
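The several embedding types named in the abstract above are commonly combined additively, in the style of transformer input embeddings. That additive combination is an assumption of this sketch, as are the names and the fixed four-dimensional vectors.

```python
def combine_embeddings(descriptive, ordering, age_time, locale, encounter_no):
    """Element-wise sum of the embedding types named in the abstract
    (additive combination is an illustrative assumption)."""
    return [sum(vals) for vals in
            zip(descriptive, ordering, age_time, locale, encounter_no)]

# Toy 4-dimensional embeddings for a single encounter in a sequence.
e = combine_embeddings([0.1] * 4, [0.2] * 4, [0.0] * 4, [0.3] * 4, [0.4] * 4)
# each entry of e is approximately 1.0
```

The combined embedding would initialize the model's input layer during pre-training, with a portion of the weights later updated on the fine-tuning dataset.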
  • Publication number: 20240320546
    Abstract: Various embodiments of the present disclosure provide methods, apparatus, systems, computing devices, computing entities, and/or the like for improving machine learning model training based on receiving labeled training data objects, generating a normal prediction loss parameter, generating a global classification loss parameter, generating a composite loss parameter, and initiating the performance of one or more prediction-based operations.
    Type: Application
    Filed: March 23, 2023
    Publication date: September 26, 2024
    Inventors: Aldo Cordova Palomera, Brian Lawrence Hill, Eran Halperin, Ardavan Saeedi
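The composite loss described in the abstract above can be sketched as a weighted sum of the normal prediction loss and the global classification loss. The choice of MSE and binary cross-entropy for the two components, and the mixing weight `alpha`, are assumptions for the sketch, not terms from the filing.

```python
import math

def mse(pred, target):
    """Per-sample prediction loss (illustrative choice)."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def bce(p, y):
    """Global classification loss (illustrative choice)."""
    eps = 1e-9
    return -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))

def composite_loss(pred, target, class_prob, class_label, alpha=0.5):
    """Composite loss: alpha blends the normal prediction loss with the
    global classification loss before backpropagation."""
    normal_loss = mse(pred, target)
    global_loss = bce(class_prob, class_label)
    return alpha * normal_loss + (1 - alpha) * global_loss

loss = composite_loss([0.5], [0.0], class_prob=0.9, class_label=1)
```

Setting `alpha` to 1 or 0 recovers each component loss alone, which makes the blending easy to check.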
  • Publication number: 20240170160
Abstract: Embodiments provide for application of personalized or individualized sensor-based risk profiles for impacts of external events. An example method includes receiving sensor data from one or more sensors couplable with a subject body of a subject population comprising a plurality of subject bodies; receiving external factor data associated with the subject population; generating a population-level external event impact metric, where the population-level external event impact metric represents a predicted impact of one or more external events on a physiological or other metric of the subject population; generating a subject-level external event impact metric, where the subject-level external event impact metric represents a predicted impact of the one or more external events on the physiological or other metric associated with the subject body; and initiating the performance of one or more prediction-based actions based on the subject-level external event impact metric.
    Type: Application
    Filed: June 20, 2023
    Publication date: May 23, 2024
    Inventors: Gregory D. Lyng, Brian Lawrence Hill, James Zou, Kimmo M. Karkkainen, Kailas Vodrahalli, Eran Halperin
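The population-level and subject-level metrics in the abstract above can be related in many ways; one minimal reading is a population mean personalized by blending in the subject's own signal. The mean, the blending, and the weight are all assumptions of this sketch.

```python
def population_impact(predicted_impacts):
    """Population-level metric: mean predicted impact across subject bodies."""
    return sum(predicted_impacts) / len(predicted_impacts)

def subject_impact(subject_value, population_metric, weight=0.5):
    """Subject-level metric: blend the subject's own predicted impact with
    the population-level metric (the blending weight is an assumption)."""
    return weight * subject_value + (1 - weight) * population_metric

pop = population_impact([1.0, 3.0, 2.0])  # 2.0
s = subject_impact(4.0, pop)              # 3.0
```

A prediction-based action would then be triggered when the subject-level metric crosses some threshold.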
  • Publication number: 20240169185
    Abstract: Embodiments of the present disclosure provide for improved data processing using interconnected variational autoencoder models, which may be used for any of a myriad of purposes. Some embodiments specially train the interconnected variational autoencoder models by utilizing different training scenarios corresponding to presence and/or absence of particular data in a training data set. Particular encoder(s) and/or decoder(s) from the specially trained interconnected variational autoencoder models may then be utilized to improve accuracy of the desired data processing tasks, for example, to generate particular output data.
    Type: Application
    Filed: August 9, 2023
    Publication date: May 23, 2024
Inventors: Sanjit S. Batra, Robert E. Tillman, Brian Lawrence Hill, Eran Halperin, Josue Ramon Nassar
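The "different training scenarios corresponding to presence and/or absence of particular data" in the abstract above can be enumerated as the non-empty subsets of the available modalities, with the interconnected autoencoders trained under each subset in turn. The enumeration below is a sketch of that idea; the modality names are placeholders.

```python
from itertools import combinations

def training_scenarios(modalities):
    """Enumerate presence/absence scenarios as all non-empty subsets of the
    modalities, e.g. for two modalities: only A, only B, both present."""
    scenarios = []
    for k in range(1, len(modalities) + 1):
        scenarios.extend(combinations(modalities, k))
    return scenarios

print(training_scenarios(["A", "B"]))  # [('A',), ('B',), ('A', 'B')]
```

During training, each scenario would mask the absent modalities so the surviving encoders learn to reconstruct them.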
  • Publication number: 20240095591
Abstract: Embodiments of the disclosure provide for improved processing of data with different timescales, for example, high-frequency data and low-frequency data. Embodiments specifically improve the processing of such different-timescale data by a machine learning model. Additionally or alternatively, some embodiments improve the processing of data with different timescales by selecting an optimal variant from a plurality of possible variants of a prediction model.
    Type: Application
    Filed: May 30, 2023
    Publication date: March 21, 2024
    Inventors: Gregory D. Lyng, Eran Halperin, Brian Lawrence Hill, Kimmo M. Karkkainen, Kailas Vodrahalli
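A common first step in handling the mixed timescales described in the abstract above is aligning the high-frequency stream to the low-frequency one before both feed a model. The windowed averaging below is one simple alignment choice, an assumption of this sketch rather than the filing's method.

```python
def align_timescales(high_freq, factor):
    """Average consecutive windows of a high-frequency series so it lines
    up one-to-one with a low-frequency series sampled `factor` times slower."""
    return [sum(high_freq[i:i + factor]) / factor
            for i in range(0, len(high_freq), factor)]

hf = [1.0, 3.0, 2.0, 4.0]                    # e.g. per-minute sensor readings
lf_aligned = align_timescales(hf, factor=2)  # [2.0, 3.0]
```

After alignment, the two series have equal length and can be concatenated per time step, or used to score competing prediction-model variants on a validation set.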