Patents by Inventor Jan FREYBERG

Jan FREYBERG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230260652
    Abstract: Systems and methods can perform self-supervised machine learning for improved medical image analysis. As one example, self-supervised learning on ImageNet, followed by additional self-supervised learning on unlabeled medical images from the target domain of interest, followed by fine-tuning on labeled medical images from the target domain, significantly improves the accuracy of medical image classifiers such as, for example, diagnostic models. Another example aspect of the present disclosure is directed to a novel Multi-Instance Contrastive Learning (MICLe) method that uses multiple different medical images that share one or more attributes (e.g., multiple images that depict the same underlying pathology and/or the same patient) to construct more informative positive pairs for self-supervised learning.
    Type: Application
    Filed: December 10, 2021
    Publication date: August 17, 2023
    Inventors: Shekoofeh Azizi, Wen Yau Aaron Loh, Zachary William Beaver, Ting Chen, Jonathan Paul Deaton, Jan Freyberg, Alan Prasana Karthikesalingam, Simon Kornblith, Basil Mustafa, Mohammad Norouzi, Vivek Natarajan, Fiona Keleher Ryan
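
    A minimal sketch of the multi-instance positive-pair idea described in the abstract above, written in PyTorch for illustration. The function names, the grouping key (patient identifier), and the use of an NT-Xent contrastive loss are assumptions for the sketch, not details taken from the publication.

        # Pair two DIFFERENT images that share an attribute (e.g. the same patient)
        # instead of two augmentations of the same image, then score the pairs with
        # a standard normalized-temperature contrastive loss. Illustrative only.
        import random
        from collections import defaultdict

        import torch
        import torch.nn.functional as F

        def build_multi_instance_pairs(images, patient_ids):
            """Group images by patient and sample one positive pair per patient."""
            by_patient = defaultdict(list)
            for img, pid in zip(images, patient_ids):
                by_patient[pid].append(img)

            anchors, positives = [], []
            for imgs in by_patient.values():
                if len(imgs) < 2:
                    continue  # single-image patients would fall back to ordinary augmentation pairs
                first, second = random.sample(imgs, 2)  # two distinct images of one patient
                anchors.append(first)
                positives.append(second)
            return torch.stack(anchors), torch.stack(positives)

        def nt_xent_loss(z1, z2, temperature=0.1):
            """Contrastive loss over a batch of embedding pairs (z1[i] matches z2[i])."""
            z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
            z = torch.cat([z1, z2], dim=0)                    # (2N, D)
            sim = z @ z.t() / temperature                     # pairwise cosine similarities
            n = z1.shape[0]
            sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool), float("-inf"))
            targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
            return F.cross_entropy(sim, targets)

    In use, an image encoder would map the anchor and positive batches to embeddings before calling nt_xent_loss; the surrounding pipeline (ImageNet pre-training, in-domain self-supervision, supervised fine-tuning) follows standard practice.
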
  • Patent number: 11727285
    Abstract: A method and system for managing a dataset. An artificial intelligence (AI) model is to be used on the dataset. A data mask describes the labeling status of the data items in the dataset. A loop is repeated until patience parameters are satisfied. The loop comprises receiving trusted labels provided by trusted labelers; updating the data mask; training the AI model on a subset of labeled data items; cloning the trained AI model into a local AI model on processing nodes; creating and chunking a randomized unlabeled subset into data subsets for dispatching to the processing nodes; receiving an indication that predicted label answers have been inferred by the processing nodes using the local AI model; and computing a model uncertainty measurement from statistical analysis of the predicted label answers. The patience parameters include one or more of a threshold value on the model uncertainty measurement and information gain between different training cycles.
    Type: Grant
    Filed: January 31, 2020
    Date of Patent: August 15, 2023
    Assignee: ServiceNow Canada Inc.
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
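
    A compact single-process sketch of the labeling/training loop summarized above. The distributed "processing nodes" step is simulated by one local prediction pass, the callables (request_trusted_labels, train_model, predict_probs) are hypothetical stand-ins supplied by the caller, and predictive entropy is assumed as the uncertainty measurement.

        import numpy as np

        def predictive_entropy(probs):
            """Mean predictive entropy of class-probability rows: a simple uncertainty measure."""
            return float(-(probs * np.log(probs + 1e-12)).sum(axis=1).mean())

        def manage_dataset(data, labels, label_mask, request_trusted_labels,
                           train_model, predict_probs,
                           uncertainty_threshold=0.1, max_cycles=20):
            """Repeat label -> train -> infer -> measure until a patience parameter is met."""
            model = None
            for _ in range(max_cycles):
                # 1. receive trusted labels and update the data mask
                new_ids, new_labels = request_trusted_labels(data, label_mask)
                labels[new_ids] = new_labels
                label_mask[new_ids] = True

                # 2. train the AI model on the labeled subset
                model = train_model(data[label_mask], labels[label_mask])

                # 3. randomize the unlabeled subset and infer predicted label answers
                #    (in the patent this work is chunked and dispatched to processing nodes)
                unlabeled_ids = np.flatnonzero(~label_mask)
                np.random.shuffle(unlabeled_ids)
                probs = predict_probs(model, data[unlabeled_ids])

                # 4. compute the model uncertainty measurement and test the patience parameter
                if predictive_entropy(probs) < uncertainty_threshold:
                    break
            return model, label_mask
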
  • Patent number: 11537886
    Abstract: A method and server for optimizing hyperparameter tuples for training production-grade artificial intelligence (AI) models. For each one of the AI models, AI model features are extracted and, for the one AI model, an initial distribution of n hyperparameter tuples is created considering the extracted AI model features therefor. A loop is repeated, until metric parameters are satisfied, comprising: evaluating latency from training the one AI model for each of the n hyperparameter tuples; evaluating model uncertainty from training the one AI model for each of the n hyperparameter tuples; for each of the n hyperparameter tuples, computing a blended quality measurement from the evaluated latency and evaluated model uncertainty; and replacing the m hyperparameter tuples having the worst blended quality measurements with m newly generated hyperparameter tuples. The metric parameters include one or more of a threshold value on model uncertainty and blended quality measurement gain between successive loops.
    Type: Grant
    Filed: January 31, 2020
    Date of Patent: December 27, 2022
    Assignee: SERVICENOW CANADA INC.
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
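
    An illustrative population-search sketch of the tuning loop summarized above. The evaluation callables, the linear blending of latency and uncertainty, and the stopping rule based on quality gain between loops are assumptions chosen to mirror the abstract, not details taken from the patent claims.

        def optimize_hyperparameters(sample_tuple, evaluate_latency, evaluate_uncertainty,
                                     n=20, m=5, latency_weight=0.5, min_gain=1e-3,
                                     max_rounds=50):
            """Population search over hyperparameter tuples; all callables are caller-supplied."""
            population = [sample_tuple() for _ in range(n)]   # initial distribution of n tuples
            previous_best = float("-inf")

            for _ in range(max_rounds):
                scored = []
                for hp in population:
                    latency = evaluate_latency(hp)            # e.g. seconds per training epoch
                    uncertainty = evaluate_uncertainty(hp)    # e.g. mean predictive entropy
                    # blended quality measurement: higher is better (weighting is an assumption)
                    quality = -(latency_weight * latency + (1.0 - latency_weight) * uncertainty)
                    scored.append((quality, hp))

                scored.sort(key=lambda pair: pair[0], reverse=True)
                best = scored[0][0]
                if best - previous_best < min_gain:           # metric parameter: gain between loops
                    break
                previous_best = best

                # replace the m tuples with the worst blended quality measurements
                population = [hp for _, hp in scored[:-m]] + [sample_tuple() for _ in range(m)]

            return scored[0][1]                               # best tuple found
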
  • Publication number: 20220189612
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network to perform a downstream computer vision task. One of the methods includes pre-training an initial neural network that shares layers with the neural network to perform an initial computer vision task and then training the neural network on the downstream computer vision task.
    Type: Application
    Filed: December 14, 2021
    Publication date: June 16, 2022
    Inventors: Xiaohua Zhai, Sylvain Gelly, Alexander Kolesnikov, Yin Ching Jessica Yung, Joan Puigcerver i Perez, Lucas Klaus Beyer, Neil Matthew Tinmouth Houlsby, Wen Yau Aaron Loh, Alan Prasana Karthikesalingam, Basil Mustafa, Jan Freyberg, Patricia Leigh MacWilliams, Vivek Natarajan
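
    A minimal transfer-learning sketch of the pre-train / fine-tune recipe summarized above, using torchvision's ResNet-50 as a stand-in for the shared layers; the architecture, the ImageNet weights, and the five-class downstream head are assumptions for illustration.

        import torch
        import torch.nn as nn
        from torchvision.models import resnet50, ResNet50_Weights

        def build_downstream_model(num_downstream_classes, backbone):
            """Keep the shared (pre-trained) layers and attach a fresh head for the downstream task."""
            backbone.fc = nn.Linear(backbone.fc.in_features, num_downstream_classes)
            return backbone

        # 1. pre-training on the initial computer vision task (here: load ImageNet weights)
        backbone = resnet50(weights=ResNet50_Weights.DEFAULT)

        # 2. fine-tune the shared layers plus the new head on the downstream task
        model = build_downstream_model(num_downstream_classes=5, backbone=backbone)
        optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
        loss_fn = nn.CrossEntropyLoss()
        # for images, targets in downstream_loader:
        #     optimizer.zero_grad()
        #     loss_fn(model(images), targets).backward()
        #     optimizer.step()
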
  • Publication number: 20210241165
    Abstract: A method and server for optimizing hyperparameter tuples for training production-grade artificial intelligence (AI) models. For each one of the AI models, AI model features are extracted and, for the one AI model, an initial distribution of n hyperparameter tuples is created considering the extracted AI model features therefor. A loop is repeated, until metric parameters are satisfied, comprising: evaluating latency from training the one AI model for each of the n hyperparameter tuples; evaluating model uncertainty from training the one AI model for each of the n hyperparameter tuples; for each of the n hyperparameter tuples, computing a blended quality measurement from the evaluated latency and evaluated model uncertainty; and replacing the m hyperparameter tuples having the worst blended quality measurements with m newly generated hyperparameter tuples. The metric parameters include one or more of a threshold value on model uncertainty and blended quality measurement gain between successive loops.
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric BRANCHAUD-CHARRON, Parmida ATIGHEHCHIAN, Jan FREYBERG, Lorne SCHELL
  • Publication number: 20210241153
    Abstract: A method and a server for updating a dynamic list of labeling tasks. One or more labels are received, each label associated with one labeling task; the one or more received labels are inserted into a dataset; an artificial intelligence (AI) model is trained on labeled data items from the dataset; predicted labels are obtained for a plurality of unlabeled data items from the dataset by applying the model thereon; a model-uncertainty measurement is computed by applying one or more regularization methods; relevancy values are computed for at least a subset of the predicted labels, taking into account the predicted label and the model-uncertainty measurement; the data items corresponding to the labeling tasks with the highest relevancy values are inserted into the dynamic list; and the dynamic list is reordered upon computing of the relevancy values.
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric BRANCHAUD-CHARRON, Parmida ATIGHEHCHIAN, Jan FREYBERG, Lorne SCHELL
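
    An illustrative sketch of the relevancy-driven task ordering summarized above. Monte-Carlo dropout is assumed as the regularization method behind the model-uncertainty measurement, and equating relevancy with per-item uncertainty is a simplifying assumption; the abstract only requires that relevancy account for the predicted label and the uncertainty.

        import numpy as np

        def mc_dropout_uncertainty(predict_probs_stochastic, items, n_samples=10):
            """Per-item predictive entropy averaged over stochastic (dropout-on) forward passes."""
            runs = np.stack([predict_probs_stochastic(items) for _ in range(n_samples)])  # (S, N, C)
            mean_probs = runs.mean(axis=0)                                                # (N, C)
            return -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=1)                 # (N,)

        def update_task_list(unlabeled_ids, items, predict_probs_stochastic, top_k=100):
            """Compute relevancy per unlabeled item and return the reordered labeling-task list."""
            uncertainty = mc_dropout_uncertainty(predict_probs_stochastic, items)
            relevancy = uncertainty                 # simplification: relevancy == uncertainty
            order = np.argsort(-relevancy)          # highest relevancy first
            return [unlabeled_ids[i] for i in order[:top_k]]
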
  • Publication number: 20210241135
    Abstract: A method and system for managing a dataset. An artificial intelligence (AI) model is to be used on the dataset. A data mask describes the labeling status of the data items in the dataset. A loop is repeated until patience parameters are satisfied. The loop comprises receiving trusted labels provided by trusted labelers; updating the data mask; training the AI model on a subset of labeled data items; cloning the trained AI model into a local AI model on processing nodes; creating and chunking a randomized unlabeled subset into data subsets for dispatching to the processing nodes; receiving an indication that predicted label answers have been inferred by the processing nodes using the local AI model; and computing a model uncertainty measurement from statistical analysis of the predicted label answers. The patience parameters include one or more of a threshold value on the model uncertainty measurement and information gain between different training cycles.
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric BRANCHAUD-CHARRON, Parmida ATIGHEHCHIAN, Jan FREYBERG, Lorne SCHELL