Patents by Inventor Frédéric BRANCHAUD-CHARRON

Frédéric BRANCHAUD-CHARRON has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11727285
    Abstract: A method and system for managing a dataset. An artificial intelligence (AI) model is to be used on the dataset. A data mask describes the labeling status of the data items in the dataset. A loop is repeated until patience parameters are satisfied. The loop comprises receiving trusted labels provided by trusted labelers; updating the data mask; training the AI model on a labeled subset of the data items; cloning the trained AI model into a local AI model on processing nodes; creating and chunking a randomized unlabeled subset into data subsets for dispatching to the processing nodes; receiving an indication that predicted label answers have been inferred by the processing nodes using the local AI model; and computing a model uncertainty measurement from statistical analysis of the predicted label answers. The patience parameters include one or more of a threshold value on the model uncertainty measurement and information gain between different training cycles.
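    (An illustrative sketch of this labeling loop appears after the listing.)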
    Type: Grant
    Filed: January 31, 2020
    Date of Patent: August 15, 2023
    Assignee: ServiceNow Canada Inc.
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
  • Publication number: 20230111047
    Abstract: Persistent storage contains a training dataset and a test dataset, each with units of text labelled from a plurality of categories. A machine learning model has been trained with the training dataset to classify input units of text into the plurality of categories. One or more processors are configured to: read the training dataset or the test dataset; determine distributional properties of the training dataset or the test dataset; determine, using the machine learning model, saliency maps for tokens in the training dataset or the test dataset; perturb, by way of token insertion, token deletion, or token replacement, the training dataset or the test dataset into an expanded dataset; obtain, using the machine learning model, classifications into the plurality of categories for the expanded dataset; and based on the distributional properties, the saliency maps, and the classifications, identify causes of failure for the machine learning model.
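    (An illustrative sketch of this error-analysis approach appears after the listing.)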
    Type: Application
    Filed: October 13, 2021
    Publication date: April 13, 2023
    Inventors: Lindsay Devon Brin, Joseph Béchard Marinier, Uyen Diana Vu Le, Christopher John Tyler, Parmida Atighehchian, Gabrielle Gauthier-Melançon, Frédéric Branchaud-Charron, Orlando Marquez Ayala
  • Patent number: 11537886
    Abstract: A method and server for optimizing hyperparameter tuples for training production-grade artificial intelligence (AI) models. For each one of the AI models, AI model features are extracted and, for the one AI model, an initial distribution of n hyperparameter tuples is created considering the extracted AI model features therefor. A loop is repeated, until metric parameters are satisfied, comprising: evaluating latency from training the one AI model for each of the n hyperparameter tuples; evaluating model uncertainty from training the one AI model for each of the n hyperparameter tuples; for each of the n hyperparameter tuples, computing a blended quality measurement from the evaluated latency and evaluated model uncertainty; and replacing m hyperparameter tuples having the worst blended quality measurements with m newly generated hyperparameter tuples. The metric parameters include one or more of a threshold value on model uncertainty and blended quality measurement gain between successive loops.
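    (An illustrative sketch of this tuning loop appears after the listing.)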
    Type: Grant
    Filed: January 31, 2020
    Date of Patent: December 27, 2022
    Assignee: ServiceNow Canada Inc.
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
  • Publication number: 20210241153
    Abstract: A method and a server for updating a dynamic list of labeling tasks. One or more labels are received, each label associated with one labeling task; the one or more received labels are inserted into a dataset; an artificial intelligence (AI) model is trained on labeled data items from the dataset; predicted labels are obtained for a plurality of unlabeled data items from the dataset by applying the model thereon; a model-uncertainty measurement is computed by applying one or more regularization methods; relevancy values are computed for at least a subset of the predicted labels, taking into account the predicted label and the model-uncertainty measurement; the data items corresponding to the labeling tasks with the highest relevancy values are inserted into the dynamic list; and the dynamic list is reordered upon computation of the relevancy values.
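    (An illustrative sketch of this task-list update appears after the listing.)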
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
  • Publication number: 20210241135
    Abstract: A method and system for managing a dataset. An artificial intelligence (AI) model is to be used on the dataset. A data mask describes the labeling status of the data items in the dataset. A loop is repeated until patience parameters are satisfied. The loop comprises receiving trusted labels provided by trusted labelers; updating the data mask; training the AI model on a labeled subset of the data items; cloning the trained AI model into a local AI model on processing nodes; creating and chunking a randomized unlabeled subset into data subsets for dispatching to the processing nodes; receiving an indication that predicted label answers have been inferred by the processing nodes using the local AI model; and computing a model uncertainty measurement from statistical analysis of the predicted label answers. The patience parameters include one or more of a threshold value on the model uncertainty measurement and information gain between different training cycles.
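    (An illustrative sketch of this loop, whose abstract matches patent 11727285 above, appears after the listing.)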
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
  • Publication number: 20210241165
    Abstract: A method and server for optimizing hyperparameter tuples for training production-grade artificial intelligence (AI) models. For each one of the AI models, AI model features are extracted and, for the one AI model, an initial distribution of n hyperparameter tuples is created considering the extracted AI model features therefor. A loop is repeated, until metric parameters are satisfied, comprising: evaluating latency from training the one AI model for each of the n hyperparameter tuples; evaluating model uncertainty from training the one AI model for each of the n hyperparameter tuples; for each of the n hyperparameter tuples, computing a blended quality measurement from the evaluated latency and evaluated model uncertainty; and replacing m hyperparameter tuples having the worst blended quality measurements with m newly generated hyperparameter tuples. The metric parameters include one or more of a threshold value on model uncertainty and blended quality measurement gain between successive loops.
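    (An illustrative sketch of this loop, whose abstract matches patent 11537886 above, appears after the listing.)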
    Type: Application
    Filed: January 31, 2020
    Publication date: August 5, 2021
    Inventors: Frédéric Branchaud-Charron, Parmida Atighehchian, Jan Freyberg, Lorne Schell
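
The sketches below are editorial illustrations of the methods claimed above; each is a simplified, hedged approximation, not the patented implementation. The first corresponds to patent 11727285 and publication 20210241135, which share an abstract: train on the currently labeled subset, dispatch randomized unlabeled chunks to cloned models for inference, measure uncertainty from the predictions, and stop when patience parameters are met. The synthetic data, the oracle standing in for trusted labelers, the mean-entropy uncertainty measure, the query size of 10, and the simulation of processing nodes by in-process copies are all assumptions of this sketch.

```python
import numpy as np
from copy import deepcopy
from sklearn.linear_model import LogisticRegression

# Single-process sketch: "processing nodes" are simulated by deep copies of
# the trained model, and trusted labelers by a synthetic oracle.

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                    # the managed dataset
oracle = (X[:, 0] + X[:, 1] > 0).astype(int)     # stands in for trusted labelers

mask = np.zeros(len(X), dtype=bool)              # data mask: labeling status
labels = np.full(len(X), -1)
mask[:20] = True                                 # initial labeled subset
labels[:20] = oracle[:20]

UNCERTAINTY_THRESHOLD, MIN_INFO_GAIN = 0.15, 1e-3  # patience parameters
prev_uncertainty = np.inf

for cycle in range(50):
    # Train the AI model on the labeled data items.
    model = LogisticRegression().fit(X[mask], labels[mask])

    unlabeled = rng.permutation(np.flatnonzero(~mask))  # randomized unlabeled subset
    if len(unlabeled) < 10:
        break

    # Clone the trained model onto "processing nodes" and dispatch chunks.
    nodes = [deepcopy(model) for _ in range(4)]
    chunks = np.array_split(unlabeled, len(nodes))

    # Each node infers predicted label answers for its chunk.
    probs = np.vstack([node.predict_proba(X[chunk])
                       for node, chunk in zip(nodes, chunks)])

    # Model uncertainty from statistical analysis of the predictions
    # (mean predictive entropy, one possible choice).
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    uncertainty = entropy.mean()
    info_gain = prev_uncertainty - uncertainty
    prev_uncertainty = uncertainty

    if uncertainty < UNCERTAINTY_THRESHOLD or abs(info_gain) < MIN_INFO_GAIN:
        break                                    # patience parameters satisfied

    # Receive trusted labels for the most uncertain items; update the data mask.
    queried = unlabeled[np.argsort(-entropy)[:10]]
    mask[queried] = True
    labels[queried] = oracle[queried]

print(f"stopped after cycle {cycle} with mean uncertainty {uncertainty:.3f}")
```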
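
A toy sketch of the failure-analysis idea in publication 20230111047: compute distributional properties of the dataset, token-level saliency maps, and classifications of a perturbed (expanded) dataset, then flag candidate failure causes. The tiny example texts, the leave-one-token-out saliency, and the single token-deletion perturbation are illustrative assumptions; the application also covers token insertion and replacement and does not prescribe this particular model or saliency method.

```python
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny labeled text datasets (the application assumes these sit in persistent storage).
train = [("refund my order please", "billing"), ("app crashes on login", "bug"),
         ("charged twice this month", "billing"), ("screen freezes after update", "bug")]
test = [("why was I charged twice", "billing"), ("login page keeps crashing", "bug")]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit([t for t, _ in train], [c for _, c in train])

# Distributional properties of the dataset (here: just the class balance).
print("class balance:", Counter(c for _, c in train))

def saliency(text):
    """Leave-one-token-out saliency: drop in top-class probability when a token is removed."""
    tokens = text.split()
    base = model.predict_proba([text]).max()
    return {tok: base - model.predict_proba(
                [" ".join(t for j, t in enumerate(tokens) if j != i)]).max()
            for i, tok in enumerate(tokens)}

# Perturb the test set by token deletion, re-classify the expanded dataset,
# and flag high-saliency tokens whose removal flips the prediction.
for text, gold in test:
    for tok, weight in saliency(text).items():
        perturbed = " ".join(t for t in text.split() if t != tok)
        pred = model.predict([perturbed])[0]
        if pred != gold:
            print(f"removing {tok!r} (saliency {weight:+.2f}) flips {text!r} -> {pred}")
```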
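
A sketch of the hyperparameter-tuple loop in patent 11537886 and publication 20210241165, which share an abstract: score each of n tuples by a blended quality measurement of training latency and model uncertainty, replace the m worst tuples with newly generated ones, and stop when metric parameters are satisfied. The two-parameter search space, the mean-entropy uncertainty, and the equal blending weights are assumptions of this sketch, not the patent's specification.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def sample_tuple():
    # Draw one hyperparameter tuple (here just C and max_iter).
    return {"C": 10 ** rng.uniform(-3, 2), "max_iter": int(rng.integers(20, 200))}

def evaluate(hp):
    # Evaluate latency and model uncertainty from training with this tuple.
    start = time.perf_counter()
    model = LogisticRegression(C=hp["C"], max_iter=hp["max_iter"]).fit(X, y)
    latency = time.perf_counter() - start
    probs = model.predict_proba(X)
    uncertainty = float(-(probs * np.log(probs + 1e-12)).sum(axis=1).mean())
    return latency, uncertainty

n, m = 8, 3
UNCERTAINTY_THRESHOLD, MIN_GAIN = 0.05, 1e-4         # metric parameters
tuples = [sample_tuple() for _ in range(n)]
best_quality = np.inf

for iteration in range(20):
    scored = []
    for hp in tuples:
        latency, uncertainty = evaluate(hp)
        quality = 0.5 * latency + 0.5 * uncertainty  # blended quality (lower is better)
        scored.append((quality, uncertainty, hp))
    scored.sort(key=lambda s: s[0])

    gain = best_quality - scored[0][0]
    best_quality = min(best_quality, scored[0][0])
    if scored[0][1] < UNCERTAINTY_THRESHOLD or 0 <= gain < MIN_GAIN:
        break                                        # metric parameters satisfied

    # Replace the m tuples with the worst blended quality by newly generated ones.
    tuples = [hp for _, _, hp in scored[:n - m]] + [sample_tuple() for _ in range(m)]

print("best tuple:", scored[0][2], "blended quality:", round(scored[0][0], 4))
```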
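
A sketch of the dynamic labeling-task list in publication 20210241153: when labels arrive, retrain, obtain predicted labels and a model-uncertainty measurement for the unlabeled items, compute relevancy values, and reorder the list. Here the uncertainty comes from averaging fits under several L2 regularization strengths, a loose stand-in for the "one or more regularization methods" of the abstract; the relevancy choice and the list size of 20 are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
oracle = (X[:, 0] - X[:, 2] > 0).astype(int)      # stands in for human labelers

labels = np.full(len(X), -1)
labels[:15] = oracle[:15]                         # labels received so far
dynamic_list = []                                 # labeling tasks, most relevant first

def update_dynamic_list():
    labeled = labels >= 0
    # Train the AI model on the labeled data items; fitting it under several
    # regularization strengths gives a crude model-uncertainty estimate.
    models = [LogisticRegression(C=c).fit(X[labeled], labels[labeled])
              for c in (0.01, 0.1, 1.0, 10.0)]
    unlabeled = np.flatnonzero(~labeled)
    # Predicted labels for the unlabeled items and entropy of the averaged predictions.
    probs = np.mean([m.predict_proba(X[unlabeled]) for m in models], axis=0)
    uncertainty = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    # Relevancy from the predicted label distribution and the uncertainty;
    # keep the highest-relevancy tasks and reorder the dynamic list.
    relevancy = uncertainty                       # simplest choice of relevancy value
    order = np.argsort(-relevancy)[:20]
    dynamic_list[:] = [int(i) for i in unlabeled[order]]

update_dynamic_list()
print("next items to label:", dynamic_list[:5])

# Receiving a new label triggers insertion into the dataset and a reorder of the list.
item = dynamic_list[0]
labels[item] = oracle[item]
update_dynamic_list()
print("after one label, next items:", dynamic_list[:5])
```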