Patents by Inventor Bruce CLEMENTS

Bruce CLEMENTS has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11656851
    Abstract: The syntax elements of a source code program that represent the context of a focal method are selected based on a priority order. The selected syntax elements are packed into a fixed-size context window that is used both to train a neural transformer model with attention to generate source code and, at inference time, by the trained model to generate source code. The context window contains prioritized sequences of tokens that extend beyond the target focus, providing longer visibility back into the source code program so the model can learn predictive patterns. This gives the model a file-level context of the source code program without increasing the size of the context window. (See the illustrative sketch after this entry.)
    Type: Grant
    Filed: October 22, 2021
    Date of Patent: May 23, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Colin Bruce Clement, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
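    Illustrative sketch: The priority-ordered packing described in this entry can be pictured as follows. The priority tiers, token budget, and helper names below are hypothetical, invented for illustration rather than taken from the patent.

```python
# Minimal sketch of priority-ordered packing of a fixed-size context
# window (hypothetical priorities and budget, not the patented design).

def pack_context(elements, budget=16):
    """Pack (priority, tokens) pairs into a token budget,
    most important first; a lower number means higher priority."""
    window, used = [], 0
    for priority, tokens in sorted(elements, key=lambda e: e[0]):
        if used + len(tokens) > budget:
            continue  # element no longer fits; try lower-priority ones
        window.extend(tokens)
        used += len(tokens)
    return window

# Toy example: the focal method outranks the enclosing class header,
# which outranks imports found elsewhere in the file.
elements = [
    (0, ["def", "focal", "(", "x", ")", ":"]),  # focal method
    (1, ["class", "Helper", ":"]),              # class context
    (2, ["import", "os"]),                      # file-level context
]
print(pack_context(elements))
```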
  • Publication number: 20230128008
    Abstract: A test-driven development system utilizes a neural transformer model with attention to generate method bodies for a focal method given its associated test cases, and optionally a method signature and a docstring of the focal method. The candidate method bodies are validated for syntactic correctness, tested using the given test cases, and tested with a donor class in a target system. Candidate method bodies that pass validation and testing are then ranked based on a PLUM score that analyzes the candidate method bodies against various quality and performance metrics. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: October 22, 2021
    Publication date: April 27, 2023
    Inventors: COLIN BRUCE CLEMENT, SHAO KUN DENG, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, MICHELE TUFANO
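    Illustrative sketch: The validate-test-rank loop in this entry can be sketched as below. The scoring function is a toy stand-in for the PLUM score, whose actual metrics the listing does not detail, and all helper names are hypothetical.

```python
# Minimal sketch of validating, testing, and ranking candidate method
# bodies (toy score standing in for the PLUM metric).
import ast

def syntactically_valid(body: str) -> bool:
    try:
        ast.parse(body)
        return True
    except SyntaxError:
        return False

def passes_tests(body: str, tests) -> bool:
    namespace = {}
    try:
        exec(body, namespace)  # define the candidate method
        return all(test(namespace) for test in tests)
    except Exception:
        return False

def rank(candidates, tests, score):
    good = [c for c in candidates
            if syntactically_valid(c) and passes_tests(c, tests)]
    return sorted(good, key=score, reverse=True)

tests = [lambda ns: ns["add"](2, 3) == 5]
candidates = ["def add(a, b):\n    return a + b",   # correct
              "def add(a, b):\n    return a - b"]   # fails the test
print(rank(candidates, tests, score=lambda c: -len(c)))
```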
  • Publication number: 20230128200
    Abstract: The syntax elements of a source code program that represent the context of a focal method are selected based on a priority order. The selected syntax elements are packed into a fixed-size context window that is used both to train a neural transformer model with attention to generate source code and, at inference time, by the trained model to generate source code. The context window contains prioritized sequences of tokens that extend beyond the target focus, providing longer visibility back into the source code program so the model can learn predictive patterns. This gives the model a file-level context of the source code program without increasing the size of the context window.
    Type: Application
    Filed: October 22, 2021
    Publication date: April 27, 2023
    Inventors: COLIN BRUCE CLEMENT, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, MICHELE TUFANO
  • Publication number: 20230115790
    Abstract: The present invention discloses systems and methods for supplying hot water for primary extraction in the oil sands bitumen extraction process. A direct contact process and method for producing hot water using mature fine tailings are provided by employing one of three arrangements: a double-staged submerged arrangement, with a thickener vessel containing a submerged fuel burner and a hot water vessel containing a submerged fuel burner; a flash submerged arrangement, with a flash concentrator vessel containing or adjacent to a fuel burner and a hot water vessel containing a submerged fuel burner; or a triple cascade arrangement, with a flash concentrator vessel, a thickener vessel containing a submerged fuel burner, and a hot water vessel containing a submerged fuel burner.
    Type: Application
    Filed: October 12, 2022
    Publication date: April 13, 2023
    Inventors: Quan Zhuang, Phil Geddis, Bruce Clements, Brianna Hataley, Mohammad Asiri, Ted Herage, Steven Chen, Lijun Wu
  • Patent number: 11604719
    Abstract: An automated program repair system uses a neural transformer model with attention to predict a bug-free version of a method having a source code bug identified in an associated stack trace. The neural transformer model is pre-trained with English language text and the source code of a target programming language. The pre-trained neural transformer model is then trained to create synthetic bugs in bug-free methods. The methods containing the synthetic bugs are executed with a test case to obtain a stack trace of the source code bug. Each method with its synthetic bug, the bug-free version of the method, and the associated stack trace are used to train the neural transformer model to predict repairs for buggy methods. (See the illustrative sketch after this entry.)
    Type: Grant
    Filed: March 25, 2021
    Date of Patent: March 14, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Colin Bruce Clement, Dawn Drain, Guillermo Serrato Castilla, Neelakantan Sundaresan
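    Illustrative sketch: The training-data construction in this entry, where synthetic bugs are injected and executed to capture stack traces, can be sketched as below. The string mutation is a toy stand-in for the patent's trained bug generator, and all helper names are hypothetical.

```python
# Minimal sketch of building (buggy method, stack trace, fixed method)
# training triplets (toy mutation standing in for the learned bug
# generator; helper names are hypothetical).
import traceback

def inject_bug(source: str) -> str:
    return source.replace("+", "-", 1)  # toy synthetic bug

def stack_trace_for(buggy_source: str, test) -> str:
    namespace = {}
    exec(buggy_source, namespace)
    try:
        test(namespace)
        return ""  # the test did not expose the bug
    except Exception:
        return traceback.format_exc()

def failing_test(ns):
    assert ns["add"](2, 3) == 5  # fails once the bug is injected

bug_free = "def add(a, b):\n    return a + b"
buggy = inject_bug(bug_free)
trace = stack_trace_for(buggy, failing_test)
triplet = (buggy, trace, bug_free)  # one repair-training example
print(trace.splitlines()[-1])       # -> AssertionError
```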
  • Publication number: 20230067364
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: November 6, 2022
    Publication date: March 2, 2023
    Inventors: COLIN BRUCE CLEMENT, DAWN DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY
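    Illustrative sketch: The transfer-learning workflow in this entry, downloading a pre-trained source-code model and fine-tuning it into a task-specific tool, might look roughly like the sketch below. Everything here is a toy stand-in: the checkpoint name, the "web service" download, and the one-parameter "training" are invented for illustration.

```python
# Minimal sketch of the transfer-learning workflow (toy stand-ins;
# the real system serves pre-trained transformer checkpoints).

class PretrainedModel:
    """Stands in for a source-code-domain neural transformer."""
    def __init__(self, weights):
        self.weights = dict(weights)

    def fine_tune(self, examples, lr=0.1, epochs=20):
        # Toy "training": nudge a task bias toward the labels.
        bias = self.weights.get("task_bias", 0.0)
        for _ in range(epochs):
            for _, label in examples:
                bias += lr * (label - bias)
        self.weights["task_bias"] = bias
        return self

def download_pretrained(name: str) -> PretrainedModel:
    # Stands in for fetching a checkpoint from the web service.
    return PretrainedModel({"encoder": 0.5})

# Fine-tune on a supervised software-engineering task (toy labels:
# 1.0 = buggy, 0.0 = clean).
model = download_pretrained("code-transformer-base")
tool = model.fine_tune([("if x = 1:", 1.0), ("if x == 1:", 0.0)])
print(round(tool.weights["task_bias"], 3))
```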
  • Publication number: 20230048186
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function, and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model learns to predict an ordered sequence of code elements that completes a line of source code in programming languages both seen and not seen during training. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: November 1, 2022
    Publication date: February 16, 2023
    Inventors: COLIN BRUCE CLEMENT, SHUAI LU, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, DUYU TANG
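    Illustrative sketch: The split in this entry between file-level context (global, class, and function signatures) and local context (bodies) can be pictured with Python's ast module. The feature format below is an assumption invented for illustration; the patent does not specify it. Requires Python 3.9+ for ast.unparse.

```python
# Minimal sketch of extracting file-level and local context features
# from a source file (hypothetical feature format; Python 3.9+).
import ast

source = '''
import os

class Greeter:
    def greet(self, name):
        return "hello " + name

def main():
    print(Greeter().greet("world"))
'''

tree = ast.parse(source)
file_level, local = [], []
for node in ast.walk(tree):
    if isinstance(node, (ast.Import, ast.ImportFrom)):
        file_level.append(ast.unparse(node))           # global context
    elif isinstance(node, ast.ClassDef):
        file_level.append(f"class {node.name}")        # class context
    elif isinstance(node, ast.FunctionDef):
        args = ", ".join(a.arg for a in node.args.args)
        file_level.append(f"def {node.name}({args})")  # signatures
        local.append(ast.unparse(node))                # bodies
print(file_level)
print(len(local), "local-context bodies")
```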
  • Publication number: 20230042051
    Abstract: A distillation system extracts knowledge from a large pre-trained sequence-to-sequence neural transformer model into a smaller bi-encoder. The pre-trained sequence-to-sequence neural transformer model is trained to translate data from a first domain into a second domain on a large corpus. A teacher model is generated from the pre-trained model by fine-tuning it on a smaller translation task with true translation pairs. The fine-tuned model is then used to generate augmented data values, which are used together with the true translation pairs to train the bi-encoder. The bi-encoder is then used to perform cross-domain searches. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: July 22, 2021
    Publication date: February 9, 2023
    Inventors: COLIN BRUCE CLEMENT, DAWN DRAIN, NEELAKANTAN SUNDARESAN, CHEN WU
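    Illustrative sketch: The distillation data flow in this entry, where a fine-tuned teacher labels unlabeled inputs to augment the true translation pairs before a bi-encoder is trained for cross-domain search, is sketched below. The toy "teacher" and hash-seeded "encoder" are invented stand-ins; they illustrate the mechanics, not learned behavior.

```python
# Minimal sketch of the distillation data flow (toy teacher and
# encoder; names and behavior are hypothetical).
import numpy as np

def teacher_translate(text: str) -> str:
    # Stands in for the fine-tuned seq2seq teacher labeling
    # unlabeled first-domain data.
    return "code for: " + text

true_pairs = [("sort a list", "sorted(xs)")]
unlabeled = ["reverse a string"]
augmented = [(t, teacher_translate(t)) for t in unlabeled]
train_pairs = true_pairs + augmented  # bi-encoder training set

def encode(text: str, dim: int = 16) -> np.ndarray:
    # Toy hash-seeded embedding standing in for one trained tower
    # of the bi-encoder.
    rng = np.random.default_rng(abs(hash(text)) % 2**32)
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

# Cross-domain search: pre-embed second-domain items, then retrieve
# the nearest one to a first-domain query by dot product.
corpus = [code for _, code in train_pairs]
corpus_vecs = np.stack([encode(c) for c in corpus])
query_vec = encode("sort a list")
print(corpus[int(np.argmax(corpus_vecs @ query_vec))])
```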
  • Publication number: 20220398462
    Abstract: A cloud platform includes several web services that facilitate the automated tuning and deployment of pre-trained deep learning models configured for software engineering tasks. The automated tuning and deployment allow a developer to fine-tune a pre-existing model without access to the parameters of either the pre-existing or the fine-tuned model, and without requiring user management input. The cloud platform provides a set of files for each pre-trained model that is used to automatically build a fine-tuning infrastructure to fine-tune the model and a deployment infrastructure that deploys the fine-tuned model without requiring user input. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: June 14, 2021
    Publication date: December 15, 2022
    Inventors: COLIN BRUCE CLEMENT, SHAO KUN DENG, DAWN DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, YIDING TIAN, MICHELE TUFANO, PAUL AN-CHIEH WANG, CHEN WU, DONGJIANG YOU
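    Illustrative sketch: A rough picture of the kind of per-model specification such a platform might consume follows. Every key, value, and service name below is hypothetical; the listing does not describe the platform's actual file schema.

```python
# Hypothetical per-model specs driving automated fine-tuning and
# deployment (invented schema for illustration only).
fine_tune_spec = {
    "base_model": "code-transformer-base",
    "task": "bug-repair",
    "training_data": "https://example.com/repairs.jsonl",  # placeholder
    "hyperparameters": {"epochs": 3, "learning_rate": 5e-5},
}
deploy_spec = {
    "endpoint": "bug-repair-v1",
    "instance_type": "gpu-small",  # placeholder tier
    "autoscaling": {"min_replicas": 1, "max_replicas": 4},
}

def submit(spec: dict, service: str) -> None:
    # Stands in for the web-service call; the described platform runs
    # these steps without exposing model parameters to the user.
    print(f"submitting {sorted(spec)} to {service}")

submit(fine_tune_spec, "fine-tuning service")
submit(deploy_spec, "deployment service")
```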
  • Patent number: 11521075
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: December 6, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11513774
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function, and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model learns to predict an ordered sequence of code elements that completes a line of source code in programming languages both seen and not seen during training.
    Type: Grant
    Filed: January 3, 2021
    Date of Patent: November 29, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Colin Bruce Clement, Shuai Lu, Neelakantan Sundaresan, Alexey Svyatkovskiy, Duyu Tang
  • Publication number: 20220308848
    Abstract: An automated system for translating source code written in one programming language into a different programming language utilizes a neural transformer with attention trained on semi-supervised data. The model is jointly pre-trained with a masked language model objective and an autoregressive objective on a large unsupervised source code corpus to learn to comprehend the syntactic structure and semantics of source code. The pre-trained model is then fine-tuned with a token-type prediction objective and an autoregressive objective on supervised translation tasks and data-augmented tasks to learn to translate source code from one programming language into a different programming language. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: March 25, 2021
    Publication date: September 29, 2022
    Inventors: COLIN BRUCE CLEMENT, DAWN DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, CHEN WU
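    Illustrative sketch: The joint pre-training objective in this entry, a masked-language-model loss combined with an autoregressive loss, can be written out on toy logits as below. The equal mixing weights and the tiny vocabulary are assumptions for illustration.

```python
# Minimal sketch of combining MLM and autoregressive objectives
# (toy logits; the 0.5/0.5 mixing weights are hypothetical).
import numpy as np

def cross_entropy(logits: np.ndarray, target: int) -> float:
    logits = logits - logits.max()  # for numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return float(-log_probs[target])

rng = np.random.default_rng(0)
seq_len, vocab = 5, 10
logits = rng.normal(size=(seq_len, vocab))   # one row per position
targets = rng.integers(0, vocab, size=seq_len)

masked = [1, 3]  # positions hidden under the MLM objective
loss_mlm = np.mean([cross_entropy(logits[i], targets[i]) for i in masked])
loss_ar = np.mean([cross_entropy(logits[i], targets[i])
                   for i in range(seq_len)])  # predict every next token

joint_loss = 0.5 * loss_mlm + 0.5 * loss_ar
print(round(joint_loss, 3))
```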
  • Publication number: 20220253712
    Abstract: An example generator tool generates an example illustrating correct usage of a command of a command line interface. A command may include a command name, zero or more subcommands, and one or more parameters, each with a corresponding parameter value. A template containing the correct syntax of the command is obtained from a template database. Parameter values for the template are generated by a neural transformer with attention given the command template. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: April 19, 2021
    Publication date: August 11, 2022
    Inventors: COLIN BRUCE CLEMENT, ROSHANAK ZILOUCHIAN MOGHADDAM, NEELAKANTAN SUNDARESAN
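    Illustrative sketch: The template-plus-generated-values pipeline in this entry might look like the sketch below, using an Azure-CLI-flavored command purely as an example. The template database, its format, and the hard-coded "predicted" values are hypothetical stand-ins for the patent's template store and neural transformer.

```python
# Minimal sketch of generating a usage example from a command
# template (hypothetical template format and toy value generator).
TEMPLATES = {
    "storage account create":
        "az storage account create --name {name} --resource-group {rg}",
}

def generate_values(template: str) -> dict:
    # Stands in for the neural transformer predicting plausible
    # parameter values for the template's placeholders.
    return {"name": "mystorageacct", "rg": "my-resource-group"}

def make_example(command: str) -> str:
    template = TEMPLATES[command]  # correct syntax from the template DB
    return template.format(**generate_values(template))

print(make_example("storage account create"))
```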
  • Publication number: 20220244952
    Abstract: A source code generation system uses a neural transformer model with attention to predict candidate method bodies given a method docstring, method signature, and one or more method templates. The method templates are derived from intent-snippet pairs mined from StackOverflow question/answer pairs or from template methods on GitHub. Joint embeddings are generated for the method bodies of the method templates and their associated method docstrings for quick retrieval. A code completion system uses the source code generation system to generate candidate method bodies to complete a method signature and/or method docstring using the method templates. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: April 1, 2021
    Publication date: August 4, 2022
    Inventors: MIKHAIL BRESLAV, COLIN BRUCE CLEMENT, DAWN DRAIN, CHANGRAN HU, NEELAKANTAN SUNDARESAN, CHEN WU
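    Illustrative sketch: The retrieval step in this entry, finding method templates whose embedded docstrings are close to the query, is sketched below with word overlap standing in for joint-embedding similarity. The index contents and helper names are invented for illustration.

```python
# Minimal sketch of retrieval-augmented method-body generation (word
# overlap stands in for joint-embedding similarity search).
TEMPLATE_INDEX = {
    "parse a json file":
        "def load(path):\n    import json\n"
        "    with open(path) as f:\n        return json.load(f)",
    "read the lines of a file":
        "def read_lines(path):\n"
        "    with open(path) as f:\n        return f.read().splitlines()",
}

def retrieve_template(docstring: str) -> str:
    # Toy nearest neighbor: most shared words with the query.
    def overlap(key: str) -> int:
        return len(set(key.split()) & set(docstring.split()))
    return TEMPLATE_INDEX[max(TEMPLATE_INDEX, key=overlap)]

def generate_body(signature: str, docstring: str) -> str:
    template = retrieve_template(docstring)
    # The described system conditions a neural transformer on the
    # signature, docstring, and retrieved template; this sketch just
    # returns the retrieved template as the candidate body.
    return template

print(generate_body("def load(path):", "parse a json file"))
```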
  • Publication number: 20220245056
    Abstract: An automated program repair system uses a neural transformer model with attention to predict a bug-free version of a method having a source code bug identified in an associated stack trace. The neural transformer model is pre-trained with English language text and the source code of a target programming language. The pre-trained neural transformer model is then trained to create synthetic bugs in bug-free methods. The methods containing the synthetic bugs are executed with a test case to obtain a stack trace of the source code bug. Each method with its synthetic bug, the bug-free version of the method, and the associated stack trace are used to train the neural transformer model to predict repairs for buggy methods.
    Type: Application
    Filed: March 25, 2021
    Publication date: August 4, 2022
    Inventors: COLIN BRUCE CLEMENT, DAWN DRAIN, GUILLERMO SERRATO CASTILLA, NEELAKANTAN SUNDARESAN
  • Publication number: 20220214863
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function, and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model learns to predict an ordered sequence of code elements that completes a line of source code in programming languages both seen and not seen during training.
    Type: Application
    Filed: January 3, 2021
    Publication date: July 7, 2022
    Inventors: COLIN BRUCE CLEMENT, SHUAI LU, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY, DUYU TANG
  • Patent number: 11208706
    Abstract: The present invention discloses a system and method for using a pressurized oxy-fired configuration to conduct metal reduction. The invention discloses a process for production of metal from metal oxide ore through reduction, comprising: (a) feeding a mixture of metal oxide ore, fuel, and a supply of oxygen into the inlet of a metallization reactor; (b) heating the mixture of metal oxide ore, oxygen, and fuel in a primary reduction zone of the metallization reactor at a pressure exceeding ambient pressure to produce a product mixture; and (c) separating the product mixture in a gas separation unit at the bottom or downstream of the metallization reactor.
    Type: Grant
    Filed: April 26, 2017
    Date of Patent: December 28, 2021
    Assignee: HER MAJESTY THE QUEEN IN RIGHT OF CANADA AS REPRESENTED BY THE MINISTER OF NATURAL RESOURCES
    Inventors: Bruce Clements, Mohammad Sameer Asiri, Marc Alexander Duchesne, Robin William Hughes
  • Publication number: 20210357762
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task.
    Type: Application
    Filed: June 30, 2020
    Publication date: November 18, 2021
    Inventors: COLIN BRUCE CLEMENT, JAMES DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY
  • Publication number: 20210357210
    Abstract: A code completion tool uses a neural transformer model with attention to generate code documentation for a method in a particular code documentation style. The neural transformer model is trained with source code programs and natural language text. The neural transformer model is pre-trained to learn the meaning of a method name and its corresponding method parameters and types from a large unsupervised dataset of source code methods. The neural transformer model is then fine-tuned on translation tasks where the model learns to translate a method signature/method body into a docstring in a particular code documentation style. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: June 10, 2020
    Publication date: November 18, 2021
    Inventors: COLIN BRUCE CLEMENT, JAMES DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY
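    Illustrative sketch: Emitting a docstring in a chosen documentation style from a parsed method signature, as this entry describes, can be pictured as below. The rule-based generator is a toy stand-in for the fine-tuned neural transformer, and the "google" style formatting and placeholder text are assumptions.

```python
# Minimal sketch of signature-to-docstring generation (rule-based
# stand-in for the neural transformer; style handling is hypothetical).
import ast

def docstring_for(source: str, style: str = "google") -> str:
    fn = ast.parse(source).body[0]
    params = [a.arg for a in fn.args.args]
    if style != "google":
        raise ValueError(f"unsupported style: {style}")
    lines = [f"{fn.name.replace('_', ' ').capitalize()}.", "", "Args:"]
    lines += [f"    {p}: Description of {p}." for p in params]
    return "\n".join(lines)

print(docstring_for("def send_request(url, timeout):\n    ..."))
```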
  • Publication number: 20210357187
    Abstract: A code completion tool uses a neural transformer model with attention to generate candidate sequences to complete a method body of a method signature. The neural transformer model is trained with source code programs and natural language text. The neural transformer model learns the meaning of a method name and its corresponding method parameters and types from a large unsupervised dataset of source code methods and a supervised dataset of tasks that combine source code constructs with natural language docstrings, in order to infer a candidate sequence of subtokens that represents a method body for a particular method signature. (See the illustrative sketch after this entry.)
    Type: Application
    Filed: June 10, 2020
    Publication date: November 18, 2021
    Inventors: COLIN BRUCE CLEMENT, JAMES DRAIN, NEELAKANTAN SUNDARESAN, ALEXEY SVYATKOVSKIY
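    Illustrative sketch: The "candidate sequence of subtokens" in this entry can be pictured as a decoding loop over a next-subtoken model. The bigram table below is a toy stand-in for the neural transformer, hard-wired so the loop visibly reconstructs a one-line body; nothing here is taken from the patent.

```python
# Minimal sketch of decoding a method body as an ordered subtoken
# sequence (toy next-subtoken table standing in for the model).
NEXT = {
    "<body>": ["return"],
    "return": ["w"],
    "w": ["*"],
    "*": ["h"],
    "h": ["<end>"],
}

def greedy_decode(start: str = "<body>", max_len: int = 10) -> str:
    out, tok = [], start
    for _ in range(max_len):
        tok = NEXT[tok][0]  # take the most likely next subtoken
        if tok == "<end>":
            break
        out.append(tok)
    return " ".join(out)

signature = "def area(w, h):"
print(signature + "\n    " + greedy_decode())
# -> def area(w, h):
#        return w * h
```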