Patents by Inventor James Drain

James Drain has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220066747
    Abstract: A unit test generation system employs a neural transformer model with attention to generate candidate unit test sequences given a focal method of a programming language. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with mapped test case pairs. A mapped test case pair includes a focal method and a unit test case for the focal method. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language, and the relationships between the code elements of the programming language and the syntax of a unit test case. (An illustrative sketch of a mapped test case pair appears at the end of this entry.)
    Type: Application
    Filed: October 27, 2020
    Publication date: March 3, 2022
    Inventors: James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
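    Illustrative sketch: The mapped test case pair described in this abstract can be pictured as a simple record pairing a focal method with a unit test that exercises it. The Python below is a minimal, hypothetical illustration; the class name, field names, and Java snippets are not taken from the patent.

      from dataclasses import dataclass

      @dataclass
      class MappedTestCasePair:
          """One fine-tuning example: a focal method and a unit test that exercises it."""
          focal_method: str  # source text of the method under test
          unit_test: str     # source text of the corresponding unit test case

      # A toy pair; the model is fine-tuned on many such pairs, learning to emit
      # the unit test when given the focal method as input.
      pair = MappedTestCasePair(
          focal_method="public int add(int a, int b) {\n    return a + b;\n}",
          unit_test=(
              "@Test\n"
              "public void testAdd() {\n"
              "    assertEquals(5, calculator.add(2, 3));\n"
              "}"
          ),
      )
      print(pair.focal_method)
      print(pair.unit_test)
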
  • Publication number: 20220066914
    Abstract: An assert statement generator employs a neural transformer model with attention to generate candidate assert statements for a unit test method that tests a focal method. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with test-assert triplets. A test-assert triplet includes a source code snippet that includes: (1) a unit test method with an assert placeholder; (2) the focal method; and (3) a corresponding assert statement. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language, and the relationships between the code elements of the programming language and the syntax of an assert statement. (An illustrative sketch of a test-assert triplet appears at the end of this entry.)
    Type: Application
    Filed: October 27, 2020
    Publication date: March 3, 2022
    Inventors: James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
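    Illustrative sketch: The test-assert triplet described in this abstract bundles a unit test containing an assert placeholder, the focal method it tests, and the assert statement the model should predict. The Python below is a minimal, hypothetical illustration; the class name, field names, placeholder token, and Java snippets are not taken from the patent.

      from dataclasses import dataclass

      @dataclass
      class TestAssertTriplet:
          """One fine-tuning example for assert statement generation."""
          test_with_placeholder: str  # unit test method containing an assert placeholder
          focal_method: str           # method under test
          assert_statement: str       # target assert statement to be predicted

      triplet = TestAssertTriplet(
          test_with_placeholder=(
              "@Test\n"
              "public void testIsEmpty() {\n"
              "    Stack<Integer> s = new Stack<>();\n"
              "    <ASSERT_PLACEHOLDER>\n"
              "}"
          ),
          focal_method="public boolean isEmpty() {\n    return size() == 0;\n}",
          assert_statement="assertTrue(s.isEmpty());",
      )
      # At inference time the model sees the first two fields and generates
      # candidate assert statements to replace the placeholder.
      print(triplet.assert_statement)
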
  • Publication number: 20210357187
    Abstract: A code completion tool uses a neural transformer model with attention to generate candidate sequences to complete a method body for a method signature. The neural transformer model is trained with source code programs and natural language text. The neural transformer model learns the meaning of a method name and its corresponding method parameters and types from a large unsupervised dataset of source code methods and a supervised dataset of tasks that combine source code constructs with natural language docstrings, in order to infer a candidate sequence of subtokens that represents a method body for a particular method signature. (An illustrative sketch of the input and a candidate body appears at the end of this entry.)
    Type: Application
    Filed: June 10, 2020
    Publication date: November 18, 2021
    Inventors: Colin Bruce Clement, James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
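    Illustrative sketch: The completion task described in this abstract takes a method signature (optionally with a docstring) as input and produces candidate method bodies. The Python below only illustrates that input/output shape; the helper name and the example signature and body are hypothetical and not taken from the patent.

      def build_completion_prompt(signature: str, docstring: str | None = None) -> str:
          """Assemble the model input: a method signature, optionally with its docstring."""
          prompt = signature
          if docstring:
              prompt += f'\n    """{docstring}"""'
          return prompt

      prompt = build_completion_prompt(
          "def binary_search(items, target):",
          "Return the index of target in sorted items, or -1 if absent.",
      )

      # One plausible candidate body a completion tool might rank and surface.
      candidate_body = (
          "    lo, hi = 0, len(items) - 1\n"
          "    while lo <= hi:\n"
          "        mid = (lo + hi) // 2\n"
          "        if items[mid] == target:\n"
          "            return mid\n"
          "        if items[mid] < target:\n"
          "            lo = mid + 1\n"
          "        else:\n"
          "            hi = mid - 1\n"
          "    return -1\n"
      )
      print(prompt)
      print(candidate_body)
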
  • Publication number: 20210357762
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task. (A schematic sketch of this pre-train/fine-tune workflow appears at the end of this entry.)
    Type: Application
    Filed: June 30, 2020
    Publication date: November 18, 2021
    Inventors: Colin Bruce Clement, James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
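    Illustrative sketch: The workflow described in this abstract is to pre-train once on unlabeled source code and natural language text, publish the model through a web service, and then fine-tune a copy on a labeled dataset for a specific software engineering task. The Python below is a framework-agnostic stub of that workflow; every name in it is hypothetical, and no real service or library API is implied.

      from dataclasses import dataclass, field

      @dataclass
      class PretrainedCodeModel:
          """Stand-in for a source-code-domain neural transformer with attention."""
          name: str
          weights: dict = field(default_factory=dict)

      def fetch_pretrained(name: str) -> PretrainedCodeModel:
          """Stand-in for downloading a pre-trained model from the web service."""
          return PretrainedCodeModel(name=name)

      def fine_tune(model: PretrainedCodeModel, examples: list) -> PretrainedCodeModel:
          """Stand-in for supervised fine-tuning on a task-specific dataset."""
          model.weights["task_examples_seen"] = len(examples)  # placeholder update only
          return model

      # Example: adapt a pre-trained code model into a (hypothetical) bug-fix tool
      # by fine-tuning it on pairs of buggy and fixed snippets.
      tool = fine_tune(fetch_pretrained("code-transformer-base"),
                       [("buggy snippet", "fixed snippet")])
      print(tool.name, tool.weights)
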
  • Publication number: 20210357210
    Abstract: A code completion tool uses a neural transformer model with attention to generate code documentation for a method in a particular code documentation style. The neural transformer model is trained with source code programs and natural language text. The neural transformer model is pre-trained to learn the meaning of a method name and its corresponding method parameters and types from a large unsupervised dataset of source code methods. The neural transformer model is then fine-tuned on translation tasks where the model learns to translate a method signature/method body into a docstring of a particular code documentation style. (An illustrative sketch of the input and a candidate docstring appears at the end of this entry.)
    Type: Application
    Filed: June 10, 2020
    Publication date: November 18, 2021
    Inventors: Colin Bruce Clement, James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
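    Illustrative sketch: The documentation task described in this abstract translates a method signature and body into a docstring in a chosen documentation style. The Python below only shows what such an input and a Google-style candidate output could look like; the example method and docstring are hypothetical, and the generation itself would be performed by the fine-tuned model, not by this snippet.

      # Input to the documentation tool: a method signature and body.
      method_source = (
          "def moving_average(values, window):\n"
          "    return [sum(values[i:i + window]) / window\n"
          "            for i in range(len(values) - window + 1)]\n"
      )

      # A candidate docstring in Google style, one of several documentation
      # styles a model could be fine-tuned to target.
      candidate_docstring = (
          '    """Compute the simple moving average of a sequence.\n'
          "\n"
          "    Args:\n"
          "        values: Input numbers.\n"
          "        window: Size of the averaging window.\n"
          "\n"
          "    Returns:\n"
          "        A list of window-averaged values.\n"
          '    """\n'
      )
      print(method_source)
      print(candidate_docstring)
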
  • Publication number: 20070155287
    Abstract: A polishing machine for optical elements includes: a spindle arranged to rotationally drive an optical element; a polishing tool movable relative to the spindle; and a front face provided with a door enabling access to the spindle and to the polishing tool; wherein the polishing tool is mounted on a body which is rotationally mounted on sliding elements by way of a first axis, the sliding elements being substantially perpendicular to the front face.
    Type: Application
    Filed: December 30, 2005
    Publication date: July 5, 2007
    Inventors: James Drain, John Keller, Steven Reid, Joseph Bond, Maggy Perrier, Laurent Marcepoil, Eric Comte
  • Publication number: 20070155286
    Abstract: A polishing machine for optical elements, comprising: a spindle arranged to rotationally drive an optical element; and a polishing tool movable relative to the spindle; wherein the polishing machine further comprises a platform mounted on top of a work chamber, the work chamber comprising the spindle, and the platform holding a body on which the polishing tool is mounted.
    Type: Application
    Filed: December 30, 2005
    Publication date: July 5, 2007
    Inventors: James Drain, John Keller, Steven Reid, Joseph Bond, Maggy Perrier, Laurent Marcepoil, Eric Comte