Patents by Inventor Halley Fede

Halley Fede has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11334467
    Abstract: A computer-implemented method, system and computer program product for representing source code in vector space. The source code is parsed into an abstract syntax tree, which is then traversed to produce a sequence of tokens. Token embeddings may then be constructed for a subset of the sequence of tokens and input into an encoder artificial neural network (“encoder”) that encodes the token embeddings. A decoder artificial neural network (“decoder”) is initialized with the final internal cell state of the encoder and run for the same number of steps as the encoder. After the decoder has been run and trained to reproduce the input token embeddings, the final internal cell state of the encoder is used as the code representation vector, which may in turn be used to detect errors in the source code. (A Python sketch of this encoder-decoder approach follows this entry.)
    Type: Grant
    Filed: May 3, 2019
    Date of Patent: May 17, 2022
    Assignee: International Business Machines Corporation
    Inventors: David Wehr, Eleanor Pence, Halley Fede, Isabella Yamin, Alexander Sobran, Bo Zhang
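    Code sketch: The abstract above describes a sequence-to-sequence autoencoder over AST tokens. The snippet below is a minimal, hypothetical illustration of that idea in Python with PyTorch; it is not the patented implementation, and the names (ast_token_sequence, Seq2SeqCodeEncoder) and hyperparameters are illustrative assumptions. It parses source into an AST, linearizes it into tokens, embeds them, runs an encoder LSTM, initializes a decoder LSTM with the encoder's final state, trains the pair to reconstruct the embeddings, and exposes the final cell state as the code representation vector.
      # Minimal sketch (assumptions: PyTorch, Python's ast module, single-layer LSTMs).
      import ast
      import torch
      import torch.nn as nn

      def ast_token_sequence(source):
          """Parse source into an abstract syntax tree and traverse it into tokens."""
          return [type(node).__name__ for node in ast.walk(ast.parse(source))]

      class Seq2SeqCodeEncoder(nn.Module):
          def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
              self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
              self.project = nn.Linear(hidden_dim, embed_dim)  # reconstruct the embeddings

          def forward(self, token_ids):
              embeddings = self.embed(token_ids)               # (batch, seq_len, embed_dim)
              _, (h_n, c_n) = self.encoder(embeddings)
              # Decoder is initialized with the encoder's final internal state and
              # run for the same number of steps as the encoder.
              decoded, _ = self.decoder(embeddings, (h_n, c_n))
              reconstructed = self.project(decoded)
              code_vector = c_n.squeeze(0)                     # final cell state = code vector
              return reconstructed, embeddings, code_vector

      # Usage: train as an autoencoder; the resulting code_vector can feed a
      # downstream error-detection model.
      source = "def add(a, b):\n    return a + b"
      vocab = {tok: i for i, tok in enumerate(sorted(set(ast_token_sequence(source))))}
      token_ids = torch.tensor([[vocab[t] for t in ast_token_sequence(source)]])
      model = Seq2SeqCodeEncoder(vocab_size=len(vocab))
      optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
      reconstructed, target, code_vector = model(token_ids)
      loss = nn.functional.mse_loss(reconstructed, target.detach())
      loss.backward()
      optimizer.step()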
  • Patent number: 11238306
    Abstract: A method, system and computer program product for obtaining vector representations of code snippets that capture semantic similarity. A first and a second training set of code snippets are collected, where the snippets in the first training set implement the same function (representing semantic similarity) and the snippets in the second training set implement different functions (representing semantic dissimilarity). Vector representations of a first and a second code snippet, drawn from either the first or the second training set, are generated using a machine learning model. A loss value is generated using a loss function that is proportional to the distance between the two vectors when the snippets come from the first training set and inversely related to that distance when they come from the second. The machine learning model is trained to capture semantic similarity in the code snippets by minimizing the loss value. (A Python sketch of this contrastive training approach follows this entry.)
    Type: Grant
    Filed: September 27, 2018
    Date of Patent: February 1, 2022
    Assignee: International Business Machines Corporation
    Inventors: Bo Zhang, Alexander Sobran, David Wehr, Halley Fede, Eleanor Pence, Joseph Hughes, John H. Walczyk, III, Guilherme Ferreira
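    Code sketch: The abstract above describes training an encoder so that snippets implementing the same function map to nearby vectors and snippets implementing different functions map to distant vectors. The snippet below is a minimal, hypothetical illustration using a standard contrastive loss in Python with PyTorch; the SnippetEncoder model, the margin value, and the random stand-in data are illustrative assumptions, not the patented implementation.
      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class SnippetEncoder(nn.Module):
          """Maps a tokenized code snippet to a fixed-size vector."""
          def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

          def forward(self, token_ids):
              _, (h_n, _) = self.lstm(self.embed(token_ids))
              return h_n.squeeze(0)                            # (batch, hidden_dim)

      def contrastive_loss(v1, v2, same_function, margin=1.0):
          # Loss grows with distance for pairs from the "same function" set and
          # shrinks with distance (up to a margin) for pairs from the
          # "different function" set.
          distance = F.pairwise_distance(v1, v2)
          similar = same_function * distance.pow(2)
          dissimilar = (1 - same_function) * torch.clamp(margin - distance, min=0).pow(2)
          return (similar + dissimilar).mean()

      # Usage: label 1.0 for pairs drawn from the first (same-function) training
      # set, 0.0 for pairs from the second; minimizing the loss trains the model
      # to capture semantic similarity.
      encoder = SnippetEncoder(vocab_size=1000)
      optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
      snippet_a = torch.randint(0, 1000, (8, 50))              # stand-in token ids
      snippet_b = torch.randint(0, 1000, (8, 50))
      labels = torch.randint(0, 2, (8,)).float()
      loss = contrastive_loss(encoder(snippet_a), encoder(snippet_b), labels)
      optimizer.zero_grad()
      loss.backward()
      optimizer.step()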
  • Publication number: 20200349052
    Abstract: A computer-implemented method, system and computer program product for representing source code in vector space. The source code is parsed into an abstract syntax tree, which is then traversed to produce a sequence of tokens. Token embeddings may then be constructed for a subset of the sequence of tokens and input into an encoder artificial neural network (“encoder”) that encodes the token embeddings. A decoder artificial neural network (“decoder”) is initialized with the final internal cell state of the encoder and run for the same number of steps as the encoder. After the decoder has been run and trained to reproduce the input token embeddings, the final internal cell state of the encoder is used as the code representation vector, which may in turn be used to detect errors in the source code.
    Type: Application
    Filed: May 3, 2019
    Publication date: November 5, 2020
    Inventors: David Wehr, Eleanor Pence, Halley Fede, Isabella Yamin, Alexander Sobran, Bo Zhang
  • Publication number: 20200104631
    Abstract: A method, system and computer program product for obtaining vector representations of code snippets that capture semantic similarity. A first and a second training set of code snippets are collected, where the snippets in the first training set implement the same function (representing semantic similarity) and the snippets in the second training set implement different functions (representing semantic dissimilarity). Vector representations of a first and a second code snippet, drawn from either the first or the second training set, are generated using a machine learning model. A loss value is generated using a loss function that is proportional to the distance between the two vectors when the snippets come from the first training set and inversely related to that distance when they come from the second. The machine learning model is trained to capture semantic similarity in the code snippets by minimizing the loss value.
    Type: Application
    Filed: September 27, 2018
    Publication date: April 2, 2020
    Inventors: Bo Zhang, Alexander Sobran, David Wehr, Halley Fede, Eleanor Pence, Joseph Hughes, John H. Walczyk, III, Guilherme Ferreira