Patents by Inventor Neelakantan Sundaresan

Neelakantan Sundaresan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972232
    Abstract: A code completion tool uses a neural transformer model with attention to generate candidate sequences that complete the method body for a given method signature. The neural transformer model is trained with source code programs and natural language text. The model learns the meaning of a method name and its corresponding method parameters and types from a large unsupervised dataset of source code methods and a supervised dataset of source code constructs paired with natural language docstrings, and uses this knowledge to infer a candidate sequence of subtokens that represents a method body for a particular method signature.
    Type: Grant
    Filed: June 10, 2020
    Date of Patent: April 30, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
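A minimal, hypothetical sketch of how the method-body completion task described in the entry above can be framed: a method signature and docstring are combined into one input sequence, and a generator produces a candidate body. The toy_generate function is a stand-in for the patent's neural transformer with attention; the prompt format is an assumption, not the patented encoding.
```python
# Hypothetical framing of method-body completion: signature + docstring in,
# candidate body out. The generator below is a toy stand-in, not the
# neural transformer described in the patent.

def build_prompt(signature: str, docstring: str) -> str:
    """Combine a method signature and its docstring into one input sequence."""
    return f'{signature}\n    """{docstring}"""\n'

def toy_generate(prompt: str) -> str:
    """Stand-in for a trained decoder that emits a sequence of subtokens."""
    return "    return a + b\n"

if __name__ == "__main__":
    prompt = build_prompt("def add(a: int, b: int) -> int:", "Return the sum of a and b.")
    print(prompt + toy_generate(prompt))
```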
  • Publication number: 20240134614
    Abstract: A source code patch generation system uses the context of a buggy source code snippet of a source code program, together with a hint, to predict a source code segment that repairs the buggy source code snippet. The hint is a source code segment that is semantically similar to the buggy source code snippet, where the similarity is based on the context of the buggy snippet. An autoregressive deep learning model uses the context of the buggy source code snippet and the hint to predict the source code segment most likely to repair the buggy source code snippet.
    Type: Application
    Filed: October 14, 2022
    Publication date: April 25, 2024
    Inventors: Amandeep Singh Bakshi, Xin Shi, Neelakantan Sundaresan, Alexey Svyatkovskiy
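A minimal sketch of the hint-retrieval step suggested by the entry above: the hint is the corpus snippet most similar to the buggy snippet's context, and it is concatenated with the context and buggy code into a single prompt for an autoregressive model. The token-overlap similarity and the prompt layout are illustrative assumptions, not the patented method.
```python
# Toy hint retrieval for program repair: pick the corpus snippet most
# similar to the buggy snippet's context, then build a repair prompt.
# Jaccard token overlap stands in for a learned similarity measure.

def similarity(a: str, b: str) -> float:
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / max(1, len(ta | tb))

def retrieve_hint(context: str, corpus: list[str]) -> str:
    return max(corpus, key=lambda snippet: similarity(context, snippet))

def build_repair_prompt(context: str, buggy: str, hint: str) -> str:
    # An autoregressive model would condition on all three parts.
    return f"# context\n{context}\n# buggy\n{buggy}\n# hint\n{hint}\n# fix\n"

corpus = ["for i in range(len(xs)): total += xs[i]", "if x is None: return default"]
print(build_repair_prompt("total = 0\nfor i in range(len(xs) + 1):",
                          "total += xs[i]",
                          retrieve_hint("for i in range(len(xs) + 1):", corpus)))
```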
  • Patent number: 11962634
    Abstract: The present disclosure is directed to providing supplemental content to one or more client devices requesting multimedia content. The supplemental content may be determined or selected based on the availability of one or more device channels, such as a display, speakers, or other component of the one or more client devices capable of providing an output. The supplemental content may also be selected based on one or more characteristics of the requested multimedia content, such as a genre, subject matter, or duration. Furthermore, the supplemental content may be determined or selected based on the portions of the requested multimedia content that are the most prominent or significant, such as any audio content, any video content, and/or any textual content. The supplemental content may be provided to the one or more client devices such that it is displayed before, during, or after the display of the requested multimedia content.
    Type: Grant
    Filed: December 20, 2021
    Date of Patent: April 16, 2024
    Assignee: eBay Inc.
    Inventor: Neelakantan Sundaresan
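A small, invented illustration of the channel-aware selection the entry above describes: supplemental content is filtered by whether the client device exposes a matching output channel and by a characteristic of the requested content. The catalogue, channel names, and genre-matching rule are assumptions for illustration only.
```python
# Illustrative selection of supplemental content by available device
# channel and a characteristic (genre) of the requested multimedia content.
# All data and rules here are invented for the example.

SUPPLEMENTAL = [
    {"id": "audio-spot", "channel": "speakers", "genre": "music"},
    {"id": "banner", "channel": "display", "genre": "sports"},
    {"id": "caption-overlay", "channel": "display", "genre": "music"},
]

def select(available_channels: set[str], genre: str) -> list[str]:
    return [
        item["id"]
        for item in SUPPLEMENTAL
        if item["channel"] in available_channels and item["genre"] == genre
    ]

print(select({"display"}, "music"))   # ['caption-overlay']
```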
  • Patent number: 11954429
    Abstract: Generally discussed herein are devices, systems, and methods for generating an automatic interactive digital notebook completion model. A method can include receiving notebook content of an interactive digital notebook, the notebook content including a markdown cell followed by a code cell. The method can include generating input/output examples by, for each input/output example, masking either (i) the content of the markdown cell or (ii) the content of the code cell to produce a masked cell, identifying the masked cell together with the content of whichever of the markdown cell or the code cell is not masked as the input for the example, and identifying the content of the masked cell as the output for the example. The method can include training, based on the input/output examples, a natural language processing model that generates a prediction of the content of a second masked cell as an output.
    Type: Grant
    Filed: December 8, 2021
    Date of Patent: April 9, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Colin Bruce Clement, Shubham Chandel, Guillermo Serrato Castilla, Neelakantan Sundaresan
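A minimal sketch of the input/output example construction described in the entry above: one of the two adjacent cells is masked, the unmasked cell plus the mask marker becomes the input, and the masked content becomes the target output. The mask token is a hypothetical placeholder, not a token the patent specifies.
```python
# Build an input/output training example from a markdown cell followed by
# a code cell: mask one cell, use the other (plus the mask marker) as input
# and the masked content as the target. MASK is a made-up placeholder token.

import random

MASK = "<MASKED_CELL>"

def make_example(markdown: str, code: str, mask_code: bool) -> tuple[str, str]:
    if mask_code:
        return f"{markdown}\n{MASK}", code        # input, expected output
    return f"{MASK}\n{code}", markdown

if __name__ == "__main__":
    md = "# Load the dataset and print its shape"
    py = "df = pd.read_csv('data.csv')\nprint(df.shape)"
    x, y = make_example(md, py, mask_code=random.random() < 0.5)
    print("INPUT:\n" + x + "\n\nOUTPUT:\n" + y)
```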
  • Patent number: 11947935
    Abstract: Custom source code generation models are generated by tuning a pre-trained deep learning model: the model parameters are frozen and only a prefix is optimized. The tuning process is distributed across a user space and a model space, where the embedding and output layers are computed in the user space and the model itself executes in a model space that is isolated from the user space. The tuning process updates the embeddings of the prefix across the separate execution spaces in a manner that preserves the privacy of the data used in the tuning process.
    Type: Grant
    Filed: November 24, 2021
    Date of Patent: April 2, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Colin Bruce Clement, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano, Andrei Zlotchevski
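A hedged sketch of the frozen-model, trainable-prefix idea in the entry above, using PyTorch and a tiny stand-in model: the base model's parameters are frozen and only the prefix embeddings receive gradient updates through the attention layer. The split between a user space (embedding and output layers) and an isolated model space is not shown; the sizes and training loop are illustrative.
```python
# Prefix tuning against a frozen stand-in model: only the prefix embeddings
# are optimized. A real system would use a large pre-trained transformer
# and run the frozen part in an isolated execution space.

import torch

torch.manual_seed(0)
vocab, dim, prefix_len, seq_len = 100, 32, 8, 16

embed = torch.nn.Embedding(vocab, dim)
encoder = torch.nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
head = torch.nn.Linear(dim, vocab)
for module in (embed, encoder, head):
    for p in module.parameters():
        p.requires_grad_(False)                     # freeze the "pre-trained" model

prefix = torch.nn.Parameter(torch.randn(1, prefix_len, dim) * 0.02)  # trainable prefix
opt = torch.optim.Adam([prefix], lr=1e-3)

tokens = torch.randint(0, vocab, (1, seq_len))
targets = torch.randint(0, vocab, (seq_len,))
for _ in range(3):
    hidden = torch.cat([prefix, embed(tokens)], dim=1)   # prepend prefix embeddings
    logits = head(encoder(hidden))[0, prefix_len:]       # score the real positions
    loss = torch.nn.functional.cross_entropy(logits, targets)
    opt.zero_grad(); loss.backward(); opt.step()         # only the prefix is updated
```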
  • Publication number: 20240104001
    Abstract: A debugging tool identifies the smallest subset of an input sequence, referred to as the rationales, that influenced a neural language model to generate an output sequence. The debugging tool uses the rationales to understand why the model made its predictions and, in particular, which input tokens had the most impact on the output sequence. In the case of erroneous output, the rationales are used to alter the input sequence to avoid the error or to tailor a new training dataset to retrain the model and improve its performance.
    Type: Application
    Filed: December 15, 2022
    Publication date: March 28, 2024
    Inventors: Colin Bruce Clement, David Alberto Nader Palacio, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
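A toy illustration of greedy rationale extraction in the spirit of the entry above: input tokens are dropped one at a time, and a token is kept only if removing it changes the model's prediction. The scoring function is a rule-based stand-in for a neural language model, and greedy deletion is only one way to search for a small rationale.
```python
# Greedy search for a minimal rationale: keep only the tokens whose removal
# changes the model's output. toy_predict stands in for a neural model.

def toy_predict(tokens: list[str]) -> str:
    return "sorted_list" if "sort" in tokens and "list" in tokens else "unknown"

def rationale(tokens: list[str]) -> list[str]:
    baseline = toy_predict(tokens)
    kept = list(tokens)
    for tok in tokens:
        trial = [t for t in kept if t != tok]
        if toy_predict(trial) == baseline:   # this token was not needed
            kept = trial
    return kept

print(rationale("please sort this list of numbers".split()))   # ['sort', 'list']
```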
  • Patent number: 11941373
    Abstract: A deep learning model trained to predict source code is tuned for a target source code generation task through reinforcement learning using a reward score that considers the quality of the source code predicted during the tuning process. The reward score is adjusted to consider code-quality factors and source code metrics. The code-quality factors account for the predicted source code's syntactic correctness, successful compilation, successful execution, successful invocation, readability, functional correctness, and coverage. The source code metrics generate a score based on how close the predicted source code is to the ground-truth code.
    Type: Grant
    Filed: December 17, 2021
    Date of Patent: March 26, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
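A simplified sketch of a composite reward in the spirit of the entry above: cheap code-quality checks (does the candidate parse, does it execute) are combined with a similarity score against the ground-truth code. The specific checks, weights, and use of a text-similarity ratio are assumptions; the patent lists additional factors such as invocation, readability, functional correctness, and coverage.
```python
# Composite reward for a predicted code snippet: partial credit for
# syntactic correctness and successful execution, plus a similarity score
# against the ground truth. Weights are illustrative. Executing untrusted
# predictions is unsafe outside a sandbox; this is only a sketch.

import ast
import difflib

def reward(predicted: str, ground_truth: str) -> float:
    score = 0.0
    try:
        ast.parse(predicted)                               # syntactic correctness
        score += 0.25
        exec(compile(predicted, "<pred>", "exec"), {})     # successful execution
        score += 0.25
    except Exception:
        pass
    score += 0.5 * difflib.SequenceMatcher(None, predicted, ground_truth).ratio()
    return score

print(reward("def f(x):\n    return x + 1\n", "def f(x):\n    return x + 1\n"))  # 1.0
```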
  • Publication number: 20240095807
    Abstract: Apparatus and method for providing contextual recommendations based on user state are disclosed herein. In some embodiments, sensor data corresponding to at least one sensor included in an item worn by a user is received. A user state is determined based on the received sensor data. In response to a state change being satisfied by at least the user state, a recommendation is determined based on the user state and a profile associated with the user. The recommendation may be presented on an electronic mobile device associated with the user.
    Type: Application
    Filed: November 28, 2023
    Publication date: March 21, 2024
    Applicant: eBay Inc.
    Inventors: Anurag Bhardwaj, Neelakantan Sundaresan, Robinson Piramuthu
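A minimal, invented illustration of the flow in the entry above: sensor readings from a worn item determine a user state, and when the state changes, a recommendation is derived from the new state and the user's profile. The threshold, states, and recommendation rules are all assumptions.
```python
# Sensor reading -> user state -> recommendation on a state change.
# Threshold and recommendation logic are invented for illustration.

def user_state(heart_rate: int) -> str:
    return "active" if heart_rate > 110 else "resting"

def recommend(state: str, profile: dict) -> str:
    if state == "active" and "running" in profile.get("interests", []):
        return "Trail-running shoes you viewed are back in stock."
    return "Catch up on your saved items."

profile = {"interests": ["running"]}
previous = "resting"
for heart_rate in (72, 125, 130, 90):
    state = user_state(heart_rate)
    if state != previous:                       # the state-change condition
        print(recommend(state, profile))
    previous = state
```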
  • Publication number: 20240095253
    Abstract: A method and a system process a stream of data in parallel across a plurality of nodes. The log processing system has a log module, a query language operator module, and a query processing module. The log module receives and organizes the stream of data into a sequential and nested data structure. The query language operator module defines operators that operate on the sequential and nested data structure. The query processing module processes, in parallel across a plurality of nodes, a query based on an operator on the stream of data.
    Type: Application
    Filed: September 25, 2023
    Publication date: March 21, 2024
    Applicant: eBay Inc.
    Inventors: Gyanit Singh, Chi-Hsien Chiu, Neelakantan Sundaresan
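A sketch of the parallel query idea in the entry above: the stream is organized into a nested, per-session structure, a query operator is defined over that structure, and the query runs in parallel across partitions. Threads stand in for cluster nodes, and the event-counting operator is an invented example rather than an operator the publication defines.
```python
# Apply a query operator over nested, per-session log records in parallel
# across partitions, then merge the partial results. Threads stand in for
# the nodes of a cluster.

from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Sequential and nested structure: one record per session, events nested inside.
sessions = [
    {"session": "a", "events": ["view", "view", "buy"]},
    {"session": "b", "events": ["view"]},
    {"session": "c", "events": ["view", "buy"]},
]

def count_events(partition: list[dict]) -> Counter:
    """Query operator: count event types within one partition of sessions."""
    totals = Counter()
    for record in partition:
        totals.update(record["events"])
    return totals

partitions = [sessions[:2], sessions[2:]]        # one partition per "node"
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(count_events, partitions))

print(sum(partials, Counter()))                  # Counter({'view': 4, 'buy': 2})
```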
  • Publication number: 20240070053
    Abstract: An assert statement generator employs a neural transformer model with attention to generate candidate assert statements for a unit test method that tests a focal method. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with test-assert triplets. A test-assert triplet includes a source code snippet that includes: (1) a unit test method with an assert placeholder; (2) the focal method; and (3) a corresponding assert statement. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language, and the relationships between the code elements of the programming language and the syntax of an assert statement.
    Type: Application
    Filed: October 23, 2023
    Publication date: February 29, 2024
    Inventors: Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
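A minimal illustration of the test-assert triplet described in the entry above: a unit test with an assert placeholder, the focal method it exercises, and the target assert statement. The placeholder token and the way the parts are concatenated into a model input are assumptions.
```python
# One test-assert triplet: (unit test with an assert placeholder,
# focal method, target assert statement). The placeholder token and the
# input layout are illustrative only.

ASSERT_PLACEHOLDER = "<ASSERT>"

focal_method = "def add(a, b):\n    return a + b\n"

test_with_placeholder = (
    "def test_add():\n"
    "    result = add(2, 3)\n"
    f"    {ASSERT_PLACEHOLDER}\n"
)

target_assert = "assert result == 5"

# Fine-tuning pair: the model sees the test plus the focal method and is
# trained to produce the assert statement.
model_input = test_with_placeholder + "\n# focal method\n" + focal_method
print(model_input)
print("target:", target_assert)
```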
  • Publication number: 20240061655
    Abstract: A code generation system uses a non-terminal expansion model and a non-terminal selector model to generate a code sketch to complete a partially-formed source code snippet. The non-terminal expansion model is a neural transformer model trained on a supervised dataset through reinforcement learning to learn to predict the production rule to expand for a given non-terminal symbol. The non-terminal selector model is trained through reinforcement learning to predict the non-terminal symbol to expand given a partial-code state. The models are used in a two-step beam search to generate the top candidate code sketches, where a candidate code sketch may contain a hole that represents an unexpanded non-terminal symbol.
    Type: Application
    Filed: November 3, 2023
    Publication date: February 22, 2024
    Inventors: Miltiadis Allamanis, Daya Guo, Neelakantan Sundaresan, Alexey Svyatkovskiy
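A toy illustration of sketch generation by non-terminal expansion as in the entry above: a selector chooses which non-terminal to expand, an expansion step chooses a production rule, and any non-terminal left unexpanded remains in the output as a hole. The tiny grammar and the left-most / first-rule heuristics stand in for the two trained models and the two-step beam search.
```python
# Grammar-driven code sketch generation: expand non-terminals step by step;
# whatever stays unexpanded is a hole in the sketch. The selection and
# expansion heuristics below stand in for learned models.

GRAMMAR = {
    "<stmt>": ["if <expr>: <stmt>", "return <expr>"],
    "<expr>": ["<name> < <number>", "<name>"],
}

def select_nonterminal(sketch: str) -> str | None:
    """Stand-in selector: pick the left-most non-terminal in the sketch."""
    found = [(sketch.find(s), s) for s in GRAMMAR if s in sketch]
    return min(found)[1] if found else None

def expand(sketch: str, steps: int) -> str:
    for _ in range(steps):
        symbol = select_nonterminal(sketch)
        if symbol is None:
            break
        sketch = sketch.replace(symbol, GRAMMAR[symbol][0], 1)  # stand-in expander
    return sketch

print(expand("<stmt>", steps=2))   # "if <name> < <number>: <stmt>" -- holes remain
```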
  • Patent number: 11900261
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task.
    Type: Grant
    Filed: November 6, 2022
    Date of Patent: February 13, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11893363
    Abstract: A unit test generation system employs a neural transformer model with attention to generate candidate unit test sequences given a focal method of a programming language. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with mapped test case pairs. A mapped test case pair includes a focal method and a unit test case for the focal method. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language and the relationships between the code elements of the programming language and the syntax of a unit test case.
    Type: Grant
    Filed: October 27, 2020
    Date of Patent: February 6, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
  • Publication number: 20240028306
    Abstract: A code completion tool uses a neural transformer model to generate candidate sequences to complete a line of source code. The neural transformer model is trained using a conditional language modeling objective on a large unsupervised dataset that includes source code programs written in several different programming languages. The neural transformer model is used within a beam search that predicts the most likely candidate sequences for a code snippet under development.
    Type: Application
    Filed: August 9, 2023
    Publication date: January 25, 2024
    Inventors: Alexey Svyatkovskiy, Shengyu Fu, Neelakantan Sundaresan, Shao Kun Deng
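A simplified beam search in the spirit of the entry above: at each step the k highest-scoring partial token sequences are kept. The next-token probabilities come from a toy lookup table rather than a neural transformer, and the beam width and token vocabulary are invented.
```python
# Beam search over next-token probabilities with a toy conditional
# distribution standing in for a neural transformer's predictions.

import math

NEXT = {                              # previous token -> next-token probabilities
    "print": {"(": 0.9, ".": 0.1},
    "(": {"'hello'": 0.6, "x": 0.4},
    "'hello'": {")": 1.0},
    "x": {")": 1.0},
}

def beam_search(start: str, width: int = 2, steps: int = 3):
    beams = [([start], 0.0)]          # (token sequence, log-probability)
    for _ in range(steps):
        candidates = []
        for tokens, score in beams:
            for tok, p in NEXT.get(tokens[-1], {}).items():
                candidates.append((tokens + [tok], score + math.log(p)))
        if not candidates:
            break
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:width]
    return beams

for tokens, score in beam_search("print"):
    print("".join(tokens), round(score, 3))   # e.g. print('hello') -0.616
```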
  • Publication number: 20240028740
    Abstract: A neural classifier model is used to detect cybersecurity vulnerabilities in source code predicted by a deep learning code generation model that was trained on source code possibly containing security bugs. Upon the classifier model classifying a given source code snippet as likely containing a cybersecurity vulnerability, a proposed repair for the vulnerability is predicted by a neural decoder transformer model that was trained on non-vulnerable source code. Given the source code classified as vulnerable, the neural decoder transformer model predicts source code that repairs the cybersecurity vulnerability.
    Type: Application
    Filed: September 21, 2022
    Publication date: January 25, 2024
    Inventors: Aaron Yue-Chiu Chan, Colin Bruce Clement, Yevhen Mohylevskyy, Neelakantan Sundaresan, Roshanak Zilouchian Moghaddam
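A toy sketch of the two-stage pipeline in the entry above: a classifier flags a snippet as likely vulnerable, and only then is a generative model asked for a repair. Both stages are rule-based stand-ins here; the publication describes a neural classifier and a neural decoder transformer, and the SQL-injection example is an assumption.
```python
# Two-stage pipeline: classify a snippet as likely vulnerable, then propose
# a repair. Both functions are rule-based stand-ins for neural models.

def classify_vulnerable(snippet: str) -> bool:
    """Stand-in classifier: flag string-formatted SQL as an injection risk."""
    return "execute(" in snippet and "%" in snippet

def propose_repair(snippet: str) -> str:
    """Stand-in repair model: rewrite to a parameterized query."""
    return 'cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))'

snippet = 'cursor.execute("SELECT * FROM users WHERE id = %s" % user_id)'
if classify_vulnerable(snippet):
    print("proposed repair:", propose_repair(snippet))
```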
  • Patent number: 11869061
    Abstract: Apparatus and method for providing contextual recommendations based on user state are disclosed herein. In some embodiments, sensor data corresponding to at least one sensor included in an item worn by a user is received. A user state is determined based on the received sensor data. In response to a state change being satisfied by at least the user state, a recommendation is determined based on the user state and a profile associated with the user. The recommendation may be presented on an electronic mobile device associated with the user.
    Type: Grant
    Filed: February 16, 2021
    Date of Patent: January 9, 2024
    Assignee: eBay Inc.
    Inventors: Anurag Bhardwaj, Neelakantan Sundaresan, Robinson Piramuthu
  • Publication number: 20230409299
    Abstract: A code insertion engine predicts one or more statements of a programming language to be inserted at an insertion point in between existing source code statements of a source code program being edited. The code insertion engine extracts the surrounding context of the insertion point which includes the source code immediately preceding and the source code immediately following the insertion point. The code insertion engine uses a neural expansion model and a neural selector model to predict the one or more statements most likely to be inserted into the insertion point that are syntactically and semantically consistent with the surrounding context of the existing program.
    Type: Application
    Filed: June 16, 2022
    Publication date: December 21, 2023
    Inventors: Neelakantan Sundaresan, Alexey Svyatkovskiy
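A minimal sketch of the context extraction implied by the entry above: the program is split at the insertion point so a model can condition on both the code immediately before and the code immediately after it. The insertion marker and the split-by-line helper are assumptions for illustration.
```python
# Split a program at an insertion point so a model can see the code before
# and after it. The INSERT marker is a made-up placeholder token.

INSERT_MARK = "<INSERT>"

def split_context(source: str, line_no: int) -> tuple[str, str]:
    lines = source.splitlines()
    return "\n".join(lines[:line_no]), "\n".join(lines[line_no:])

program = "def mean(xs):\n    total = sum(xs)\n    return total / count\n"
before, after = split_context(program, 2)        # insert between lines 2 and 3
print(f"{before}\n{INSERT_MARK}\n{after}")
# A model might fill the hole with e.g. "    count = len(xs)".
```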
  • Patent number: 11836467
    Abstract: A code generation system uses a non-terminal expansion model and a non-terminal selector model to generate a code sketch to complete a partially-formed source code snippet. The non-terminal expansion model is a neural transformer model trained on a supervised dataset through reinforcement learning to learn to predict the production rule to expand for a given non-terminal symbol. The non-terminal selector model is trained through reinforcement learning to predict the non-terminal symbol to expand given a partial-code state. The models are used in a two-step beam search to generate the top candidate code sketches, where a candidate code sketch may contain a hole that represents an unexpanded non-terminal symbol.
    Type: Grant
    Filed: August 16, 2021
    Date of Patent: December 5, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Miltiadis Allamanis, Daya Guo, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11829282
    Abstract: An assert statement generator employs a neural transformer model with attention to generate candidate assert statements for a unit test method that tests a focal method. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with test-assert triplets. A test-assert triplet includes a source code snippet that includes: (1) a unit test method with an assert placeholder; (2) the focal method; and (3) a corresponding assert statement. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language, and the relationships between the code elements of the programming language and the syntax of an assert statement.
    Type: Grant
    Filed: October 27, 2020
    Date of Patent: November 28, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
  • Publication number: 20230362216
    Abstract: A system, computer-readable storage medium storing at least one program, and computer-implemented method for providing recommendations based on social network sharing activity are disclosed. Sharing activity relating to the sharing of a content item on a social network by a first user is accessed. Consumption information related to the consumption of the content item is also accessed. A correlation between the sharing activity and the consumption information is determined. A recommendation is then generated based on the correlation.
    Type: Application
    Filed: June 6, 2023
    Publication date: November 9, 2023
    Inventors: Neelakantan Sundaresan, Atish Das Sarma, Si Si, Elizabeth Churchill
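A toy illustration of the correlation step in the entry above: a user's sharing counts for content items are related to how much those items were consumed, and a recommendation is triggered when the correlation is strong. The data, the choice of Pearson correlation, and the 0.7 threshold are invented for the example (statistics.correlation requires Python 3.10+).
```python
# Correlate sharing activity with consumption and recommend when the
# relationship is strong. Data and threshold are invented.

from statistics import correlation   # Pearson correlation, Python 3.10+

shares = [5, 1, 0, 7, 2]                   # times the user shared each item
consumption = [40, 10, 2, 55, 15]          # minutes of consumption per item

r = correlation(shares, consumption)
if r > 0.7:                                # sharing strongly tracks consumption
    print(f"correlation {r:.2f}: recommend items similar to the most-shared ones")
```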