Patents by Inventor Neelakantan Sundaresan

Neelakantan Sundaresan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240248686
    Abstract: A pre-trained neural code generation model generates repair code for a method containing a performance bug given a prompt including a code transformation instruction. The code transformation instruction guides the model on how to predict the repair code when the model has not been fine-tuned for the repair code task. The code transformation instruction is retrieved from abstract bug patterns derived from historical performance bug fixes found in commits to a source code repository. The augmentation of the code transformation instruction in the prompt to the pre-trained neural code generation model provides the model with a hint on how the repair code may be generated based on similar performance bug fixes.
    Type: Application
    Filed: March 21, 2023
    Publication date: July 25, 2024
    Inventors: Spandan Garg, Neelakantan Sundaresan, Roshanak Zilouchian Moghaddam
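The prompt augmentation this abstract describes can be sketched with a deliberately small, hypothetical example: a code transformation instruction, retrieved for an abstract bug pattern, is prepended to the buggy method so the pre-trained model gets a hint. The pattern catalog, pattern id, and prompt layout below are illustrative, not from the patent.

```python
# Hypothetical catalog mapping abstract bug patterns to
# code transformation instructions mined from historical fixes.
BUG_PATTERNS = {
    "repeated-concat-in-loop":
        "Replace string concatenation inside the loop with a list "
        "append followed by a single join.",
}

def build_repair_prompt(buggy_method: str, pattern_id: str) -> str:
    """Compose a prompt pairing the retrieved instruction with the bug."""
    instruction = BUG_PATTERNS[pattern_id]
    return (f"# Instruction: {instruction}\n"
            f"# Buggy method:\n{buggy_method}\n"
            f"# Repaired method:\n")

buggy = "def join_all(xs):\n    s = ''\n    for x in xs:\n        s += x\n    return s"
prompt = build_repair_prompt(buggy, "repeated-concat-in-loop")
```

The instruction rides along in the prompt text itself, which is why no fine-tuning of the underlying model is needed.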
  • Patent number: 12045592
    Abstract: An automated system for translating source code written in one programming language into a different programming language utilizes a neural transformer with attention trained on semi-supervised data. The model is jointly pre-trained with a masked language model objective and an autoregressive objective on a large unsupervised source code corpus to learn to comprehend the syntactic structure and semantics of source code. The pre-trained model is then fine-tuned with a token-type prediction objective and an autoregressive objective on supervised translation tasks and data augmented tasks to learn to translate source code from one programming language into a different programming language.
    Type: Grant
    Filed: March 25, 2021
    Date of Patent: July 23, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Chen Wu
  • Patent number: 12039295
    Abstract: A code completion tool uses a neural transformer model with attention to generate syntactically-correct candidates with holes to complete a partially-formed code snippet. The model is trained to predict the expansion of non-terminal symbols of the production rules of the underlying grammar of the code snippet without being constrained to a left-to-right expansion order. A hole is a non-terminal symbol of the grammar of a programming language that marks a position in a candidate where the code completion engine is not certain of the production rule that should be used to expand the non-terminal symbol. The hole allows the code completion engine to expand other non-terminal symbols in a candidate and allows the user to guide the expansion of the holes in a candidate.
    Type: Grant
    Filed: May 15, 2021
    Date of Patent: July 16, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Miltiadis Allamanis, Daya Guo, Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy
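The hole mechanism can be illustrated with a toy grammar: non-terminals are expanded where a production is chosen, and a non-terminal with no chosen production is left in place as a hole for the user to fill. The grammar below is illustrative only.

```python
import re

# Toy grammar: "<expr>" deliberately has no production, so it
# remains unexpanded and becomes a hole in the candidate.
GRAMMAR = {
    "<stmt>": "<target> = <expr>",
    "<target>": "result",
}

def expand(symbol: str) -> str:
    """Recursively expand non-terminals; unknown ones stay as holes."""
    if symbol not in GRAMMAR:
        return symbol  # uncertain non-terminal -> hole
    return re.sub(r"<\w+>", lambda m: expand(m.group(0)), GRAMMAR[symbol])

candidate = expand("<stmt>")
# candidate == "result = <expr>"
```

Note the expansion order is not forced to be left-to-right: any non-terminal with a production can be expanded while others remain holes.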
  • Publication number: 20240232519
    Abstract: Generally discussed herein are devices, systems, and methods for generating an automatic interactive digital notebook completion model. A method can include receiving notebook content of an interactive digital notebook, the notebook content including a markdown cell followed by a code cell. The method can include generating input/output examples by, for each input/output example, masking one of (i) content of the markdown cell or (ii) content of the code cell, resulting in a masked cell; identifying the masked cell and the content of whichever of the markdown cell or the code cell is not masked as the input for the input/output example; and identifying the content of the masked cell as the output for the input/output example. The method can include training, based on the input/output examples, a natural language processing model that generates a prediction of the content of a second masked cell as an output.
    Type: Application
    Filed: March 20, 2024
    Publication date: July 11, 2024
    Inventors: Colin Bruce Clement, Shubham Chandel, Guillermo Serrato Castilla, Neelakantan Sundaresan
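The input/output example construction above can be sketched in a few lines: given a (markdown, code) cell pair, mask one cell and use the unmasked cell plus the mask marker as input, with the masked cell's content as output. The mask token and function names are illustrative.

```python
MASK = "<mask>"

def make_examples(markdown: str, code: str):
    """Return two (input, output) examples, one per masked cell."""
    return [
        ((MASK, code), markdown),   # mask the markdown cell
        ((markdown, MASK), code),   # mask the code cell
    ]

pairs = make_examples("# Load the data", "df = pd.read_csv('data.csv')")
```

Each notebook cell pair therefore yields two training examples, one in each masking direction.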
  • Patent number: 12032936
    Abstract: A code adaptation mechanism automatically integrates the variable names of a pasted source code snippet into variable names defined in a pre-existing partial source code program. The variable names from the pasted source code snippet are replaced with anonymized values. A deep learning model predicts the most likely variable name from the pre-existing partial source code program to replace each anonymized value. The deep learning model is trained on numerous variable usage patterns from various source code programs to learn to predict the most likely mapping of an undefined variable name from the pasted source code snippet to a variable name in the pre-existing partial source code program thereby generating a syntactically and semantically correct program.
    Type: Grant
    Filed: March 24, 2022
    Date of Patent: July 9, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Miltiadis Allamanis, Shengyu Fu, Xiaoyu Liu, Neelakantan Sundaresan, Alexey Svyatkovskiy
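The anonymization step described above can be illustrated with a toy lexer: identifiers in the pasted snippet are replaced with placeholder values (VAR_0, VAR_1, ...) that a model would later map onto names defined in the surrounding program. The regex-based tokenization and keyword list are simplifications for illustration.

```python
import re

# A tiny keyword list so language keywords are not anonymized;
# a real tool would use a proper lexer for the target language.
PY_KEYWORDS = {"def", "return", "for", "in", "if", "else", "while"}

def anonymize(snippet: str):
    """Replace each distinct identifier with a stable placeholder."""
    mapping = {}
    def repl(m):
        name = m.group(0)
        if name in PY_KEYWORDS:
            return name
        if name not in mapping:
            mapping[name] = f"VAR_{len(mapping)}"
        return mapping[name]
    return re.sub(r"[A-Za-z_]\w*", repl, snippet), mapping

anon, mapping = anonymize("total = total + price")
# anon == "VAR_0 = VAR_0 + VAR_1"
```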
  • Publication number: 20240220215
    Abstract: Custom source code generation models are generated by tuning a pre-trained deep learning model by freezing the model parameters and optimizing a prefix. The tuning process is distributed across a user space and a model space where the embedding and output layers are performed in the user space and the execution of the model is performed in a model space that is isolated from the user space. The tuning process updates the embeddings of the prefix across the separate execution spaces in a manner that preserves the privacy of the data used in the tuning process.
    Type: Application
    Filed: March 13, 2024
    Publication date: July 4, 2024
    Inventors: Colin Bruce Clement, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano, Andrei Zlotchevski
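The frozen-model/trainable-prefix split can be shown with a deliberately tiny numeric example: a one-parameter "model" whose weight is never updated while gradient descent adjusts only the prefix. The values and the user-space/model-space comments are illustrative; nothing here is from the patent.

```python
w = 2.0          # frozen pre-trained weight (model space, never updated)
prefix = 0.0     # trainable prefix embedding (user space)
x, target = 1.0, 5.0

for _ in range(100):
    y = w * (x + prefix)           # forward pass through the frozen model
    grad = 2 * w * (y - target)    # d(loss)/d(prefix); w gets no gradient
    prefix -= 0.1 * grad           # only the prefix is optimized

# The loss (y - target)**2 is driven to zero while w stays at 2.0.
```

Because only the prefix embedding crosses the boundary between the two execution spaces, the model weights and the tuning data never have to share a space.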
  • Publication number: 20240223640
    Abstract: The present disclosure is directed to providing supplemental content to one or more client devices requesting multimedia content. The supplemental content may be determined or selected based on the availability of one or more device channels, such as a display, speakers, or other component of the one or more client devices capable of providing an output. Furthermore, the supplemental content may be determined or selected based on the portions of the requested multimedia content that are the most prominent or significant, such as any audio content, any video content, and/or any textual content. The supplemental content may be provided to the one or more client devices such that it is displayed before, during, or after the display of the requested multimedia content.
    Type: Application
    Filed: March 12, 2024
    Publication date: July 4, 2024
    Inventor: Neelakantan Sundaresan
  • Publication number: 20240211224
    Abstract: A neural transcompilation model is tested with a set of syntax unit tests to determine the syntax elements of a source code program written in a source programming language that fail to translate properly into a target programming language. The syntax elements having a translation defect are identified and ranked according to a translation failure rate. The neural transcompilation model is then fine-tuned with training samples of the syntax elements having the highest translation failure rates and their paired correct translations in order to teach the model to learn the association between the syntax elements of a source programming language causing translation defects and their correct translations in a target programming language.
    Type: Application
    Filed: December 23, 2022
    Publication date: June 27, 2024
    Inventors: Colin Bruce Clement, Yufan Huang, Neelakantan Sundaresan, Yiding Tian, Maoquan Wang
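The ranking step can be sketched directly: count passes and failures per syntax element from the unit test results, compute a failure rate, and sort. The element names and results below are made up for illustration.

```python
from collections import Counter

# (syntax element, test passed) pairs from the syntax unit tests.
results = [
    ("list-comprehension", False), ("list-comprehension", False),
    ("list-comprehension", True),
    ("ternary", False), ("ternary", True), ("ternary", True),
    ("for-loop", True), ("for-loop", True),
]

totals, failures = Counter(), Counter()
for element, passed in results:
    totals[element] += 1
    if not passed:
        failures[element] += 1

failure_rate = {e: failures[e] / totals[e] for e in totals}
ranked = sorted(failure_rate, key=failure_rate.get, reverse=True)
# ranked[0] is the element to prioritize when building fine-tuning samples
```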
  • Publication number: 20240192927
    Abstract: A deep learning model trained to learn to predict source code is tuned for a target source code generation task through reinforcement learning using a reward score that considers the quality of the source code predicted during the tuning process. The reward score is adjusted to consider code-quality factors and source code metrics. The code-quality factors account for the predicted source code having syntactic correctness, successful compilation, successful execution, successful invocation, readability, functional correctness, and coverage. The source code metrics generate a score based on how close the predicted source code is to a ground truth code.
    Type: Application
    Filed: February 20, 2024
    Publication date: June 13, 2024
    Inventors: Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
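One way to read the adjusted reward is as a weighted combination of binary code-quality checks and a similarity score against the ground truth. The checks, weights, and function shape below are an illustrative sketch, not the patent's formula.

```python
def reward(checks: dict, similarity: float, quality_weight: float = 0.5) -> float:
    """checks: name -> bool (e.g. parses, compiles, tests pass).
    similarity: score in [0, 1] against the ground truth code."""
    quality = sum(checks.values()) / len(checks)  # fraction of checks passed
    return quality_weight * quality + (1 - quality_weight) * similarity

r = reward({"parses": True, "compiles": True, "tests_pass": False},
           similarity=0.8)
# quality = 2/3, so r = 0.5 * 2/3 + 0.5 * 0.8
```

A scalar like this is what a reinforcement learning loop would maximize when tuning the model.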
  • Publication number: 20240184570
    Abstract: A retrieval-augmented neural transformer model with chunk cross-attention predicts a code review given a proposed source code change, represented as a code diff hunk, and a set of historical code review comments. The code diff hunk represents proposed edits to a source code snippet with its surrounding context that has not been changed. The historical code review comments are associated with code edits that are semantically similar to the proposed source code changes. The code diff hunk is partitioned into chunks which are used to find semantically similar historical code review comments. The set of historical code review comments is aggregated and used to guide the model in making its predictions.
    Type: Application
    Filed: December 5, 2022
    Publication date: June 6, 2024
    Inventors: Shengyu Fu, Xiaoyu Liu, Neelakantan Sundaresan, Alexey Svyatkovskiy
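The chunk-and-retrieve flow can be sketched with token overlap standing in for the learned semantic similarity: split the diff hunk into fixed-size chunks, then fetch the historical review comment whose associated edit overlaps each chunk the most. The chunk size, history entries, and overlap measure are all illustrative.

```python
def chunks(tokens, size=4):
    """Partition a token sequence into fixed-size chunks."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def retrieve(chunk, history):
    """history: list of (edit_tokens, review_comment) pairs.
    Token-set overlap is a stand-in for learned similarity."""
    return max(history, key=lambda h: len(set(chunk) & set(h[0])))[1]

hunk = "for i in range ( len ( xs ) )".split()
history = [
    (["range", "len", "enumerate"], "Prefer enumerate over range(len(...))."),
    (["open", "close", "with"], "Use a with-statement to close the file."),
]
comments = {retrieve(c, history) for c in chunks(hunk)}
```

The aggregated comment set is what would condition the model's cross-attention when generating the review.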
  • Patent number: 12002264
    Abstract: In various example embodiments, a system and method for using camera metadata for making recommendations are presented. At least one image file having camera metadata is received. The camera metadata of the at least one image file is analyzed to determine improvements to image capture aspects associated with the at least one image file. Feedback related to the improvements to the image capture aspects associated with the at least one image file is generated. In some embodiments, the feedback may be used to generate camera and other product upgrade recommendations.
    Type: Grant
    Filed: May 26, 2023
    Date of Patent: June 4, 2024
    Assignee: eBay Inc.
    Inventor: Neelakantan Sundaresan
  • Patent number: 11995705
    Abstract: In various example embodiments, a system and method for an electronic commerce file system are provided. In example embodiments, a selection of an item contained in a folder of an electronic commerce file system is received. The item is offered for sale by an electronic commerce provider, and the electronic commerce file system resides locally on a client device. Based on a type of the folder, a set of actions are provided for selection, with the set of actions to be performed with respect to the item. A selection of an action to be performed with respect to the item is received. The action is performed with respect to the item, with the action being performed between the electronic commerce file system and the electronic commerce provider via a network.
    Type: Grant
    Filed: December 20, 2022
    Date of Patent: May 28, 2024
    Assignee: eBay Inc.
    Inventors: Sandra Lynn Godsey, Neelakantan Sundaresan
  • Publication number: 20240160435
    Abstract: A deep learning model is pre-trained with a large-scale of unsupervised data of code review tasks in order to learn the relationships between code changes and a code review. The pre-trained deep learning model predicts a code review given a code diff hunk in a code diff format. The code diff hunk includes the changed code and its surrounding context. The pre-trained deep learning model may then be fine-tuned with supervised data in order to make predictions for several code review activities, such as, code change quality estimation and code refinement.
    Type: Application
    Filed: November 12, 2022
    Publication date: May 16, 2024
    Inventors: Nan Duan, Shengyu Fu, Shuai Lu, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Publication number: 20240160940
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention in various configurations on a large corpus of unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task thereby generating a tool to perform the software engineering task.
    Type: Application
    Filed: January 17, 2024
    Publication date: May 16, 2024
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11983513
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model is able to learn to predict an ordered sequence of code elements that complete a line of source code in programming languages both seen and unseen during training.
    Type: Grant
    Filed: May 24, 2023
    Date of Patent: May 14, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Colin Bruce Clement, Shuai Lu, Neelakantan Sundaresan, Alexey Svyatkovskiy, Duyu Tang
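The file-level/local-context split above can be illustrated with Python's own `ast` module: file-level context is, roughly, the set of class and function declarations visible in the file, while the local context is a particular body. This is one plausible extraction, not the patent's.

```python
import ast

source = """
class Greeter:
    def greet(self, name):
        return "Hello, " + name
"""

tree = ast.parse(source)
# File-level context: every class and function declared in the file.
file_level = [node.name for node in ast.walk(tree)
              if isinstance(node, (ast.ClassDef, ast.FunctionDef))]
# file_level == ["Greeter", "greet"]
```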
  • Patent number: 11977474
    Abstract: An automated program repair tool utilizes a neural transformer model with attention to predict the contents of a bug repair in the context of source code having a bug of an identified bug type. The neural transformer model is trained on a large unsupervised corpus of source code using a span-masking denoising optimization objective, and fine-tuned on a large supervised dataset of triplets containing a bug-type annotation, software bug, and repair. The bug-type annotation is derived from an interprocedural static code analyzer. A bug type edit centroid is computed for each bug type and used in the inference decoding phase to generate the bug repair.
    Type: Grant
    Filed: November 25, 2022
    Date of Patent: May 7, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
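The edit-centroid step can be sketched concretely: given vector representations of historical edits grouped by bug type, the centroid is simply their mean. The 2-D toy vectors and bug-type labels below are illustrative, not model outputs.

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    dims = len(vectors[0])
    return tuple(sum(v[d] for v in vectors) / len(vectors)
                 for d in range(dims))

# Toy edit embeddings grouped by bug type.
edits_by_bug_type = {
    "NULL_DEREFERENCE": [(1.0, 0.0), (0.0, 1.0)],
    "RESOURCE_LEAK": [(2.0, 2.0), (4.0, 0.0)],
}
centroids = {t: centroid(vs) for t, vs in edits_by_bug_type.items()}
# centroids["NULL_DEREFERENCE"] == (0.5, 0.5)
```

At inference time, the centroid for the annotated bug type is the per-type signal that steers decoding toward typical repairs for that bug.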
  • Publication number: 20240143292
    Abstract: A unit test generation system employs a neural transformer model with attention to generate candidate unit test sequences given a focal method of a programming language. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with mapped test case pairs. A mapped test case pair includes a focal method and a unit test case for the focal method. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language and the relationships between the code elements of the programming language and the syntax of a unit test case.
    Type: Application
    Filed: December 26, 2023
    Publication date: May 2, 2024
    Inventors: Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
  • Patent number: 11972232
    Abstract: A code completion tool uses a neural transformer model with attention to generate candidate sequences to complete a method body of a method signature. The neural transformer model is trained with source code programs and natural language text. The neural transformer model learns the meaning of a method name, its corresponding method parameters and types from a large corpus of unsupervised dataset of source code methods and a supervised dataset of tasks including source code constructs in combination with natural language docstrings to infer a candidate sequence of subtokens that represent a method body for a particular method signature.
    Type: Grant
    Filed: June 10, 2020
    Date of Patent: April 30, 2024
    Assignee: Microsoft Technology Licensing, LLC.
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Publication number: 20240134614
    Abstract: A source code patch generation system uses the context of a buggy source code snippet of a source code program and a hint to predict a source code segment that repairs the buggy source code snippet. The hint is a source code segment that is semantically-similar to the buggy source code snippet where the similarity is based on a context of the buggy source code snippet. An autoregressive deep learning model uses the context of the buggy source code snippet and the hint to predict the most likely source code segment to repair the buggy source code snippet.
    Type: Application
    Filed: October 14, 2022
    Publication date: April 25, 2024
    Inventors: Amandeep Singh Bakshi, Xin Shi, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11962634
    Abstract: The present disclosure is directed to providing supplemental content to one or more client devices requesting multimedia content. The supplemental content may be determined or selected based on the availability of one or more device channels, such as a display, speakers, or other component of the one or more client devices capable of providing an output. The supplemental content may also be selected based on one or more characteristics of the requested multimedia content, such as a genre, subject matter, or duration. Furthermore, the supplemental content may be determined or selected based on the portions of the requested multimedia content that are the most prominent or significant, such as any audio content, any video content, and/or any textual content. The supplemental content may be provided to the one or more client devices such that it is displayed before, during, or after the display of the requested multimedia content.
    Type: Grant
    Filed: December 20, 2021
    Date of Patent: April 16, 2024
    Assignee: eBay Inc.
    Inventor: Neelakantan Sundaresan