Patents by Inventor Neelakantan Sundaresan

Neelakantan Sundaresan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11455348
    Abstract: A system comprising a computer-readable storage medium storing at least one program, and a computer-implemented method for saving and presenting a state of a communication session are presented. The communication session may be established between a client device and an application server of a content publisher, and may include the presentation of content on the client device. In some embodiments, the method may include receiving user input to save a state of the communication session, and in response, temporarily storing session data representative of the state of the communication session for a predetermined duration of the communication session. The method may further include generating and presenting an interface that includes a visual representation of the session data, and allows a user to return to the saved state of the communication session.
    Type: Grant
    Filed: February 17, 2020
    Date of Patent: September 27, 2022
    Assignee: eBay Inc.
    Inventors: Esmeralda Carrillo, Kristy Brambila, Cassandra Gordon, Enrica Montilla Beltran, Neelakantan Sundaresan
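
To make the saved-session mechanism in the entry above concrete, here is a minimal sketch of an in-memory store that retains a saved session state only for a fixed duration. The class name, TTL value, and example session data are illustrative assumptions, not details from the patent.

```python
import time

class SessionStateStore:
    """Temporarily holds saved session states for a limited duration."""

    def __init__(self, ttl_seconds=1800):
        self._ttl = ttl_seconds            # how long a saved state is retained
        self._states = {}                  # session_id -> (timestamp, session_data)

    def save_state(self, session_id, session_data):
        # Record the state along with the time it was saved.
        self._states[session_id] = (time.time(), session_data)

    def get_state(self, session_id):
        # Return the saved state if it is still within its retention window.
        entry = self._states.get(session_id)
        if entry is None:
            return None
        saved_at, session_data = entry
        if time.time() - saved_at > self._ttl:
            del self._states[session_id]   # expired: discard the stale state
            return None
        return session_data

# Example: save the viewed item and scroll position of a browsing session.
store = SessionStateStore(ttl_seconds=600)
store.save_state("session-42", {"item_id": "12345", "scroll": 0.6})
print(store.get_state("session-42"))
```
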
  • Patent number: 11444991
    Abstract: A system, a computer-readable storage medium storing at least one program, and a computer-implemented method for providing recommendations based on social network sharing activity are presented. Sharing activity relating to the sharing of a content item on a social network by a first user is accessed. Consumption information related to the consumption of the content item is also accessed. A correlation between the sharing activity and the consumption information is determined. A recommendation is then generated based on the correlation.
    Type: Grant
    Filed: August 3, 2020
    Date of Patent: September 13, 2022
    Assignee: PayPal, Inc.
    Inventors: Neelakantan Sundaresan, Atish Das Sarma, Si Si, Elizabeth Churchill
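
A minimal sketch of the correlation idea described in the entry above: it measures how closely each item's consumption tracks its sharing activity and ranks items accordingly. The data, item names, and use of Pearson correlation are assumptions for illustration; the patent does not prescribe a specific correlation measure.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

# Daily share counts and consumption counts per content item (illustrative data).
sharing = {"video-a": [3, 5, 8, 13], "video-b": [4, 4, 5, 4]}
consumption = {"video-a": [30, 52, 81, 140], "video-b": [90, 10, 70, 20]}

# Recommend items whose consumption tracks their sharing activity most closely.
scores = {item: pearson(sharing[item], consumption[item]) for item in sharing}
recommendations = sorted(scores, key=scores.get, reverse=True)
print(recommendations)   # items ordered by sharing/consumption correlation
```
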
  • Patent number: 11436236
    Abstract: A term-weighting and document-scoring function is used to search for a command line interface (CLI) script that is likely relevant to an operation specified in a natural language query. CLI scripts are created to perform various operations of a CLI-based application. A CLI script is associated with a description document having keywords associated with the individual commands used in the CLI script. The relevance of a CLI script to an intended operation is based on the term-weighting and document-scoring function which is applied to each component of each command in a CLI script and weighted accordingly.
    Type: Grant
    Filed: May 1, 2020
    Date of Patent: September 6, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Spandan Garg, Yevhen Mohylevskyy, Jason R. Shaver, Neelakantan Sundaresan, Roshanak Zilouchian Moghaddam
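
The entry above describes scoring CLI script description documents against a natural language query. The sketch below uses a simple TF-IDF-style weight as a stand-in for the patented term-weighting and document-scoring function; the script names, keyword documents, and query are invented for illustration.

```python
import math
from collections import Counter

# Each CLI script has a description document: keywords drawn from its commands.
docs = {
    "create_vm.sh": "az vm create resource group image size location",
    "list_storage.sh": "az storage account list resource group output table",
    "delete_vm.sh": "az vm delete resource group name yes",
}

tokenized = {name: text.split() for name, text in docs.items()}
n_docs = len(tokenized)
df = Counter(term for terms in tokenized.values() for term in set(terms))

def score(query, terms):
    """Sum of tf-idf weights of query terms appearing in a description."""
    tf = Counter(terms)
    return sum(
        tf[t] * math.log(n_docs / df[t])
        for t in query.split()
        if t in tf and df[t] < n_docs        # ignore terms present in every document
    )

query = "delete a virtual machine in a resource group"
ranked = sorted(tokenized, key=lambda name: score(query, tokenized[name]), reverse=True)
print(ranked[0])   # most relevant script for the query
```
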
  • Publication number: 20220253712
    Abstract: An example generator tool generates an example illustrating correct usage of a command of a command line interface. A command may include a command name, zero or more subcommands, and one or more parameters with a corresponding parameter value. A template containing the correct syntax of the command is obtained from a template database. Parameter values for the template are generated by a neural transformer model with attention, given the command template as input.
    Type: Application
    Filed: April 19, 2021
    Publication date: August 11, 2022
    Inventors: Colin Bruce Clement, Roshanak Zilouchian Moghaddam, Neelakantan Sundaresan
  • Publication number: 20220244952
    Abstract: A source code generation system uses a neural transformer model with attention to predict candidate method bodies given a method docstring, method signature, and one or more method templates. The method templates are derived from intent-snippet pairs from StackOverflow question/answer pairs or template methods from GitHub. Joint embeddings are generated for the method bodies of the method templates and associated method docstrings for quick retrieval. A code completion system uses the source code generation system to generate candidate method bodies to complete a method signature and/or method docstring using the method templates.
    Type: Application
    Filed: April 1, 2021
    Publication date: August 4, 2022
    Inventors: Mikhail Breslav, Colin Bruce Clement, Dawn Drain, Changran Hu, Neelakantan Sundaresan, Chen Wu
  • Publication number: 20220245056
    Abstract: An automated program repair system uses a neural transformer model with attention to predict a bug-free version of a method having a source code bug identified in an associated stack trace. The neural transformer model is pre-trained with English language text and the source code of a target programming language. The pre-trained neural transformer model is trained to create synthetic bugs in bug-free methods. The bug-free methods with the synthetic bugs are executed with a test case to obtain a stack trace of the source code bug. The method with the synthetic bug, the original bug-free method, and the associated stack trace are then used to train the neural transformer model to predict repairs for buggy methods.
    Type: Application
    Filed: March 25, 2021
    Publication date: August 4, 2022
    Inventors: Colin Bruce Clement, Dawn Drain, Guillermo Serrato Castilla, Neelakantan Sundaresan
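
The training pipeline in the entry above relies on pairs of a bug-free method and a version with a synthetic bug. The sketch below hand-crafts one such pair by flipping a comparison operator; in the patent the bugs are created by a trained neural transformer, so this mutator is only an illustrative stand-in.

```python
import random
import re

# Hand-written mutation that mimics one kind of synthetic bug: flipping a
# comparison operator inside an otherwise bug-free method.
FLIPS = {"<=": ">", ">=": "<", "<": ">=", ">": "<=", "==": "!=", "!=": "=="}

def inject_bug(method_source, rng=random.Random(0)):
    """Return (buggy_source, original_source), or None if nothing to mutate."""
    ops = list(re.finditer(r"<=|>=|==|!=|<|>", method_source))
    if not ops:
        return None
    m = rng.choice(ops)
    buggy = method_source[:m.start()] + FLIPS[m.group()] + method_source[m.end():]
    return buggy, method_source

fixed = "def is_adult(age):\n    return age >= 18\n"
buggy, original = inject_bug(fixed)
print(buggy)       # training input: method with the synthetic bug
print(original)    # training target: the bug-free method
```
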
  • Patent number: 11403207
    Abstract: Runtime errors in a source code program are detected in advance of execution by machine learning models. Features representing a context of a runtime error are extracted from source code programs to train a machine learning model, such as a random forest classifier, to predict the likelihood that a code snippet has a particular type of runtime error. The features are extracted from a syntax-type tree representation of each method in a program. A model is generated for distinct runtime errors, such as arithmetic overflow and conditionally uninitialized variables.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: August 2, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Shaun Miller, Kalpathy Sitaraman Sivaraman, Neelakantan Sundaresan, Yijin Wei, Roshanak Zilouchian Moghaddam
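
A minimal sketch of the classification step described in the entry above, assuming features have already been extracted from each method's syntax-type tree. The feature names, training data, and hyperparameters are illustrative; the patent covers the overall approach rather than this exact scikit-learn usage.

```python
from sklearn.ensemble import RandomForestClassifier

# Each row: [uses_int_arithmetic, has_overflow_check, var_read_before_write, branch_count]
X_train = [
    [1, 0, 0, 2],
    [1, 1, 0, 3],
    [0, 0, 1, 5],
    [0, 0, 0, 1],
]
y_train = [1, 0, 1, 0]   # 1 = method exhibited the runtime error, 0 = it did not

# One classifier is trained per error type (e.g., arithmetic overflow).
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Predict the likelihood that a new method has this runtime error.
new_method_features = [[1, 0, 0, 4]]
print(model.predict_proba(new_method_features)[0][1])
```
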
  • Patent number: 11392354
    Abstract: A data mining technique is used to find large frequently-occurring source code patterns from methods/APIs that can be used in code development. Simplified trees that represent the syntactic structure and type and method usage of a source code fragment, such as a method, are mined to find closed and maximal frequent subtrees which represent the largest frequently-occurring source code patterns or idioms associated with a particular type and method usage. These idioms are then used in an idiom web service and/or a code completion system to assist users in the development of source code programs.
    Type: Grant
    Filed: March 31, 2020
    Date of Patent: July 19, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Christian Alma Bird, Shengyu Fu, Neelakantan Sundaresan, Nina Wang, Shuo Zhang
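
To illustrate the mining step in the entry above, the sketch below serializes the subtree rooted at every node of some simplified method trees and keeps the patterns that occur in at least two methods. It omits the closed/maximal refinement described in the patent, and the trees and support threshold are invented for illustration.

```python
from collections import Counter

# A simplified tree: (node_label, [children...]).
def serialize(tree):
    label, children = tree
    return f"{label}({','.join(serialize(c) for c in children)})"

def subtrees(tree):
    """Yield the serialized form of the subtree rooted at every node."""
    yield serialize(tree)
    for child in tree[1]:
        yield from subtrees(child)

# Simplified trees for three methods; two of them open a stream in a try/finally.
methods = [
    ("Method", [("Try", [("Call:open", []), ("Finally", [("Call:close", [])])])]),
    ("Method", [("Try", [("Call:open", []), ("Finally", [("Call:close", [])])]), ("Return", [])]),
    ("Method", [("If", [("Call:log", [])])]),
]

counts = Counter(s for m in methods for s in subtrees(m))
min_support = 2
idioms = [pattern for pattern, n in counts.items() if n >= min_support]
print(idioms)   # frequently-occurring source code patterns (candidate idioms)
```
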
  • Publication number: 20220214863
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model is able to learn to predict an ordered sequence of code elements that complete a line of source code, in programming languages both seen and unseen during training.
    Type: Application
    Filed: January 3, 2021
    Publication date: July 7, 2022
    Inventors: Colin Bruce Clement, Shuai Lu, Neelakantan Sundaresan, Alexey Svyatkovskiy, Duyu Tang
  • Patent number: 11379190
    Abstract: A code completion tool uses a deep learning model to predict the likelihood of a method completing a method invocation. In one aspect, the deep learning model is an LSTM trained on features that represent the syntactic context of a method invocation derived from an abstract tree representation of the code fragment.
    Type: Grant
    Filed: April 18, 2021
    Date of Patent: July 5, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Alexey Svyatkovskiy, Shengyu Fu, Neelakantan Sundaresan, Ying Zhao
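
A minimal PyTorch sketch of an LSTM that scores candidate methods to complete an invocation, assuming the syntactic context has already been encoded as token ids. The vocabulary size, dimensions, and candidate set are illustrative; this is not the architecture claimed in the patent, only a sketch of the general idea.

```python
import torch
import torch.nn as nn

class InvocationCompletionLSTM(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, num_methods=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_methods)   # one score per candidate method

    def forward(self, context_tokens):
        embedded = self.embed(context_tokens)
        _, (hidden, _) = self.lstm(embedded)
        return self.out(hidden[-1])                     # logits over candidate methods

model = InvocationCompletionLSTM()
context = torch.randint(0, 1000, (1, 20))               # encoded syntactic context, batch of 1
probabilities = torch.softmax(model(context), dim=-1)
top = torch.topk(probabilities, k=3)
print(top.indices)                                      # ids of the 3 most likely methods
```
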
  • Publication number: 20220164672
    Abstract: An automated system for resolving program merges uses a sequence-to-sequence supervised machine learning model trained from developer-resolved merge conflicts to learn to predict a merge resolution to resolve a three-way program merge. The model utilizes an embedding of the merge tuple (A, B, O) which represents the program syntax, program semantics and the intent of the program inputs. The model uses a pointer mechanism to construct the resolved program in terms of the lines of source code found in the input programs.
    Type: Application
    Filed: February 12, 2021
    Publication date: May 26, 2022
    Inventors: Christian Bird, Elizabeth Dinella, Shuvendu K. Lahiri, Todd Douglas Mytkowicz, Neelakantan Sundaresan, Alexey Svyatkovskiy
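
The pointer mechanism in the entry above constrains the output to lines that already exist in programs A, B, and O. The sketch below applies a hand-written list of (source, line) pointers in place of the model's predictions; the example programs and pointer sequence are invented for illustration.

```python
def resolve(a_lines, b_lines, o_lines, pointers):
    """Build the merged program from (source, line_index) pointers."""
    sources = {"A": a_lines, "B": b_lines, "O": o_lines}
    return [sources[src][idx] for src, idx in pointers]

base = ["def total(items):", "    return sum(items)"]
ours = ["def total(items):", "    items = [i for i in items if i]", "    return sum(items)"]
theirs = ["def total(items):", "    return sum(items) or 0"]

# A hypothetical prediction interleaving lines from both edited programs.
predicted_pointers = [("A", 0), ("A", 1), ("B", 1)]
print("\n".join(resolve(ours, theirs, base, predicted_pointers)))
```
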
  • Publication number: 20220164626
    Abstract: An automated system for resolving program merges uses neural transformers with attention. In one aspect, a neural encoder transformer model is trained from developer-resolved merge conflicts to learn to predict a resolution strategy that aids a developer in constructing a merged program. In a second aspect, a neural decoder transformer model is trained on the syntax and semantics of different source code programming languages to predict a merge resolution consisting of interleaved lines of source code from programs A, B, or O, where programs A and B contain changes to code base O.
    Type: Application
    Filed: February 12, 2021
    Publication date: May 26, 2022
    Inventors: Christian Bird, Shuvendu K. Lahiri, Todd Douglas Mytkowicz, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Publication number: 20220147321
    Abstract: A code completion tool uses a neural transformer model to generate candidate sequences to complete a line of source code. The neural transformer model is trained using a conditional language modeling objective on a large unsupervised dataset that includes source code programs written in several different programming languages. The neural transformer model is used within a beam search that predicts the most likely candidate sequences for a code snippet under development.
    Type: Application
    Filed: January 20, 2022
    Publication date: May 12, 2022
    Inventors: Alexey Svyatkovskiy, Shengyu Fu, Neelakantan Sundaresan, Shao Kun Deng
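
A minimal sketch of the beam search used at decoding time in the entry above. The toy next-token distribution stands in for the neural transformer model, and the beam width, tokens, and probabilities are illustrative assumptions.

```python
import math

def beam_search(next_token_probs, start, beam_width=3, max_len=8, end_token="<eol>"):
    """Keep the beam_width highest-scoring partial sequences at every step."""
    beams = [([start], 0.0)]                      # (token sequence, log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:
                candidates.append((seq, score))   # finished sequences carry over
                continue
            for token, prob in next_token_probs(seq).items():
                candidates.append((seq + [token], score + math.log(prob)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Stand-in for the neural transformer: a toy distribution over the next token.
def toy_model(seq):
    if seq[-1] == "print":
        return {"(": 0.9, ".": 0.1}
    if seq[-1] == "(":
        return {"'hello'": 0.6, "x": 0.4}
    return {")": 0.7, "<eol>": 0.3}

for seq, score in beam_search(toy_model, "print"):
    print(" ".join(seq), round(score, 3))
```
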
  • Patent number: 11307831
    Abstract: A code completion system uses neural components to rank the unordered list of code completion candidates generated from an existing static analyzer. The candidates represent the next sequence of tokens likely to complete a partially-formed program element as a developer is typing in a software development tool. A re-ranking component generates a ranked order of the candidates based on a context embedding of the code context and candidate embeddings of the candidates, where both embeddings are based on a common token encoding.
    Type: Grant
    Filed: June 15, 2020
    Date of Patent: April 19, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Miltiadis Allamanis, Shengyu Fu, Xiaoyu Liu, Neelakantan Sundaresan, Alexey Svyatkovskiy
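
To illustrate the re-ranking step in the entry above, the sketch below orders a static analyzer's unordered candidates by cosine similarity between a context embedding and each candidate embedding. The embedding values and member names are invented; in the system the embeddings come from the trained neural components.

```python
from math import sqrt

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Embedding of the code context around the cursor and embeddings of the
# static analyzer's completion candidates (toy 3-dimensional values).
context_embedding = [0.2, 0.9, 0.1]
candidate_embeddings = {
    "ToString": [0.1, 0.3, 0.9],
    "Substring": [0.3, 0.8, 0.2],
    "Length": [0.9, 0.1, 0.1],
}

# Rank the unordered candidate list by similarity to the context.
ranked = sorted(candidate_embeddings,
                key=lambda name: cosine(context_embedding, candidate_embeddings[name]),
                reverse=True)
print(ranked)   # ['Substring', 'ToString', 'Length']
```
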
  • Publication number: 20220116483
    Abstract: The present disclosure is directed to providing supplemental content to one or more client devices requesting multimedia content. The supplemental content may be determined or selected based on the availability of one or more device channels, such as a display, speakers, or other component of the one or more client devices capable of providing an output. The supplemental content may also be selected based on one or more characteristics of the requested multimedia content, such as a genre, subject matter, or duration. Furthermore, the supplemental content may be determined or selected based on the portions of the requested multimedia content that are the most prominent or significant, such as any audio content, any video content, and/or any textual content. The supplemental content may be provided to the one or more client devices such that it is displayed before, during, or after the display of the requested multimedia content.
    Type: Application
    Filed: December 20, 2021
    Publication date: April 14, 2022
    Inventor: Neelakantan Sundaresan
  • Publication number: 20220066747
    Abstract: A unit test generation system employs a neural transformer model with attention to generate candidate unit test sequences given a focal method of a programming language. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with mapped test case pairs. A mapped test case pair includes a focal method and a unit test case for the focal method. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language and the relationships between the code elements of the programming language and the syntax of a unit test case.
    Type: Application
    Filed: October 27, 2020
    Publication date: March 3, 2022
    Inventors: James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
  • Publication number: 20220066914
    Abstract: An assert statement generator employs a neural transformer model with attention to generate candidate assert statements for a unit test method that tests a focal method. The neural transformer model is pre-trained with source code programs and natural language text and fine-tuned with test-assert triplets. A test-assert triplet includes a source code snippet that includes: (1) a unit test method with an assert placeholder; (2) the focal method; and (3) a corresponding assert statement. In this manner, the neural transformer model is trained to learn the semantics and statistical properties of a natural language, the syntax of a programming language, and the relationships between the code elements of the programming language and the syntax of an assert statement.
    Type: Application
    Filed: October 27, 2020
    Publication date: March 3, 2022
    Inventors: James Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
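
A minimal sketch of how a test-assert triplet might be assembled from a unit test and its focal method by masking the assert statement. It uses Python-style asserts and a hypothetical placeholder token purely for illustration; the patent does not specify this exact preprocessing code.

```python
import re

ASSERT_PLACEHOLDER = "<ASSERT_PLACEHOLDER>"

def make_triplet(unit_test_source, focal_method_source):
    """Replace the assert statement with a placeholder and return the triplet."""
    match = re.search(r"^\s*assert .*$", unit_test_source, flags=re.MULTILINE)
    if match is None:
        return None
    assert_statement = match.group().strip()
    masked_test = unit_test_source.replace(match.group(), "    " + ASSERT_PLACEHOLDER)
    return masked_test, focal_method_source, assert_statement

focal = "def add(a, b):\n    return a + b\n"
test = "def test_add():\n    result = add(2, 3)\n    assert result == 5\n"

masked, focal_method, target_assert = make_triplet(test, focal)
print(masked)          # unit test with the assert placeholder
print(target_assert)   # the corresponding assert statement (training target)
```
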
  • Patent number: 11262984
    Abstract: A code completion tool uses a neural transformer model to generate candidate sequences to complete a line of source code. The neural transformer model is trained using a conditional language modeling objective on a large unsupervised dataset that includes source code programs written in several different programming languages. The neural transformer model is used within a beam search that predicts the most likely candidate sequences for a code snippet under development.
    Type: Grant
    Filed: November 11, 2019
    Date of Patent: March 1, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Alexey Svyatkovskiy, Shengyu Fu, Neelakantan Sundaresan, Shao Kun Deng
  • Publication number: 20220058007
    Abstract: Language interoperability between source code programs not compatible with an interprocedural static code analyzer is achieved through language-independent representations of the programs. The source code programs are transformed into respective intermediate language instructions from which a language-independent control flow graph and a language-independent type environment are created. A program compatible with the interprocedural static code analyzer is generated from the language-independent control flow graph and the language-independent type environment in order to utilize the interprocedural static code analyzer to detect memory safety faults.
    Type: Application
    Filed: November 4, 2021
    Publication date: February 24, 2022
    Inventors: Shao Kun Deng, Matthew Glenn Jin, Shuvendu Lahiri, Xiaoyu Liu, Xin Shi, Neelakantan Sundaresan
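
To illustrate the control-flow-graph step in the entry above, the sketch below builds a simple graph over a toy intermediate language of (opcode, argument) instructions. The opcodes and graph representation are illustrative assumptions, not the intermediate language or data structures used by the actual analyzer.

```python
def build_control_flow_graph(instructions):
    """Map each instruction index to the indices it can flow to."""
    graph = {}
    for i, (opcode, arg) in enumerate(instructions):
        successors = []
        if opcode == "jump":
            successors.append(arg)                 # unconditional transfer
        elif opcode == "branch":
            successors.extend([arg, i + 1])        # taken and fall-through edges
        elif opcode != "return" and i + 1 < len(instructions):
            successors.append(i + 1)               # ordinary fall-through
        graph[i] = successors
    return graph

program = [
    ("load", "x"),        # 0
    ("branch", 4),        # 1: go to 4 if the condition holds, else fall through
    ("store", "y"),       # 2
    ("jump", 5),          # 3
    ("store", "z"),       # 4
    ("return", None),     # 5
]
print(build_control_flow_graph(program))
```
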
  • Patent number: 11250038
    Abstract: An interactive question and answer (Q&A) service provides pairs of questions and corresponding answers related to the content of a web page. The service includes pre-configured Q&A pairs derived from a deep learning framework that includes a series of neural networks trained through joint and transfer learning to generate questions for a given text passage. In addition, pre-configured Q&A pairs are generated from historical web access patterns and sources related to the content of the web page.
    Type: Grant
    Filed: August 13, 2018
    Date of Patent: February 15, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Payal Bajaj, Gearard Boland, Anshul Gupta, Matthew Glenn Jin, Eduardo Enrique Noriega De Armas, Jason Shaver, Neelakantan Sundaresan, Roshanak Zilouchian Moghaddam