Patents by Inventor Neelakantan Sundaresan

Neelakantan Sundaresan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11662984
    Abstract: A data mining technique is used to find large frequently-occurring source code patterns from methods/APIs that can be used in code development. Simplified trees that represent the syntactic structure and type and method usage of a source code fragment, such as a method, are mined to find closed and maximal frequent subtrees which represent the largest frequently-occurring source code patterns or idioms associated with a particular type and method usage. These idioms are then used in an idiom web service and/or a code completion system to assist users in the development of source code programs.
    Type: Grant
    Filed: June 28, 2022
    Date of Patent: May 30, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Christian Alma Bird, Shengyu Fu, Neelakantan Sundaresan, Nina Wang, Shuo Zhang
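    A minimal Python sketch of the idiom-mining idea in patent 11662984, assuming a drastically simplified notion of a pattern (depth-one parent/child node-type shapes counted against a support threshold, rather than closed or maximal frequent subtrees); the corpus and helper names are illustrative only:

        # Simplified sketch: mine frequently occurring syntactic shapes from code
        # fragments. A "pattern" here is just (parent node type, child node types).
        import ast
        from collections import Counter

        def depth_one_patterns(source):
            """Yield (node type, tuple of child node types) for every AST node."""
            for node in ast.walk(ast.parse(source)):
                children = tuple(type(c).__name__ for c in ast.iter_child_nodes(node))
                if children:
                    yield (type(node).__name__, children)

        def mine_idioms(corpus, min_support=2):
            """Return patterns occurring in at least `min_support` fragments."""
            counts = Counter()
            for fragment in corpus:
                counts.update(set(depth_one_patterns(fragment)))
            return [p for p, c in counts.items() if c >= min_support]

        corpus = [
            "with open(p) as f:\n    data = f.read()",
            "with open(path) as fh:\n    text = fh.read()",
        ]
        for pattern in mine_idioms(corpus):
            print(pattern)   # shapes shared by both fragments, e.g. the with/read idiom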
  • Patent number: 11656851
    Abstract: The syntax elements of a source code program used to represent the context of a focal method are selected based on a priority order. The selected syntax elements are input into a fixed-size context window that is used to train a neural transformer model with attention to generate source code and is then used by the trained model when generating source code. The context window contains prioritized sequences of tokens that extend beyond the target focus in order to provide longer visibility back into the source code program for the model to learn predictive patterns. This gives the model a file-level context of the source code program without increasing the size of the context window.
    Type: Grant
    Filed: October 22, 2021
    Date of Patent: May 23, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Colin Bruce Clement, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
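    A minimal Python sketch of the fixed-size, priority-ordered context window described in patent 11656851; the priority values, the whitespace tokenizer, and the sample elements are illustrative stand-ins, not the patented selection order or a real subword tokenizer:

        # Sketch: fill a fixed-size context window with code-context elements taken
        # in priority order, so the highest-priority context is never truncated.
        def build_context_window(elements, max_tokens=8):
            """`elements` is a list of (priority, text); lower value = higher priority."""
            window = []
            for _, text in sorted(elements, key=lambda e: e[0]):
                tokens = text.split()            # stand-in for a real tokenizer
                if len(window) + len(tokens) > max_tokens:
                    window.extend(tokens[:max_tokens - len(window)])
                    break
                window.extend(tokens)
            return window

        elements = [
            (0, "def add(a, b):"),       # focal method signature
            (1, "return a + b"),         # focal method body
            (2, "class Calculator:"),    # enclosing class context
            (3, "import math"),          # file-level context
        ]
        print(build_context_window(elements))   # lowest-priority context is cut first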
  • Publication number: 20230153226
    Abstract: A computer-implemented method includes accessing performance trace data for executed code of multiple services. Symbols corresponding to functions of the executed code are identified. First sequences of functions are identified from the identified symbols, and a first performance threshold is computed for each identified first sequence of functions. The method includes receiving an incoming performance trace, detecting second sequences of functions from the incoming performance trace, identifying second sequences equivalent to the first sequences, and comparing the performance of the identified second sequences to the first performance threshold of each equivalent first sequence to identify second sequences that constitute a performance bottleneck.
    Type: Application
    Filed: November 12, 2021
    Publication date: May 18, 2023
    Inventors: Spandan Garg, Roshanak Zilouchian Moghaddam, Paul Sean Harrington, Chen Wu, Neelakantan Sundaresan
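    A minimal Python sketch of the trace analysis in publication 20230153226, assuming sequences are fixed-length n-grams of function names and the per-sequence threshold is a 95th-percentile latency; both choices are illustrative, as are the traces:

        # Sketch: derive per-sequence latency thresholds from historical traces,
        # then flag sequences in an incoming trace that exceed their threshold.
        from collections import defaultdict

        def sequences(trace, n=2):
            """trace: list of (function_name, duration_ms); yield (names, total_ms)."""
            for i in range(len(trace) - n + 1):
                window = trace[i:i + n]
                yield tuple(f for f, _ in window), sum(d for _, d in window)

        def build_thresholds(traces, n=2, pct=0.95):
            samples = defaultdict(list)
            for trace in traces:
                for names, total in sequences(trace, n):
                    samples[names].append(total)
            return {names: sorted(vals)[int(pct * (len(vals) - 1))]
                    for names, vals in samples.items()}

        def find_bottlenecks(trace, thresholds, n=2):
            return [(names, total) for names, total in sequences(trace, n)
                    if names in thresholds and total > thresholds[names]]

        history = [[("parse", 5), ("render", 12)], [("parse", 6), ("render", 11)]]
        incoming = [("parse", 40), ("render", 90)]
        print(find_bottlenecks(incoming, build_thresholds(history)))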
  • Patent number: 11645576
    Abstract: A code completion system predicts candidates to complete a code fragment with a tag name and/or an attribute name in source code written in a hierarchically-structured language. Candidates for predicting a tag name are based on a first-order tag Markov chain model generated from usage patterns of relationships of tag names found in a training dataset. Candidates for predicting an attribute name are based on a second-order attribute Markov chain model generated from usage patterns of sequences of attribute names associated with each tag name found in the training dataset.
    Type: Grant
    Filed: April 22, 2019
    Date of Patent: May 9, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Shengyu Fu, Neelakantan Sundaresan, Ying Zhao
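    A minimal Python sketch of the first-order tag Markov chain in patent 11645576 (the second-order attribute chain would condition on the previous two attribute names in the same way); the training sequences are illustrative:

        # Sketch: a first-order Markov chain over tag names for completing the next
        # tag in a markup document, built from observed previous->next transitions.
        from collections import Counter, defaultdict

        class TagMarkovChain:
            def __init__(self):
                self.transitions = defaultdict(Counter)

            def train(self, tag_sequences):
                for seq in tag_sequences:
                    for prev_tag, next_tag in zip(seq, seq[1:]):
                        self.transitions[prev_tag][next_tag] += 1

            def candidates(self, prev_tag, k=3):
                """Top-k tag names most likely to follow `prev_tag`."""
                counts = self.transitions[prev_tag]
                total = sum(counts.values()) or 1
                return [(tag, n / total) for tag, n in counts.most_common(k)]

        model = TagMarkovChain()
        model.train([["html", "head", "title"],
                     ["html", "body", "div"],
                     ["html", "body", "span"]])
        print(model.candidates("body"))   # e.g. [('div', 0.5), ('span', 0.5)]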
  • Patent number: 11640294
    Abstract: Examples of the usage of a command of a command line interface include the command with a set of parameters and corresponding parameter values. The examples are generated from telemetry data, which does not contain parameter values, and from web-based sources that may contain multiple parameter values. A machine learning model is used to predict the data type of a parameter value when the parameter is used with a particular command. The predicted data type is then used to select an appropriate parameter value for the example from multiple known parameter values, or to generate a parameter value when no known parameter value exists.
    Type: Grant
    Filed: April 29, 2020
    Date of Patent: May 2, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Spandan Garg, Jason R. Shaver, Neelakantan Sundaresan, Roshanak Zilouchian Moghaddam
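    A minimal Python sketch of the value-selection step in patent 11640294; `predict_param_type` is a stub standing in for the trained model, and the commands, parameters, and placeholder values are hypothetical:

        # Sketch: pick a parameter value for a CLI usage example based on the
        # predicted data type of the parameter, or generate one if none is known.
        import re

        def predict_param_type(command, parameter):
            """Stub for the ML model that predicts a parameter's value data type."""
            return {"--port": "integer", "--name": "string"}.get(parameter, "string")

        PLACEHOLDERS = {"integer": "8080", "string": "MyResource"}

        def choose_value(command, parameter, known_values):
            ptype = predict_param_type(command, parameter)
            for value in known_values:
                if ptype == "integer" and re.fullmatch(r"\d+", value):
                    return value
                if ptype == "string" and not re.fullmatch(r"\d+", value):
                    return value
            return PLACEHOLDERS.get(ptype, "<value>")   # generate when nothing fits

        print(choose_value("service start", "--port", ["web01", "443"]))   # 443
        print(choose_value("service start", "--name", []))                 # MyResource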
  • Publication number: 20230128008
    Abstract: A test-driven development system utilizes a neural transformer model with attention to generate method bodies for a focal method given its associated test cases, and optionally a method signature and a docstring of the focal method. The candidate method bodies are validated for syntactic correctness, tested using the given test cases, and tested with a donor class in a target system. Those candidate method bodies passing the validation and testing are then ranked based on a PLUM score that analyzes the candidate method bodies against various quality and performance metrics.
    Type: Application
    Filed: October 22, 2021
    Publication date: April 27, 2023
    Inventors: Colin Bruce Clement, Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
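    A minimal Python sketch of the validate-test-rank pipeline in publication 20230128008; the candidate bodies, the single test case, and the length-based ranking (standing in for the PLUM score) are illustrative:

        # Sketch: keep candidate method bodies that parse and pass the given test,
        # then rank the survivors.
        import ast

        candidates = [
            "def is_even(n):\n    return n % 2 == 0",
            "def is_even(n):\n    return n % 2",          # wrong logic
            "def is_even(n)\n    return n % 2 == 0",      # syntax error
        ]

        def test_is_even(fn):
            return fn(4) is True and fn(3) is False

        def passes(candidate):
            try:
                ast.parse(candidate)                       # syntactic validation
            except SyntaxError:
                return False
            namespace = {}
            try:
                exec(candidate, namespace)                 # load the method
                return test_is_even(namespace["is_even"])  # run the given test case
            except Exception:
                return False

        surviving = [c for c in candidates if passes(c)]
        ranked = sorted(surviving, key=len)                # stand-in for PLUM ranking
        print(ranked[0])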
  • Publication number: 20230128200
    Abstract: The syntax elements of a source code program used to represent the context of a focal method are selected based on a priority order. The selected syntax elements are input into a fixed-size context window that is used to train a neural transformer model with attention to generate source code and is then used by the trained model when generating source code. The context window contains prioritized sequences of tokens that extend beyond the target focus in order to provide longer visibility back into the source code program for the model to learn predictive patterns. This gives the model a file-level context of the source code program without increasing the size of the context window.
    Type: Application
    Filed: October 22, 2021
    Publication date: April 27, 2023
    Inventors: Colin Bruce Clement, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
  • Publication number: 20230121782
    Abstract: In various example embodiments, a system and method for an electronic commerce file system are provided. In example embodiments, a selection of an item contained in a folder of an electronic commerce file system is received. The item is offered for sale by an electronic commerce provider, and the electronic commerce file system resides locally on a client device. Based on a type of the folder, a set of actions are provided for selection, with the set of actions to be performed with respect to the item. A selection of an action to be performed with respect to the item is received. The action is performed with respect to the item, with the action being performed between the electronic commerce file system and the electronic commerce provider via a network.
    Type: Application
    Filed: December 20, 2022
    Publication date: April 20, 2023
    Inventors: Sandra Lynn Godsey, Neelakantan Sundaresan
  • Publication number: 20230114423
    Abstract: An automated program repair tool utilizes a neural transformer model with attention to predict the contents of a bug repair in the context of source code having a bug of an identified bug type. The neural transformer model is trained on a large unsupervised corpus of source code using a span-masking denoising optimization objective, and fine-tuned on a large supervised dataset of triplets containing a bug-type annotation, software bug, and repair. The bug-type annotation is derived from an interprocedural static code analyzer. A bug type edit centroid is computed for each bug type and used in the inference decoding phase to generate the bug repair.
    Type: Application
    Filed: November 25, 2022
    Publication date: April 13, 2023
    Inventors: Shao Kun Deng, Neelakantan Sundaresan, Alexey Svyatkovskiy, Michele Tufano
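    A minimal Python sketch of the bug-type edit centroid computed in publication 20230114423, assuming each repair already has an edit embedding vector; the random vectors and bug-type labels are placeholders:

        # Sketch: the edit centroid for a bug type is the mean of the edit
        # embeddings of repairs annotated with that bug type.
        import numpy as np

        rng = np.random.default_rng(0)
        # (bug_type, edit_embedding) pairs gathered from the fine-tuning dataset.
        edits = [("NULL_DEREFERENCE", rng.normal(size=8)) for _ in range(5)] + \
                [("RESOURCE_LEAK", rng.normal(size=8)) for _ in range(5)]

        def edit_centroids(edit_pairs):
            by_type = {}
            for bug_type, vec in edit_pairs:
                by_type.setdefault(bug_type, []).append(vec)
            return {bug_type: np.mean(np.stack(vecs), axis=0)
                    for bug_type, vecs in by_type.items()}

        centroids = edit_centroids(edits)
        # At inference time, the centroid of the detected bug type conditions decoding.
        print(centroids["NULL_DEREFERENCE"].shape)   # (8,)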
  • Patent number: 11620666
    Abstract: Systems and methods for on-demand local commerce are described. One example embodiment includes a device gathering location information and product interest associated with clients and client devices. The system may use the location information to determine that a first plurality of client devices is within a first geographic area during a first time period, and may further use the interest information to calculate an interest level for a first product. A threshold may be identified and used to determine that the interest level for the first product exceeds the threshold. When the calculated interest level exceeds the threshold, a local commerce action is initiated. In various embodiments, the local commerce action may be a live on-demand auction at a particular location, an offer associated with a geofenced area, a sales location recommendation to a merchant, or any other such local commerce action.
    Type: Grant
    Filed: May 24, 2021
    Date of Patent: April 4, 2023
    Assignee: EBAY INC.
    Inventor: Neelakantan Sundaresan
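    A minimal Python sketch of the interest-threshold trigger in patent 11620666, assuming a circular geofence and a flat count of interest signals; the coordinates, threshold, and action are illustrative:

        # Sketch: count product-interest signals from devices inside a geographic
        # area during a time window and trigger an action past a threshold.
        import math

        def in_area(lat, lon, center, radius_km):
            # Rough equirectangular distance; adequate for small geofences.
            dlat = math.radians(lat - center[0])
            dlon = math.radians(lon - center[1]) * math.cos(math.radians(center[0]))
            return 6371 * math.hypot(dlat, dlon) <= radius_km

        def interest_level(signals, product, center, radius_km, start, end):
            return sum(1 for s in signals
                       if s["product"] == product
                       and start <= s["time"] <= end
                       and in_area(s["lat"], s["lon"], center, radius_km))

        signals = [
            {"product": "sneakers", "lat": 37.78, "lon": -122.41, "time": 10},
            {"product": "sneakers", "lat": 37.79, "lon": -122.40, "time": 11},
        ]
        THRESHOLD = 1
        if interest_level(signals, "sneakers", (37.78, -122.41), 5.0, 9, 12) > THRESHOLD:
            print("initiate local commerce action, e.g. schedule an on-demand auction")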
  • Patent number: 11604719
    Abstract: An automated program repair system uses a neural transformer model with attention to predict a bug-free version of a method having a source code bug identified in an associated stack trace. The neural transformer model is pre-trained with English language text and the source code of a target programming language. The pre-trained neural transformer model is trained to create synthetic bugs in bug-free methods. The bug-free methods with the synthetic bugs are executed with a test case to obtain a stack trace of the source code bug. The method with the synthetic bug, the method without the bug, and the stack trace are used to train the neural transformer model to predict repairs for buggy methods.
    Type: Grant
    Filed: March 25, 2021
    Date of Patent: March 14, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Colin Bruce Clement, Dawn Drain, Guillermo Serrato Castilla, Neelakantan Sundaresan
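    A minimal Python sketch of the synthetic-bug training data described in patent 11604719, assuming a single toy mutation rule (flipping "==" to "!=") and an in-process test; a real pipeline would create the bug with the neural model and run full test suites:

        # Sketch: inject a synthetic bug into a bug-free method, run a test case,
        # and capture the stack trace to form a (buggy, fixed, trace) triple.
        import traceback

        fixed = "def is_empty(items):\n    return len(items) == 0"
        buggy = fixed.replace("==", "!=")          # synthetic bug

        def run_test(method_source):
            namespace = {}
            exec(method_source, namespace)
            assert namespace["is_empty"]([]) is True, "empty list should be empty"

        try:
            run_test(buggy)
            stack_trace = None
        except AssertionError:
            stack_trace = traceback.format_exc()

        training_example = {"buggy": buggy, "fixed": fixed, "stack_trace": stack_trace}
        print(training_example["stack_trace"].splitlines()[-1])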
  • Publication number: 20230073633
    Abstract: A system, computer-readable storage medium storing at least one program, and computer-implemented method for providing recommendations based on social network sharing activity are provided. Sharing activity relating to the sharing of a content item on a social network by a first user is accessed. Consumption information related to the consumption of the content item is also accessed. A correlation between the sharing activity and the consumption information is determined. A recommendation is then generated based on the correlation.
    Type: Application
    Filed: September 12, 2022
    Publication date: March 9, 2023
    Inventors: Neelakantan Sundaresan, Atish Das Sarma, Si Si, Elizabeth Churchill
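    A minimal Python sketch of the sharing/consumption correlation in publication 20230073633, assuming per-period share and consumption counts and a Pearson-correlation cutoff; the data and the 0.5 cutoff are illustrative (statistics.correlation requires Python 3.10+):

        # Sketch: recommend content items whose sharing activity correlates
        # strongly with how often they are consumed.
        from statistics import correlation

        shares      = {"video_a": [3, 5, 2, 8], "video_b": [1, 1, 2, 1]}
        consumption = {"video_a": [30, 52, 19, 75], "video_b": [40, 12, 9, 33]}

        def recommendable(item, min_corr=0.5):
            return correlation(shares[item], consumption[item]) >= min_corr

        print([item for item in shares if recommendable(item)])   # likely ['video_a']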
  • Patent number: 11599937
    Abstract: Techniques for generating a digital wardrobe are presented herein. A transceiver can be configured to receive a request having a garment identifier and a user identifier. Additionally, an access module can be configured to access a first garment model, access a body model of the user corresponding to the user identifier, and access a second garment model corresponding to the user identifier. Furthermore, a processor can be configured by a garment simulation module to position the body model inside the first garment model and the second garment model, and calculate simulated forces based on the positioning. Moreover, a rendering module can be configured to generate an image of the garment models draped on the body model based on the calculated simulated forces. Subsequently, a display module can be configured to cause presentation of the generated image on a display of a device.
    Type: Grant
    Filed: March 19, 2021
    Date of Patent: March 7, 2023
    Assignee: EBAY INC.
    Inventors: Jonathan Su, Jatin Chhugani, Mihir Naware, Neelakantan Sundaresan
  • Patent number: 11599447
    Abstract: Runtime errors in a source code program are detected in advance of execution by machine learning models. Features representing the context of a runtime error are extracted from source code programs to train a machine learning model, such as a random forest classifier, to predict the likelihood that a code snippet has a particular type of runtime error. The features are extracted from a syntax-type tree representation of each method in a program. A model is generated for each distinct runtime error, such as arithmetic overflow and conditionally uninitialized variables.
    Type: Grant
    Filed: July 4, 2022
    Date of Patent: March 7, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Shaun Miller, Kalpathy Sitaraman Sivaraman, Neelakantan Sundaresan, Yijin Wei, Roshanak Zilouchian Moghaddam
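    A minimal Python sketch of the classifier in patent 11599447, using scikit-learn's RandomForestClassifier; the two-element feature vectors and labels are synthetic stand-ins for the syntax-tree features extracted from real methods:

        # Sketch: train a random forest to estimate the likelihood that a code
        # snippet has a particular runtime error (here, arithmetic overflow).
        from sklearn.ensemble import RandomForestClassifier

        # Each row: [count of unchecked arithmetic ops, uses a checked/wide type]
        X = [[0, 1], [1, 1], [5, 0], [7, 0], [2, 1], [6, 0]]
        y = [0, 0, 1, 1, 0, 1]          # 1 = arithmetic overflow observed

        model = RandomForestClassifier(n_estimators=50, random_state=0)
        model.fit(X, y)

        snippet_features = [[4, 0]]
        print(model.predict_proba(snippet_features)[0][1])   # P(overflow) for the snippet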
  • Patent number: 11599345
    Abstract: Language interoperability between source code programs not compatible with an interprocedural static code analyzer is achieved through language-independent representations of the programs. The source code programs are transformed into respective intermediate language instructions from which a language-independent control flow graph and a language-independent type environment are created. A program compatible with the interprocedural static code analyzer is generated from the language-independent control flow graph and the language-independent type environment in order to utilize the interprocedural static code analyzer to detect memory safety faults.
    Type: Grant
    Filed: November 4, 2021
    Date of Patent: March 7, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Shao Kun Deng, Matthew Glenn Jin, Shuvendu Lahiri, Xiaoyu Liu, Xin Shi, Neelakantan Sundaresan
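    A minimal Python sketch of the language-independent representation in patent 11599345, assuming a toy instruction format; a real system would first lower each source language to a shared intermediate language before building the control flow graph and type environment:

        # Sketch: one control-flow-graph plus type-environment shape, regardless
        # of which source language the instructions were lowered from.
        from dataclasses import dataclass, field

        @dataclass
        class ControlFlowGraph:
            nodes: list = field(default_factory=list)     # instruction labels
            edges: list = field(default_factory=list)     # (from_label, to_label)

        @dataclass
        class TypeEnvironment:
            types: dict = field(default_factory=dict)     # variable -> type name

        def build_ir(instructions):
            """instructions: (label, op, operands, successor_labels, var_types)."""
            cfg, env = ControlFlowGraph(), TypeEnvironment()
            for label, _op, _operands, successors, var_types in instructions:
                cfg.nodes.append(label)
                cfg.edges.extend((label, s) for s in successors)
                env.types.update(var_types)
            return cfg, env

        instructions = [
            ("B0", "assign", ["p", "null"], ["B1"], {"p": "Pointer"}),
            ("B1", "load",   ["x", "p"],    [],     {"x": "Int"}),
        ]
        cfg, env = build_ir(instructions)
        print(cfg.edges, env.types)   # a later analysis pass could flag the null load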
  • Publication number: 20230067364
    Abstract: A transfer learning system is used for the development of neural transformer models pertaining to software engineering tasks. The transfer learning system trains source code domain neural transformer models with attention, in various configurations, on a large unsupervised training dataset of source code programs and/or source code-related natural language text. A web service provides the trained models for use in developing a model that may be fine-tuned on a supervised training dataset associated with a software engineering task, thereby generating a tool to perform the software engineering task.
    Type: Application
    Filed: November 6, 2022
    Publication date: March 2, 2023
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Alexey Svyatkovskiy
  • Patent number: 11586839
    Abstract: A code completion tool uses machine learning models to more precisely predict the likelihood of the parameters of a method invocation. A score is computed for each candidate variable that is used to rank the viability of a variable as the intended parameter. The score is a weighted sum of a scope factor, an edit distance factor and a declaration proximity factor. The factors are based on a scope model, a method overload model, and a weight file trained offline on a training set of source code programs utilizing various method invocations.
    Type: Grant
    Filed: December 3, 2018
    Date of Patent: February 21, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Shengyu Fu, David Poeschl, Neelakantan Sundaresan, Shuo Zhang, Ying Zhao
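    A minimal Python sketch of the weighted candidate scoring in patent 11586839; the weights and per-factor formulas here are illustrative, whereas the patent derives them from a scope model, a method overload model, and an offline-trained weight file:

        # Sketch: rank candidate variables for a call argument by a weighted sum
        # of scope, edit-distance, and declaration-proximity factors.
        import difflib

        def edit_similarity(a, b):
            return difflib.SequenceMatcher(None, a, b).ratio()

        def score(candidate, parameter_name, call_line, weights=(0.4, 0.4, 0.2)):
            w_scope, w_edit, w_prox = weights
            scope_factor = 1.0 if candidate["is_local"] else 0.5
            edit_factor = edit_similarity(candidate["name"], parameter_name)
            proximity_factor = 1.0 / max(call_line - candidate["declared_line"], 1)
            return w_scope * scope_factor + w_edit * edit_factor + w_prox * proximity_factor

        candidates = [
            {"name": "file_path", "is_local": True,  "declared_line": 40},
            {"name": "timeout",   "is_local": False, "declared_line": 3},
        ]
        best = max(candidates, key=lambda c: score(c, "path", call_line=42))
        print(best["name"])   # file_path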
  • Publication number: 20230048186
    Abstract: A neural transformer model with attention is trained to predict candidates to complete a line of source code with a zero-inference capability. The model is trained on an unsupervised training dataset that includes features from source code written in multiple programming languages. The features include a file-level context and a local context, where the file-level context includes a global context, a class context, a function context, and/or a method context for each class, function and/or method of the source code programs used in the training dataset. The local context includes method bodies, function bodies, and/or stand-alone code of main method routines. From these features, the model is able to learn to predict an ordered sequence of code elements that complete a line of source code in a programming language seen and not seen during training.
    Type: Application
    Filed: November 1, 2022
    Publication date: February 16, 2023
    Inventors: Colin Bruce Clement, Shuai Lu, Neelakantan Sundaresan, Alexey Svyatkovskiy, Duyu Tang
  • Publication number: 20230042051
    Abstract: A distillation system distills knowledge from a large pre-trained sequence-to-sequence neural transformer model into a smaller bi-encoder. The pre-trained sequence-to-sequence neural transformer model is trained to translate data from a first domain into a second domain on a large corpus. A teacher model is generated from the pre-trained model by fine-tuning the pre-trained neural transformer model on a smaller translation task with true translation pairs. The fine-tuned model is then used to generate augmented data values, which are used with the true translation pairs to train the bi-encoder. The bi-encoder is used to perform cross-domain searches.
    Type: Application
    Filed: July 22, 2021
    Publication date: February 9, 2023
    Inventors: Colin Bruce Clement, Dawn Drain, Neelakantan Sundaresan, Chen Wu
  • Patent number: 11561810
    Abstract: An automated command assistance tool is provided for a browser-enabled command line interface of a cloud service. The automated command assistance tool provides examples illustrating the correct syntax for commands used to manage the resources of a cloud service. The command assistance tool learns the syntax of a command from usage patterns found in telemetric data, scripts and user documentation and forms templates containing a command's usage pattern and related information. The templates are used to generate examples that respond to a user query for assistance with usage of a particular command.
    Type: Grant
    Filed: January 11, 2019
    Date of Patent: January 24, 2023
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC.
    Inventors: Roshanak Zilouchian Moghaddam, Neelakantan Sundaresan, Jason Shaver
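    A minimal Python sketch of the template-and-example generation in patent 11561810, assuming the command is the leading tokens of each invocation and that frequently co-occurring parameters define its usage pattern; the commands, parameters, and placeholder values are hypothetical and not taken from any real CLI:

        # Sketch: learn a usage template for a command from observed invocations
        # and answer a help query with a generated example.
        from collections import Counter

        observed = [
            "storage account create --name acct1 --resource-group rg1",
            "storage account create --name acct2 --resource-group rg2 --sku Standard",
            "storage account create --name acct3 --resource-group rg3",
        ]

        def build_template(invocations, min_freq=0.5):
            """Keep parameters appearing in at least `min_freq` of the invocations."""
            command = " ".join(invocations[0].split()[:3])
            counts = Counter(tok for line in invocations for tok in line.split()
                             if tok.startswith("--"))
            params = [p for p, c in counts.items() if c / len(invocations) >= min_freq]
            return {"command": command, "parameters": params}

        def example_for(query, template, placeholders):
            if template["command"] not in query:
                return None
            args = " ".join(f"{p} {placeholders.get(p, '<value>')}"
                            for p in template["parameters"])
            return f"{template['command']} {args}"

        template = build_template(observed)
        print(example_for("how do I use storage account create?", template,
                          {"--name": "myaccount", "--resource-group": "mygroup"}))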