Patents by Inventor Hiren Shantilal PATEL

Hiren Shantilal PATEL has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Illustrative code sketches of several of the listed inventions follow the listing.

  • Patent number: 11934874
    Abstract: A serverless query processing system receives a query and determines whether the query is a recurring query or a non-recurring query. The system may predict, in response to determining that the query is the recurring query, a peak resource requirement during an execution of the query. The system may compute, in response to determining that the query is the non-recurring query, a tight resource requirement corresponding to an amount of resources that satisfy a performance requirement over the execution of the query, where the tight resource requirement is less than the peak resource requirement. The system allocates resources to the query based on an applicable one of the peak resource requirement or the tight resource requirement. The system then starts the execution of the query using the resources.
    Type: Grant
    Filed: August 24, 2022
    Date of Patent: March 19, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hiren Shantilal Patel, Shi Qiao, Alekh Jindal, Malay Kumar Bag, Rathijit Sen, Carlo Aldo Curino
  • Publication number: 20230342359
    Abstract: Methods of machine learning for system deployments without performance regressions are performed by systems and devices. A performance safeguard system is used to design pre-production experiments for determining the production readiness of learned models based on a pre-production budget, leveraging big data processing infrastructure and deploying a large set of learned or optimized models for its query optimizer. A pipeline for learning and training differentiates the impact of query plans with and without the learned or optimized models, selects the plan differences that are likely to lead to the most dramatic performance differences, runs a constrained set of pre-production experiments to empirically observe the runtime performance, and finally picks the models that are expected to lead to consistently improved performance for deployment. The performance safeguard system enables safe deployment not just of learned or optimized models but also of other ML-for-Systems features.
    Type: Application
    Filed: June 30, 2023
    Publication date: October 26, 2023
    Inventors: Irene Rogan SHAFFER, Remmelt Herbert Lieve AMMERLAAN, Gilbert ANTONIUS, Marc T. FRIEDMAN, Abhishek ROY, Lucas ROSENBLATT, Vijay Kumar RAMANI, Shi QIAO, Alekh JINDAL, Peter ORENBERG, H M Sajjad HOSSAIN, Soundararajan SRINIVASAN, Hiren Shantilal PATEL, Markus WEIMER
  • Patent number: 11748350
    Abstract: Methods of machine learning for system deployments without performance regressions are performed by systems and devices. A performance safeguard system is used to design pre-production experiments for determining the production readiness of learned models based on a pre-production budget, leveraging big data processing infrastructure and deploying a large set of learned or optimized models for its query optimizer. A pipeline for learning and training differentiates the impact of query plans with and without the learned or optimized models, selects the plan differences that are likely to lead to the most dramatic performance differences, runs a constrained set of pre-production experiments to empirically observe the runtime performance, and finally picks the models that are expected to lead to consistently improved performance for deployment. The performance safeguard system enables safe deployment not just of learned or optimized models but also of other ML-for-Systems features.
    Type: Grant
    Filed: April 3, 2020
    Date of Patent: September 5, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Irene Rogan Shaffer, Remmelt Herbert Lieve Ammerlaan, Gilbert Antonius, Marc T. Friedman, Abhishek Roy, Lucas Rosenblatt, Vijay Kumar Ramani, Shi Qiao, Alekh Jindal, Peter Orenberg, H M Sajjad Hossain, Soundararajan Srinivasan, Hiren Shantilal Patel, Markus Weimer
  • Publication number: 20220413914
    Abstract: A serverless query processing system receives a query and determines whether the query is a recurring query or a non-recurring query. The system may predict, in response to determining that the query is the recurring query, a peak resource requirement during an execution of the query. The system may compute, in response to determining that the query is the non-recurring query, a tight resource requirement corresponding to an amount of resources that satisfy a performance requirement over the execution of the query, where the tight resource requirement is less than the peak resource requirement. The system allocates resources to the query based on an applicable one of the peak resource requirement or the tight resource requirement. The system then starts the execution of the query using the resources.
    Type: Application
    Filed: August 24, 2022
    Publication date: December 29, 2022
    Inventors: Hiren Shantilal PATEL, Shi QIAO, Alekh JINDAL, Malay Kumar BAG, Rathijit SEN, Carlo Aldo CURINO
  • Patent number: 11455192
    Abstract: A serverless query processing system receives a query and determines whether the query is a recurring query or a non-recurring query. The system may predict, in response to determining that the query is the recurring query, a peak resource requirement during an execution of the query. The system may compute, in response to determining that the query is the non-recurring query, a tight resource requirement corresponding to an amount of resources that satisfy a performance requirement over the execution of the query, where the tight resource requirement is less than the peak resource requirement. The system allocates resources to the query based on an applicable one of the peak resource requirement or the tight resource requirement. The system then starts the execution of the query using the resources.
    Type: Grant
    Filed: November 27, 2019
    Date of Patent: September 27, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hiren Shantilal Patel, Shi Qiao, Alekh Jindal, Malay Kumar Bag, Rathijit Sen, Carlo Aldo Curino
  • Patent number: 11416487
    Abstract: Techniques are described herein that are capable of selecting checkpoints of a database job. For instance, at compile time, temporal indicators associated with the query plans of the database job are determined. Each temporal indicator indicates first and second subsets of stages of the respective query plan. Values of attributes of each stage in at least each first subset are predicted using a machine learning technique. At the compile time, candidate stage(s) for each query plan are identified based on the respective candidate stage being a child of stage(s) in the corresponding second subset or not being a child of another stage in the respective query plan. The candidate stage(s) for each query plan are selectively chosen as respective checkpoint(s) based on whether the values of the attributes of each stage in at least the first subset of the stages of the respective query plan satisfy one or more criteria.
    Type: Grant
    Filed: September 22, 2020
    Date of Patent: August 16, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yiwen Zhu, Alekh Jindal, Malay Kumar Bag, Hiren Shantilal Patel
  • Publication number: 20220100763
    Abstract: Solutions for optimizing job runtimes via prediction-based token allocation includes receiving training data comprising historical run data, the historical run data comprising job characteristics, runtime results, and a token count for each of a plurality of prior jobs, and the job characteristics comprising an intermediate representation and job graph data; based at least on the training data, training a token estimator, the token estimator comprising a machine learning (ML) model; receiving job characteristics for a user-submitted job; based at least on the received job characteristics, generating, with the token estimator, token prediction data for the user-submitted job; selecting a token count for the user-submitted job, based at least on the token prediction data; identifying the selected token count to an execution environment; and executing, with the execution environment, the user-submitted job in accordance with the selected token count.
    Type: Application
    Filed: September 30, 2020
    Publication date: March 31, 2022
    Inventors: Rathijit SEN, Alekh JINDAL, Anish Yatin PIMPLEY, Shuo LI, Anubha SRIVASTAVA, Vishal Lalchand ROHRA, Yi ZHU, Hiren Shantilal PATEL, Shi QIAO, Marc Todd FRIEDMAN, Clemens Alden SZYPERSKI
  • Publication number: 20220092067
    Abstract: Techniques are described herein that are capable of selecting checkpoints of a database job. For instance, at compile time, temporal indicators associated with the query plans of the database job are determined. Each temporal indicator indicates first and second subsets of stages of the respective query plan. Values of attributes of each stage in at least each first subset are predicted using a machine learning technique. At the compile time, candidate stage(s) for each query plan are identified based on the respective candidate stage being a child of stage(s) in the corresponding second subset or not being a child of another stage in the respective query plan. The candidate stage(s) for each query plan are selectively chosen as respective checkpoint(s) based on whether the values of the attributes of each stage in at least the first subset of the stages of the respective query plan satisfy one or more criteria.
    Type: Application
    Filed: September 22, 2020
    Publication date: March 24, 2022
    Inventors: Yiwen Zhu, Alekh Jindal, Malay Kumar Bag, Hiren Shantilal Patel
  • Publication number: 20210263932
    Abstract: Methods of machine learning for system deployments without performance regressions are performed by systems and devices. A performance safeguard system is used to design pre-production experiments for determining the production readiness of learned models based on a pre-production budget, leveraging big data processing infrastructure and deploying a large set of learned or optimized models for its query optimizer. A pipeline for learning and training differentiates the impact of query plans with and without the learned or optimized models, selects the plan differences that are likely to lead to the most dramatic performance differences, runs a constrained set of pre-production experiments to empirically observe the runtime performance, and finally picks the models that are expected to lead to consistently improved performance for deployment. The performance safeguard system enables safe deployment not just of learned or optimized models but also of other ML-for-Systems features.
    Type: Application
    Filed: April 3, 2020
    Publication date: August 26, 2021
    Inventors: Irene Rogan Shaffer, Remmelt Herbert Lieve Ammerlaan, Gilbert Antonius, Marc T. Friedman, Abhishek Roy, Lucas Rosenblatt, Vijay Kumar Ramani, Shi Qiao, Alekh Jindal, Peter Orenberg, H M Sajjad Hossain, Soundararajan Srinivasan, Hiren Shantilal Patel, Markus Weimer
  • Publication number: 20210096915
    Abstract: A serverless query processing system receives a query and determines whether the query is a recurring query or a non-recurring query. The system may predict, in response to determining that the query is the recurring query, a peak resource requirement during an execution of the query. The system may compute, in response to determining that the query is the non-recurring query, a tight resource requirement corresponding to an amount of resources that satisfy a performance requirement over the execution of the query, where the tight resource requirement is less than the peak resource requirement. The system allocates resources to the query based on an applicable one of the peak resource requirement or the tight resource requirement. The system then starts the execution of the query using the resources.
    Type: Application
    Filed: November 27, 2019
    Publication date: April 1, 2021
    Inventors: Hiren Shantilal PATEL, Shi QIAO, Alekh JINDAL, Malay Kumar BAG, Rathijit SEN, Carlo Aldo CURINO
  • Patent number: 10726014
    Abstract: Described herein is a system and method for selecting subexpressions to be materialized. For a predefined storage budget, subexpressions from a set of candidate subexpressions are selected for materialization to minimize query evaluation cost, based upon a calculated utility of the candidate subexpressions, interactions among the candidate subexpressions, and a cost of evaluating the candidate subexpressions. Based upon the subexpressions selected for materialization, the subexpression(s) of the candidate set to use when evaluating particular queries of a given set of queries are determined, again minimizing query evaluation cost.
    Type: Grant
    Filed: January 30, 2018
    Date of Patent: July 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alekh Jindal, Konstantinos Karanasos, Hiren Shantilal Patel, Sriram S Rao
  • Publication number: 20190236189
    Abstract: Described herein is a system and method for selecting subexpressions to be materialized. For a predefined storage budget, subexpressions from a set of candidate subexpressions are selected for materialization to minimize query evaluation cost, based upon a calculated utility of the candidate subexpressions, interactions among the candidate subexpressions, and a cost of evaluating the candidate subexpressions. Based upon the subexpressions selected for materialization, the subexpression(s) of the candidate set to use when evaluating particular queries of a given set of queries are determined, again minimizing query evaluation cost.
    Type: Application
    Filed: January 30, 2018
    Publication date: August 1, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alekh JINDAL, Konstantinos KARANASOS, Hiren Shantilal PATEL, Sriram S RAO
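
A minimal Python sketch of the control flow described in patent 11934874 and its related filings above: classify a query as recurring or non-recurring, predict a peak resource requirement for recurring queries, compute a smaller "tight" requirement for non-recurring ones, and allocate resources before starting execution. This is not the patented implementation; the signature-based recurrence check, the max-over-history peak predictor, and the cost-based tight estimate are illustrative assumptions.

```python
# Illustrative sketch only; names and heuristics are assumptions, not the patent's method.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Query:
    signature: str         # normalized plan/template hash used to detect recurrence
    estimated_cost: float  # optimizer cost estimate, in arbitrary units


@dataclass
class Allocator:
    # signature -> resource allocations observed on prior runs
    history: Dict[str, List[float]] = field(default_factory=dict)

    def is_recurring(self, q: Query) -> bool:
        return bool(self.history.get(q.signature))

    def predict_peak(self, q: Query) -> float:
        # Recurring query: predict the peak requirement from prior runs (here, the max observed).
        return max(self.history[q.signature])

    def compute_tight(self, q: Query, slack: float = 1.1) -> float:
        # Non-recurring query: derive a smaller, performance-preserving allocation from the
        # cost estimate; the real system would reason over the query plan itself.
        return q.estimated_cost * slack

    def allocate_and_run(self, q: Query) -> float:
        resources = self.predict_peak(q) if self.is_recurring(q) else self.compute_tight(q)
        # ... start execution of the query with `resources`, then record the run ...
        self.history.setdefault(q.signature, []).append(resources)
        return resources


alloc = Allocator()
print(alloc.allocate_and_run(Query("daily_report_v3", estimated_cost=80.0)))  # tight (first run)
print(alloc.allocate_and_run(Query("daily_report_v3", estimated_cost=80.0)))  # peak (recurring)
```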
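
A rough sketch of the performance-safeguard flow summarized in publication 20230342359 and patent 11748350: score (model, query) pairs by expected plan difference, spend a fixed pre-production experiment budget on the highest-impact pairs, and keep only models whose experiments consistently improved runtime. The scoring function, the experiment hook, and the "improves every experiment" rule are stand-ins, not the filing's method.

```python
# Hypothetical safeguard loop; the two callables are assumed hooks into the real pipeline.
from typing import Callable, Dict, List, Tuple


def safeguard(
    candidate_models: List[str],
    plan_diff_score: Callable[[str, str], float],               # (model, query) -> expected plan impact
    run_experiment: Callable[[str, str], Tuple[float, float]],  # (model, query) -> (baseline_s, with_model_s)
    workload: List[str],
    budget: int,
) -> List[str]:
    # 1. Score (model, query) pairs by how different the plans are expected to be.
    scored = sorted(
        ((plan_diff_score(m, q), m, q) for m in candidate_models for q in workload),
        reverse=True,
    )
    # 2. Spend the pre-production budget on the most impactful differences.
    results: Dict[str, List[float]] = {m: [] for m in candidate_models}
    for _, model, query in scored[:budget]:
        baseline, with_model = run_experiment(model, query)
        results[model].append(baseline - with_model)  # positive = speed-up
    # 3. Keep models that improved in every experiment they appeared in.
    return [m for m, deltas in results.items() if deltas and all(d > 0 for d in deltas)]
```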
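
One possible reading, in code, of the checkpoint-selection abstract of patent 11416487 and publication 20220092067: a temporal indicator splits the stage graph into two subsets, per-stage attributes are predicted by a model, and candidate stages (children of second-subset stages, or root stages) are chosen as checkpoints when the predicted attributes meet a criterion. The data model and the output-size criterion below are assumptions for illustration, not the claimed method.

```python
# Sketch under assumptions: stages carry estimated start times; a model predicts output size.
from dataclasses import dataclass
from typing import Callable, Dict, List, Set


@dataclass
class Stage:
    name: str
    parents: List[str]     # upstream stages this stage consumes
    est_start_time: float  # estimated start offset within the job, in seconds


def select_checkpoints(
    stages: Dict[str, Stage],
    cutoff: float,                                # temporal indicator splitting the plan
    predict_output_mb: Callable[[Stage], float],  # stand-in for the ML attribute predictor
    max_output_mb: float = 1024.0,
) -> List[str]:
    first: Set[str] = {s for s, st in stages.items() if st.est_start_time < cutoff}
    second: Set[str] = set(stages) - first
    checkpoints: List[str] = []
    for name, st in stages.items():
        # Candidate: a child of a stage in the second subset, or a stage with no parent.
        is_candidate = any(p in second for p in st.parents) or not st.parents
        # Chosen: the predicted attribute (here, output size) satisfies the criterion.
        if is_candidate and predict_output_mb(st) <= max_output_mb:
            checkpoints.append(name)
    return checkpoints
```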
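
A small sketch of the prediction-based token allocation in publication 20220100763: train an estimator on historical job characteristics and token counts, predict a token count for a user-submitted job, and hand the selected count to the execution environment. The feature set, the gradient-boosted model, and the submit_job call are illustrative assumptions rather than the filing's design.

```python
# Illustrative token estimator; features and data are made up for the example.
from sklearn.ensemble import GradientBoostingRegressor

# Historical runs: [num_operators, input_gb, plan_depth] -> token count that met the target runtime.
X_train = [[12, 50, 4], [80, 900, 9], [35, 200, 6], [150, 2000, 12]]
y_train = [40, 400, 120, 800]

token_estimator = GradientBoostingRegressor().fit(X_train, y_train)


def select_token_count(job_features, floor: int = 10) -> int:
    """Predict tokens for a user-submitted job and round up to a safe minimum."""
    predicted = token_estimator.predict([job_features])[0]
    return max(floor, int(round(predicted)))


# The selected count would then be identified to the execution environment, e.g.:
# submit_job(job, tokens=select_token_count([60, 350, 7]))  # submit_job is hypothetical
print(select_token_count([60, 350, 7]))
```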
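
A toy, greedy stand-in (not the method claimed in patent 10726014 and publication 20190236189) for choosing which subexpressions to materialize under a storage budget and then deciding which materialized subexpressions each query should use. Unlike the filing, this sketch ignores interactions among candidate subexpressions and uses a simple savings-per-gigabyte ranking.

```python
# Greedy illustration only; the dataclass fields and ranking heuristic are assumptions.
from dataclasses import dataclass
from typing import Dict, List, Set


@dataclass
class Subexpression:
    name: str
    storage_gb: float
    build_cost: float                    # cost of evaluating/materializing it once
    savings_per_query: Dict[str, float]  # query id -> evaluation cost saved if reused


def choose_materializations(cands: List[Subexpression], budget_gb: float) -> List[Subexpression]:
    # Rank by net utility (total savings minus build cost) per gigabyte of storage.
    ranked = sorted(
        cands,
        key=lambda s: (sum(s.savings_per_query.values()) - s.build_cost) / s.storage_gb,
        reverse=True,
    )
    chosen, used = [], 0.0
    for s in ranked:
        if used + s.storage_gb <= budget_gb:
            chosen.append(s)
            used += s.storage_gb
    return chosen


def views_for_query(query_id: str, materialized: List[Subexpression]) -> Set[str]:
    # A query reuses every chosen subexpression that actually saves it work.
    return {s.name for s in materialized if s.savings_per_query.get(query_id, 0.0) > 0.0}
```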