Patents by Inventor Scott Louis Brokaw

Scott Louis Brokaw has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Short illustrative code sketches of the four distinct inventions follow the listing.

  • Patent number: 11755926
    Abstract: A method, computer system, and computer program product for data pipeline prioritization are provided. Embodiments may include receiving, by a cognitive rules engine, one or more data pipelines; analyzing the one or more data pipelines using a computational method of the cognitive rules engine; and prioritizing the one or more data pipelines based on a result of that computational method.
    Type: Grant
    Filed: February 28, 2019
    Date of Patent: September 12, 2023
    Assignee: International Business Machines Corporation
    Inventors: Ritesh Kumar Gupta, Namit Kabra, Likhitha Maddirala, Eric Allen Jacobson, Scott Louis Brokaw, Jo Ramos
  • Patent number: 11288601
    Abstract: A self-learning computer-based system has access to multiple runtime modules that are each capable of performing a particular algorithm. Each runtime module implements the algorithm with different code or runs in a different runtime environment. The system responds to a request to run the algorithm by selecting the runtime module or runtime environment that the system predicts will provide the most desirable results based on parameters like accuracy, performance, cost, resource-efficiency, or policy compliance. The system learns how to make such predictions through training sessions conducted by a machine-learning component. This training teaches the system that previous module selections produced certain types of results in the presence of certain conditions. After determining whether similar conditions currently exist, the system uses rules inferred from the training sessions to select the runtime module most likely to produce desired results.
    Type: Grant
    Filed: March 21, 2019
    Date of Patent: March 29, 2022
    Assignee: International Business Machines Corporation
    Inventors: Ritesh Kumar Gupta, Namit Kabra, Eric Allen Jacobson, Scott Louis Brokaw, Jo Arao Ramos
  • Patent number: 11194629
    Abstract: A method includes: receiving, by a computer device, a resource request for a data integration job, wherein the resource request is received from a job executor module and defines processes of the data integration job; allocating, by the computer device, containers for the processes of the data integration job; launching, by the computer device, a respective wrapper script on each of the containers after that container is allocated; and transmitting, by the computer device and in response to the allocating, node details to the job executor module. In embodiments, the wrapper script running on each container is configured to repeatedly check a predefined location for process commands from a job executor. After the resource manager allocates all the containers for a data integration job according to a resource request, the job executor writes the process commands to the predefined location.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: December 7, 2021
    Assignee: International Business Machines Corporation
    Inventors: Krishna Kishore Bonagiri, Eric Allen Jacobson, Ritesh Kumar Gupta, Indrani Ghatare, Scott Louis Brokaw
  • Patent number: 11150956
    Abstract: A set of resources required to process a data integration job is determined. In response to determining that the set of resources is not available, queue occupation, for each queue in the computing environment, is predicted. Queue occupation is a workload of queue resources for a future time based on a previous workload. A best queue is selected based on the predicted queue occupation. The best queue is the queue or queues in the computing environment available to be assigned to process the data integration job without preemption. The data integration job is processed using the best queue. It is determined whether a preemption event occurred causing the removal of resources from the best queue. A checkpoint is created in response to determining that a preemption event occurred. The checkpoint indicates the last successful operation completed and provides a point where processing can resume when resources become available.
    Type: Grant
    Filed: May 21, 2019
    Date of Patent: October 19, 2021
    Assignee: International Business Machines Corporation
    Inventors: Krishna Kishore Bonagiri, Eric A. Jacobson, Ritesh Kumar Gupta, Scott Louis Brokaw
  • Publication number: 20200371839
    Abstract: A set of resources required to process a data integration job is determined. In response to determining that the set of resources is not available, queue occupation, for each queue in the computing environment, is predicted. Queue occupation is a workload of queue resources for a future time based on a previous workload. A best queue is selected based on the predicted queue occupation. The best queue is the queue or queues in the computing environment available to be assigned to process the data integration job without preemption. The data integration job is processed using the best queue. It is determined whether a preemption event occurred causing the removal of resources from the best queue. A checkpoint is created in response to determining that a preemption event occurred. The checkpoint indicates the last successful operation completed and provides a point where processing can resume when resources become available.
    Type: Application
    Filed: May 21, 2019
    Publication date: November 26, 2020
    Inventors: Krishna Kishore Bonagiri, Eric A. Jacobson, Ritesh Kumar Gupta, Scott Louis Brokaw
  • Publication number: 20200302343
    Abstract: A self-learning computer-based system has access to multiple runtime modules that are each capable of performing a particular algorithm. Each runtime module implements the algorithm with different code or runs in a different runtime environment. The system responds to a request to run the algorithm by selecting the runtime module or runtime environment that the system predicts will provide the most desirable results based on parameters like accuracy, performance, cost, resource-efficiency, or policy compliance. The system learns how to make such predictions through training sessions conducted by a machine-learning component. This training teaches the system that previous module selections produced certain types of results in the presence of certain conditions. After determining whether similar conditions currently exist, the system uses rules inferred from the training sessions to select the runtime module most likely to produce desired results.
    Type: Application
    Filed: March 21, 2019
    Publication date: September 24, 2020
    Inventors: Ritesh Kumar Gupta, Namit Kabra, Eric Allen Jacobson, Scott Louis Brokaw, Jo Arao Ramos
  • Publication number: 20200279173
    Abstract: A method, computer system, and computer program product for data pipeline prioritization are provided. Embodiments may include receiving, by a cognitive rules engine, one or more data pipelines; analyzing the one or more data pipelines using a computational method of the cognitive rules engine; and prioritizing the one or more data pipelines based on a result of that computational method.
    Type: Application
    Filed: February 28, 2019
    Publication date: September 3, 2020
    Inventors: Ritesh Kumar Gupta, Namit Kabra, Likhitha Maddirala, Eric Allen Jacobson, Scott Louis Brokaw, Jo Ramos
  • Publication number: 20200183751
    Abstract: A method includes: receiving, by a computer device, a resource request for a data integration job, wherein the resource request is received from a job executor module and defines processes of the data integration job; allocating, by the computer device, containers for the processes of the data integration job; launching, by the computer device, a respective wrapper script on each of the containers after that container is allocated; and transmitting, by the computer device and in response to the allocating, node details to the job executor module. In embodiments, the wrapper script running on each container is configured to repeatedly check a predefined location for process commands from a job executor. After the resource manager allocates all the containers for a data integration job according to a resource request, the job executor writes the process commands to the predefined location.
    Type: Application
    Filed: December 6, 2018
    Publication date: June 11, 2020
    Inventors: Krishna Kishore Bonagiri, Eric Allen Jacobson, Ritesh Kumar Gupta, Indrani Ghatare, Scott Louis Brokaw
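
The data pipeline prioritization patent above (11755926) describes a cognitive rules engine that receives pipelines, analyzes them with a computational method, and orders them by the result. The sketch below is only a loose illustration of that flow, not the patented implementation: the Pipeline fields, the scoring function, and the RulesEngine class are hypothetical stand-ins chosen for readability.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical pipeline metadata; the patent abstract does not define these fields.
@dataclass
class Pipeline:
    name: str
    sla_minutes: int           # how soon the pipeline's output is due
    downstream_consumers: int  # how many jobs depend on this pipeline
    data_volume_gb: float

def default_score(p: Pipeline) -> float:
    """Illustrative 'computational method': more consumers and tighter SLAs rank higher."""
    return p.downstream_consumers * 10 + p.data_volume_gb - p.sla_minutes

class RulesEngine:
    """Toy stand-in for the cognitive rules engine described in the abstract."""
    def __init__(self, score: Callable[[Pipeline], float] = default_score) -> None:
        self.score = score
        self.received: List[Pipeline] = []

    def receive(self, pipelines: List[Pipeline]) -> None:
        # Step 1 in the abstract: the engine receives one or more data pipelines.
        self.received.extend(pipelines)

    def prioritize(self) -> List[Pipeline]:
        # Steps 2 and 3: analyze each pipeline with the computational method,
        # then order the pipelines by the result.
        return sorted(self.received, key=self.score, reverse=True)

engine = RulesEngine()
engine.receive([
    Pipeline("daily_sales", sla_minutes=60, downstream_consumers=5, data_volume_gb=12.0),
    Pipeline("ad_hoc_export", sla_minutes=480, downstream_consumers=1, data_volume_gb=2.5),
])
print([p.name for p in engine.prioritize()])  # daily_sales ranks first
```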
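
Patent 11288601 above covers selecting among multiple runtime modules based on what past selections produced under similar conditions. As a rough, assumption-laden sketch of that idea, the condition labels, score values, and ModuleSelector interface below are invented for illustration, and a real system would use an actual machine-learning component that infers rules rather than averaging history.

```python
from collections import defaultdict
from statistics import mean
from typing import Dict, List, Tuple

# Hypothetical training record: (observed condition, runtime module used, result score).
History = List[Tuple[str, str, float]]

class ModuleSelector:
    """Toy illustration: remember how each runtime module performed under each condition,
    then pick the module with the best expected result for the current condition."""

    def __init__(self) -> None:
        self.outcomes: Dict[Tuple[str, str], List[float]] = defaultdict(list)

    def train(self, history: History) -> None:
        # 'Training' here is just tabulating past (condition, module) outcomes.
        for condition, module, score in history:
            self.outcomes[(condition, module)].append(score)

    def select(self, condition: str, modules: List[str]) -> str:
        # Predict each module's result under the current condition and take the best.
        def expected(module: str) -> float:
            scores = self.outcomes.get((condition, module), [])
            return mean(scores) if scores else float("-inf")
        return max(modules, key=expected)

selector = ModuleSelector()
selector.train([
    ("large_input", "spark_runtime", 0.92),
    ("large_input", "local_runtime", 0.40),
    ("small_input", "local_runtime", 0.95),
])
print(selector.select("large_input", ["spark_runtime", "local_runtime"]))  # spark_runtime
```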
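
Patent 11194629 above describes containers whose wrapper scripts repeatedly poll a predefined location for process commands that the job executor writes once all containers are allocated. A minimal sketch of that polling handshake, assuming a shared filesystem path and JSON command files (both hypothetical choices, not taken from the patent), could look like:

```python
import json
import time
from pathlib import Path

# Hypothetical shared location that wrapper scripts poll for their process commands.
COMMANDS_DIR = Path("/tmp/di_job_commands")

def wrapper(container_id: str, poll_seconds: float = 1.0) -> dict:
    """Runs inside an allocated container: wait until the job executor writes a command,
    then return it so the container can start its process."""
    command_file = COMMANDS_DIR / f"{container_id}.json"
    while not command_file.exists():
        time.sleep(poll_seconds)
    return json.loads(command_file.read_text())

def write_process_commands(node_details: dict) -> None:
    """Called by the job executor once every container has been allocated:
    write one command file per container at the predefined location."""
    COMMANDS_DIR.mkdir(parents=True, exist_ok=True)
    for container_id, node in node_details.items():
        command = {"process": "run_partition", "node": node}
        (COMMANDS_DIR / f"{container_id}.json").write_text(json.dumps(command))
```

In this sketch, wrapper() would run in each container and block until write_process_commands() has produced that container's file; the real command format, location, and transport are whatever the patented system defines.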
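
Patent 11150956 above combines predicted queue occupation, best-queue selection, and checkpointing on preemption. The sketch below illustrates those three pieces with deliberately simple stand-ins: a plain-average forecast, a headroom threshold, and an operation-index checkpoint, none of which are specified by the patent.

```python
from statistics import mean
from typing import Dict, List, Optional

def predict_occupation(history: Dict[str, List[float]]) -> Dict[str, float]:
    """Forecast each queue's near-future workload from its recent workload samples."""
    return {queue: mean(samples) for queue, samples in history.items()}

def pick_best_queue(predicted: Dict[str, float], capacity: float = 1.0) -> Optional[str]:
    """Choose the queue expected to have enough headroom to run the job without preemption."""
    candidates = {q: load for q, load in predicted.items() if load < capacity}
    return min(candidates, key=candidates.get) if candidates else None

class DataIntegrationJob:
    """Processes a list of operations and checkpoints the last completed one on preemption."""

    def __init__(self, operations: List[str]) -> None:
        self.operations = operations
        self.checkpoint = 0  # index of the next operation to run (resume point)

    def run(self, preempted_at: Optional[int] = None) -> None:
        for i in range(self.checkpoint, len(self.operations)):
            if preempted_at is not None and i == preempted_at:
                self.checkpoint = i  # resources were removed; resume here later
                return
            print("completed", self.operations[i])
        self.checkpoint = len(self.operations)

history = {"etl_queue": [0.4, 0.5, 0.6], "analytics_queue": [0.9, 0.95, 1.0]}
print("best queue:", pick_best_queue(predict_occupation(history)))  # etl_queue

job = DataIntegrationJob(["extract", "transform", "load"])
job.run(preempted_at=2)  # preempted before 'load'; checkpoint records the resume point
job.run()                # resources are back: resumes at 'load'
```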