Patents by Inventor Marcus Matos

Marcus Matos has filed for patents covering the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Illustrative code sketches of the two processing patterns described in the abstracts appear after the listing.

  • Publication number: 20240086390
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
    Type: Application
    Filed: November 17, 2023
    Publication date: March 14, 2024
    Inventor: Marcus Matos
  • Patent number: 11874817
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
    Type: Grant
    Filed: March 31, 2021
    Date of Patent: January 16, 2024
    Assignee: Bank of America Corporation
    Inventor: Marcus Matos
  • Patent number: 11789786
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
    Type: Grant
    Filed: September 9, 2022
    Date of Patent: October 17, 2023
    Assignee: Bank of America Corporation
    Inventor: Marcus Matos
  • Publication number: 20230004449
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
    Type: Application
    Filed: September 9, 2022
    Publication date: January 5, 2023
    Inventor: Marcus Matos
  • Patent number: 11474881
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
    Type: Grant
    Filed: March 31, 2021
    Date of Patent: October 18, 2022
    Assignee: Bank of America Corporation
    Inventor: Marcus Matos
  • Publication number: 20220318075
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
    Type: Application
    Filed: March 31, 2021
    Publication date: October 6, 2022
    Inventor: Marcus Matos
  • Publication number: 20220318226
    Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
    Type: Application
    Filed: March 31, 2021
    Publication date: October 6, 2022
    Inventor: Marcus Matos
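
Patent numbers 11789786 and 11474881 and the related publications describe at least two processing workers that each query a shared work queue database and process their claimed work items in parallel. The sketch below is a minimal illustration of that general pattern only, not the patented method: the in-memory Queue stands in for the work queue database, and the worker count, batch size, and placeholder processing step are assumptions chosen for the example.

```python
import threading
from queue import Empty, Queue

# Hypothetical in-memory stand-in for the work queue database named in the
# abstracts; a real deployment would use a database table whose rows can be
# read and locked by concurrent workers.
work_queue: Queue = Queue()
for item_id in range(100):
    work_queue.put(item_id)

results = []
results_lock = threading.Lock()


def processing_worker(batch_size: int) -> None:
    """Claim batches of work items and process them until the queue is empty."""
    while True:
        batch = []
        try:
            # Claiming items here plays the role of the query that reads
            # (and effectively locks) a set of work items for this worker.
            for _ in range(batch_size):
                batch.append(work_queue.get_nowait())
        except Empty:
            pass
        if not batch:
            return  # nothing left to process
        processed = [item * 2 for item in batch]  # placeholder for real work
        with results_lock:
            results.extend(processed)


# "At least two processing workers" running in parallel against the shared queue.
workers = [threading.Thread(target=processing_worker, args=(10,)) for _ in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(f"processed {len(results)} work items across {len(workers)} workers")
```

In a database-backed deployment the claim step would typically be a query that reads and locks rows so that concurrent workers can coordinate on the same table, which is the interaction the abstracts describe.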
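Publication 20240086390, patent 11874817, and publication 20220318226 describe a monitoring process that records the workload status of the processing workers as historical records and then derives new values for their processing parameters. The sketch below is a hypothetical illustration of that feedback loop, assuming a single tunable batch-size parameter; the ProcessingWorker and Monitor classes and the grow/shrink heuristic in new_batch_size are invented for the example and are not taken from the patents.

```python
from dataclasses import dataclass, field


@dataclass
class ProcessingWorker:
    """Hypothetical worker whose batch size can be reconfigured while it runs."""
    batch_size: int = 10

    def process_batch(self, pending: int) -> int:
        # Process up to batch_size of the pending work items; return how many
        # were completed (the workload processing status the monitor records).
        return min(self.batch_size, pending)


@dataclass
class Monitor:
    """Hypothetical monitoring process: keeps historical records and derives
    new parameter values for the processing workers from them."""
    history: list = field(default_factory=list)

    def record(self, pending: int, throughput: int) -> None:
        self.history.append({"pending": pending, "throughput": throughput})

    def new_batch_size(self, current: int) -> int:
        # Toy heuristic (an assumption, not from the patents): if the last
        # batch was full the worker is saturated, so grow the batch;
        # otherwise shrink it toward the observed throughput.
        if not self.history:
            return current
        last = self.history[-1]
        if last["throughput"] >= current:
            return min(current * 2, 128)
        return max(last["throughput"], 1)


pending = 500  # simulated backlog in the work queue
worker = ProcessingWorker()
monitor = Monitor()

while pending > 0:
    done = worker.process_batch(pending)
    pending -= done
    monitor.record(pending, done)
    # Reconfigure the worker with the parameter value derived from history.
    worker.batch_size = monitor.new_batch_size(worker.batch_size)

print(f"backlog drained; final batch size: {worker.batch_size}")
```

The same loop shape would apply to other processing parameters the abstracts refer to only generically, such as the number of workers or a polling interval.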