Patents by Inventor Marcus Matos
Marcus Matos has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12204518
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
Type: Grant
Filed: November 17, 2023
Date of Patent: January 21, 2025
Assignee: Bank of America Corporation
Inventor: Marcus Matos
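The feedback loop in this abstract (sample the pending workload, record it, derive new worker parameters) can be sketched in Python. This is a minimal illustration of one possible tuning policy, not the patented method: `WorkerConfig`, the parameter names, and the thresholds are all hypothetical.

```python
import statistics
from dataclasses import dataclass


@dataclass
class WorkerConfig:
    """Hypothetical processing parameters for one worker."""
    batch_size: int = 10
    poll_interval_s: float = 5.0


def tune_workers(pending_counts, configs):
    """Derive new parameter values from historical pending-workload
    samples and apply them to each worker's configuration."""
    avg_pending = statistics.mean(pending_counts)
    for cfg in configs:
        if avg_pending > 100:    # backlog growing: process more, poll faster
            cfg.batch_size = min(cfg.batch_size * 2, 500)
            cfg.poll_interval_s = max(cfg.poll_interval_s / 2, 0.5)
        elif avg_pending < 10:   # queue nearly drained: back off
            cfg.batch_size = max(cfg.batch_size // 2, 1)
            cfg.poll_interval_s = min(cfg.poll_interval_s * 2, 60.0)
    return configs
```

In a real deployment the monitoring process would run this periodically against records persisted in the work queue database, rather than against an in-memory list.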
-
Publication number: 20240427916
Abstract: Arrangements for dynamic variable determination and labeling are provided. In some aspects, a computing platform may receive historical user data from a plurality of data sources. The computing platform may train, using the historical user data, a machine learning model to generate a plurality of dynamic variable profiles and evaluate data to detect potential unauthorized activity. One or more dynamic variable profiles of the generated plurality of dynamic variable profiles may be associated with a user. User specific data may be received and may include user identifying data and a request for a user event. The user specific data may be input to the machine learning model and, upon execution of the model, the model may output a determination of whether an anomaly exists in the user specific data. If an anomaly is detected, a mitigating action may be identified and transmitted to one or more computing devices for execution.
Type: Application
Filed: June 20, 2023
Publication date: December 26, 2024
Inventors: Marcus Matos, Vijaya L. Vemireddy, Daniel Joseph Serna, Lee Ann Proud
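The profile-then-detect flow in this abstract can be illustrated with a deliberately simple statistical stand-in: build a per-user profile from historical data, then flag an incoming event that deviates strongly from it. The actual application describes a trained machine learning model; the mean/standard-deviation profile and z-score threshold below are illustrative assumptions only.

```python
import statistics


def build_profile(historical_values):
    """Summarize a user's historical data into a simple profile
    (stand-in for a learned dynamic variable profile)."""
    return {
        "mean": statistics.mean(historical_values),
        "stdev": statistics.stdev(historical_values),
    }


def detect_anomaly(profile, value, z_threshold=3.0):
    """Return True if the incoming user-event value deviates strongly
    from the profile; a True result would trigger a mitigating action."""
    if profile["stdev"] == 0:
        return value != profile["mean"]
    z = abs(value - profile["mean"]) / profile["stdev"]
    return z > z_threshold
```

A detection here corresponds to the abstract's final step: identifying a mitigating action and transmitting it to other computing devices for execution.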
-
Publication number: 20240428078
Abstract: A computing platform may train, using unsupervised learning techniques, a synthetic identity detection model to detect attempts to generate synthetic identities. The computing platform may receive identity information corresponding to an identity generation request. The computing platform may use the synthetic identity detection model to: 1) generate information clusters corresponding to the identity information, 2) compare a difference between actual and expected information clusters to an anomaly detection threshold, 3) based on identifying that the number of information clusters meets or exceeds the anomaly detection threshold, generate a threat score corresponding to the identity information, 4) compare the threat score to a synthetic identity detection threshold, and 5) based on identifying that the threat score meets or exceeds the synthetic identity detection threshold, identify a synthetic identity generation attempt.
Type: Application
Filed: June 20, 2023
Publication date: December 26, 2024
Applicant: Bank of America Corporation
Inventors: Vijaya L. Vemireddy, Marcus Matos, Daniel Joseph Serna, Kevin Delson
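The five numbered steps in this abstract can be sketched end to end. The gap-based clustering below is a toy substitute for the unsupervised model in the application, and every threshold and scoring formula here is a hypothetical placeholder, not a value from the filing.

```python
def count_clusters(values, gap=10.0):
    """Group sorted numeric features into clusters: a new cluster
    starts whenever the gap to the previous value exceeds `gap`.
    (Toy stand-in for step 1's unsupervised clustering.)"""
    clusters = 1
    prev = None
    for v in sorted(values):
        if prev is not None and v - prev > gap:
            clusters += 1
        prev = v
    return clusters


def synthetic_identity_check(values, expected_clusters,
                             anomaly_threshold=2, score_threshold=0.5):
    """Follow the abstract's steps 2-5: compare actual vs. expected
    cluster counts, score the request, and decide whether it is a
    synthetic identity generation attempt."""
    actual = count_clusters(values)
    diff = abs(actual - expected_clusters)
    if diff < anomaly_threshold:
        return False  # below the anomaly detection threshold: stop early
    threat_score = min(diff / (expected_clusters or 1), 1.0)
    return threat_score >= score_threshold
```

Identity information whose features fragment into far more clusters than expected (for example, inconsistent address or history fields) would be flagged; coherent information would pass.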
-
Publication number: 20240086390
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
Type: Application
Filed: November 17, 2023
Publication date: March 14, 2024
Inventor: Marcus Matos
-
Patent number: 11874817
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
Type: Grant
Filed: March 31, 2021
Date of Patent: January 16, 2024
Assignee: Bank of America Corporation
Inventor: Marcus Matos
-
Patent number: 11789786
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.Type: Grant
Filed: September 9, 2022
Date of Patent: October 17, 2023
Assignee: Bank of America Corporation
Inventor: Marcus Matos
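The claim-and-process pattern this abstract describes (multiple workers querying a shared work queue, each taking a set of items) can be sketched with Python threads. This simplified version has each worker atomically claim disjoint batches; the patent additionally covers cases where a second query reads items already locked by the first worker, which is not modeled here. All class and function names are illustrative.

```python
import threading
from collections import deque


class WorkQueue:
    """Toy stand-in for the work queue database: workers claim
    (read and lock) items so several can process in parallel."""
    def __init__(self, items):
        self._items = deque(items)
        self._lock = threading.Lock()

    def claim(self, n):
        """Atomically read and remove up to n work items."""
        with self._lock:
            return [self._items.popleft()
                    for _ in range(min(n, len(self._items)))]


def run_worker(queue, results, batch_size=2):
    """Each processing worker repeatedly queries the queue and
    processes its claimed set of work items."""
    while True:
        batch = queue.claim(batch_size)
        if not batch:
            break
        results.extend(item * 2 for item in batch)  # placeholder processing
```

With two (or more) workers started against the same queue, every item is processed exactly once, because `claim` holds the lock while removing items.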
-
Publication number: 20230004449
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
Type: Application
Filed: September 9, 2022
Publication date: January 5, 2023
Inventor: Marcus Matos
-
Patent number: 11474881
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
Type: Grant
Filed: March 31, 2021
Date of Patent: October 18, 2022
Assignee: Bank of America Corporation
Inventor: Marcus Matos
-
Publication number: 20220318226
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize a monitoring process configured to monitor a pending workload in a work queue database. Subsequently, the computing platform may cause the monitoring process to query the work queue database and create one or more historical records indicative of a workload processing status associated with one or more processing workers. Then, the computing platform may identify one or more new parameter values for one or more processing parameters associated with the one or more processing workers based on the one or more historical records. Thereafter, the computing platform may configure the one or more processing workers based on the one or more new parameter values identified for the one or more processing parameters.
Type: Application
Filed: March 31, 2021
Publication date: October 6, 2022
Inventor: Marcus Matos
-
Publication number: 20220318075
Abstract: Aspects of the disclosure relate to providing and maintaining efficient and effective processing of sets of work items in enterprise computing environments by optimizing distributed and parallelized batch data processing. A computing platform may initialize at least two processing workers. Subsequently, the computing platform may cause a first processing worker to perform a first query on a work queue database and initiate parallel processing of a first set of work items. Thereafter, the computing platform may cause the second processing worker to perform a second query on the work queue database and initiate parallel processing of a second set of work items. In some instances, performing the second query on the work queue database comprises reading at least one work item that was read and locked by the first processing worker.
Type: Application
Filed: March 31, 2021
Publication date: October 6, 2022
Inventor: Marcus Matos