Patents by Inventor Brad E. Romano

Brad E. Romano has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11151099
    Abstract: A data structure management system includes a first database, a second database, and a processing engine. The first database includes a first file with a first term and a corresponding first metadata, and a second file with the first term and a corresponding second metadata. The processing engine extracts the first file and the second file from the first database in a first format. It links the first term with the first metadata from the first file and the second metadata from the second file. It transforms the extracted first file and second file from the first format into a second format while maintaining the link between the first term, the first metadata, and the second metadata. It then exports the transformed first file and second file to a second database in the second format with the link between the first term, the first metadata, and the second metadata intact.
    Type: Grant
    Filed: May 26, 2020
    Date of Patent: October 19, 2021
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
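The abstract above describes linking a shared term to metadata drawn from two files, converting both files to a second format, and exporting them with the link intact. A minimal Python sketch of that flow, with all names, data shapes, and the choice of JSON as the "second format" assumed for illustration:

```python
import json

def transform_with_links(first_db):
    """Link the shared first term to metadata from both files, then
    transform from the first format (Python objects) into a second
    format (JSON text) while carrying the link along unchanged."""
    term_a, meta_1 = first_db["file1"]
    term_b, meta_2 = first_db["file2"]
    assert term_a == term_b  # both files carry the same first term
    link = {term_a: [meta_1, meta_2]}  # link term to both metadata records
    return {"format": "json", "payload": json.dumps(link)}

# Hypothetical first database: two files sharing the term "rate".
first_db = {"file1": ("rate", {"source": "ledger"}),
            "file2": ("rate", {"source": "report"})}
# Export the transformed files, link intact, to a second database.
second_db = {"export": transform_with_links(first_db)}
```

Round-tripping `second_db["export"]["payload"]` through `json.loads` recovers the term-to-metadata link, which is the property the claims emphasize.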
  • Patent number: 10985975
    Abstract: A parallel processing device includes a parallel processing engine implemented by a processor. The parallel processing engine is configured to execute a shell script for each particular processing job in a queue of processing jobs to run. The shell script is configured to dynamically generate a configuration file for each particular processing job. The configuration file instructs a network of computing systems to run the particular processing job using a particular number of parallel partitions corresponding to a parallel partitions parameter associated with the particular job. The configuration file includes randomized scratch directories for computing nodes within the network of computing systems and a calculated container size for the particular processing job. Each processing job is run on the network of computing systems according to the dynamically-generated configuration file of the particular processing job.
    Type: Grant
    Filed: April 27, 2020
    Date of Patent: April 20, 2021
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
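The configuration file described above carries three things per job: a partition count taken from the job's parameter, randomized scratch directories per compute node, and a calculated container size. A short Python sketch of such a generator, where the field names, directory layout, and the container-size heuristic are all assumptions for illustration:

```python
import random

def generate_config(job, nodes):
    """Dynamically build a per-job configuration, mirroring the file the
    shell script in the abstract would generate."""
    return {
        "job": job["name"],
        # partition count comes from the job's parallel-partitions parameter
        "partitions": job["parallel_partitions"],
        # one randomized scratch directory per computing node
        "scratch_dirs": {n: f"/scratch/{n}/{random.randrange(10**6):06d}"
                         for n in nodes},
        # container size scaled from the partition count (assumed heuristic)
        "container_mb": 512 * job["parallel_partitions"],
    }

# Hypothetical queue of processing jobs to run.
queue = [{"name": "jobA", "parallel_partitions": 4},
         {"name": "jobB", "parallel_partitions": 8}]
configs = [generate_config(j, ["node1", "node2"]) for j in queue]
```

Randomizing scratch directories per run is one way to keep concurrent jobs from colliding on the same working paths, which is consistent with (though not stated by) the abstract.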
  • Patent number: 10795880
    Abstract: A system for communication between two or more computer programs is disclosed. The system includes a memory, an interface, and a processor. The memory stores a first file, expected metadata for the first file, and expected metadata for one or more fields in the first file. The interface receives a file from a computer program. The file comprises fields that each comprise information provided by one or more sources. The processor executes a second computer program which extracts a first set of file metadata from the received file, compares the extracted first set of file metadata to the expected metadata, and determines if the extracted first set of file metadata corresponds to the expected metadata. If the extracted first set of file metadata corresponds to the expected metadata for the first file, then the processor performs analogous comparisons at a field level and stores the first file in the memory.
    Type: Grant
    Filed: October 28, 2019
    Date of Patent: October 6, 2020
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
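The abstract describes a two-stage gate: file-level metadata is checked first, and only if it matches are the field-level comparisons run and the file stored. A minimal Python sketch, with the metadata shapes assumed for illustration:

```python
stored = {}  # stands in for the memory that holds accepted files

def validate_and_store(received, expected_file_meta, expected_field_meta):
    """Compare file-level metadata first; on a match, compare each
    field's metadata; store the file only if both stages pass."""
    if received["meta"] != expected_file_meta:
        return False                        # rejected at the file level
    for field, meta in received["fields"].items():
        if expected_field_meta.get(field) != meta:
            return False                    # rejected at the field level
    stored[received["name"]] = received     # both checks passed: store it
    return True

expected_meta = {"rows": 2, "encoding": "utf-8"}
expected_fields = {"id": {"type": "int"}, "name": {"type": "str"}}
good = {"name": "feed.csv", "meta": {"rows": 2, "encoding": "utf-8"},
        "fields": {"id": {"type": "int"}, "name": {"type": "str"}}}
bad = {"name": "feed2.csv", "meta": {"rows": 3, "encoding": "utf-8"},
       "fields": {}}
```

Calling `validate_and_store(good, expected_meta, expected_fields)` stores the file; the `bad` file fails the file-level check and is never stored.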
  • Publication number: 20200285622
    Abstract: A data structure management system includes a first database, a second database, and a processing engine. The first database includes a first file with a first term and a corresponding first metadata, and a second file with the first term and a corresponding second metadata. The processing engine extracts the first file and the second file from the first database in a first format. It links the first term with the first metadata from the first file and the second metadata from the second file. It transforms the extracted first file and second file from the first format into a second format while maintaining the link between the first term, the first metadata, and the second metadata. It then exports the transformed first file and second file to a second database in the second format with the link between the first term, the first metadata, and the second metadata intact.
    Type: Application
    Filed: May 26, 2020
    Publication date: September 10, 2020
    Inventors: Brad E. Romano, Shashi Thanikella
  • Publication number: 20200259706
    Abstract: A parallel processing device includes a parallel processing engine implemented by a processor. The parallel processing engine is configured to execute a shell script for each particular processing job in a queue of processing jobs to run. The shell script is configured to dynamically generate a configuration file for each particular processing job. The configuration file instructs a network of computing systems to run the particular processing job using a particular number of parallel partitions corresponding to a parallel partitions parameter associated with the particular job. The configuration file includes randomized scratch directories for computing nodes within the network of computing systems and a calculated container size for the particular processing job. Each processing job is run on the network of computing systems according to the dynamically-generated configuration file of the particular processing job.
    Type: Application
    Filed: April 27, 2020
    Publication date: August 13, 2020
    Inventors: Brad E. Romano, Shashi Thanikella
  • Patent number: 10698869
    Abstract: A data structure management system includes a first database, a second database, and a processing engine. The first database includes a first file with a first term and a corresponding first metadata, and a second file with the first term and a corresponding second metadata. The processing engine extracts the first file and the second file from the first database in a first format. It links the first term with the first metadata from the first file and the second metadata from the second file. It transforms the extracted first file and second file from the first format into a second format while maintaining the link between the first term, the first metadata, and the second metadata. It then exports the transformed first file and second file to a second database in the second format with the link between the first term, the first metadata, and the second metadata intact.
    Type: Grant
    Filed: September 27, 2016
    Date of Patent: June 30, 2020
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
  • Patent number: 10666510
    Abstract: A parallel processing device includes a parallel processing engine implemented by a processor. The parallel processing engine is configured to execute a shell script for each particular processing job in a queue of processing jobs to run. The shell script is configured to dynamically generate a configuration file for each particular processing job. The configuration file instructs a network of computing systems to run the particular processing job using a particular number of parallel partitions corresponding to a parallel partitions parameter associated with the particular job. The configuration file includes randomized scratch directories for computing nodes within the network of computing systems and a calculated container size for the particular processing job. Each processing job is run on the network of computing systems according to the dynamically-generated configuration file of the particular processing job.
    Type: Grant
    Filed: October 30, 2018
    Date of Patent: May 26, 2020
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
  • Publication number: 20200136899
    Abstract: A parallel processing device includes a parallel processing engine implemented by a processor. The parallel processing engine is configured to execute a shell script for each particular processing job in a queue of processing jobs to run. The shell script is configured to dynamically generate a configuration file for each particular processing job. The configuration file instructs a network of computing systems to run the particular processing job using a particular number of parallel partitions corresponding to a parallel partitions parameter associated with the particular job. The configuration file includes randomized scratch directories for computing nodes within the network of computing systems and a calculated container size for the particular processing job. Each processing job is run on the network of computing systems according to the dynamically-generated configuration file of the particular processing job.
    Type: Application
    Filed: October 30, 2018
    Publication date: April 30, 2020
    Inventors: Brad E. Romano, Shashi Thanikella
  • Publication number: 20200089845
    Abstract: A system for communication between two or more computer programs is disclosed. The system includes a memory, an interface, and a processor. The memory stores a first file, expected metadata for the first file, and expected metadata for one or more fields in the first file. The interface receives a file from a computer program. The file comprises fields that each comprise information provided by one or more sources. The processor executes a second computer program which extracts a first set of file metadata from the received file, compares the extracted first set of file metadata to the expected metadata, and determines if the extracted first set of file metadata corresponds to the expected metadata. If the extracted first set of file metadata corresponds to the expected metadata for the first file, then the processor performs analogous comparisons at a field level and stores the first file in the memory.
    Type: Application
    Filed: October 28, 2019
    Publication date: March 19, 2020
    Inventors: Brad E. Romano, Shashi Thanikella
  • Patent number: 10481836
    Abstract: An adaptive machine learning system for predicting file controls includes a memory, an interface, and a processor. The memory stores a plurality of controls for incoming files and the interface receives a first file and a second file. The first file has a first property and the second file has a second property. The processor determines a type for each of the first property and the second property, wherein the type of each property is related to a first file control. The processor also determines that the first property and the second property each satisfy the first file control. If the value of the first property and the second property are above a first threshold, the processor changes a value of the first control for incoming files.
    Type: Grant
    Filed: September 27, 2016
    Date of Patent: November 19, 2019
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, John R. Sampson, Shashi Thanikella
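The adaptive step in this abstract is that when two incoming files both satisfy a control and their property values sit above a threshold, the stored control value is updated for future files. A Python sketch of that update; the "take the larger of the two values" rule is an assumption, since the abstract does not specify how the control changes:

```python
def adapt_control(controls, first, second, threshold):
    """If both files satisfy the first file control and both property
    values exceed the threshold, update the stored control value.
    The max-of-both update rule is illustrative, not from the claims."""
    name = first["control"]
    assert second["control"] == name  # both properties relate to one control
    satisfies = (first["value"] >= controls[name] and
                 second["value"] >= controls[name])
    if satisfies and first["value"] > threshold and second["value"] > threshold:
        controls[name] = max(first["value"], second["value"])
    return controls

# Hypothetical control and two incoming files' properties.
controls = {"size_limit": 100}
first = {"control": "size_limit", "value": 150}
second = {"control": "size_limit", "value": 120}
adapt_control(controls, first, second, threshold=110)
```

After the call, the control for incoming files has shifted to reflect what recent files actually look like, which is the feedback loop the abstract describes.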
  • Patent number: 10459911
    Abstract: A system for communication between two or more computer programs is disclosed. The system includes a memory, an interface, and a processor. The memory stores a first file, expected metadata for the first file, and expected metadata for one or more fields in the first file. The interface receives a file from a computer program. The file comprises fields that each comprise information provided by one or more sources. The processor executes a second computer program which extracts a first set of file metadata from the received file, compares the extracted first set of file metadata to the expected metadata, and determines if the extracted first set of file metadata corresponds to the expected metadata. If the extracted first set of file metadata corresponds to the expected metadata for the first file, then the processor performs analogous comparisons at a field level and stores the first file in the memory.
    Type: Grant
    Filed: September 27, 2016
    Date of Patent: October 29, 2019
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
  • Patent number: 10255338
Abstract: A system for file management in data structures is disclosed. The system includes a memory, an extraction engine, an enrichment engine, a portal, and a transmission engine. The memory stores a first database. The extraction engine extracts columns and fields and associates them with extracted terms. The enrichment engine determines an end-to-end dataflow of the data from extracted metadata and loads the associated data into the memory. The enrichment engine performs either a full load, comprising loading all the associated data onto the memory, or a delta load, comprising comparing the extracted data with data stored in the memory and loading any differing data from the extracted data onto the memory. The enrichment engine also generates a journal recording metadata associated with the full or delta load. A portal displays the end-to-end dataflow of the associated data, and a transmission engine transmits a communication identifying incomplete associated data.
    Type: Grant
    Filed: September 27, 2016
    Date of Patent: April 9, 2019
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, Shashi Thanikella
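The full-versus-delta distinction in this abstract is the interesting mechanism: a full load copies everything, while a delta load compares against what memory already holds and loads only the differences, journaling either way. A minimal Python sketch, with the row and journal formats assumed:

```python
def load(memory, extracted, mode, journal):
    """Full load copies every extracted row; delta load copies only rows
    whose values differ from what memory already holds. Each load appends
    a journal entry recording the mode and row count (format assumed)."""
    if mode == "full":
        changed = dict(extracted)
    else:  # delta: keep only rows that are new or different
        changed = {k: v for k, v in extracted.items() if memory.get(k) != v}
    memory.update(changed)
    journal.append({"mode": mode, "rows_loaded": len(changed)})
    return memory

memory, journal = {}, []
load(memory, {"a": 1, "b": 2, "c": 3}, "full", journal)   # initial full load
load(memory, {"a": 1, "b": 9, "c": 3}, "delta", journal)  # only "b" changed
```

The journal makes each load auditable after the fact, which is presumably why the abstract calls it out separately from the load itself.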
  • Patent number: 10089313
    Abstract: According to one embodiment, a system for converting data integration system (“DIS”) files comprises a memory operable to store data associated with at least one DIS and a processor communicatively coupled to the memory and operable to convert files associated with a first DIS to files associated with a second DIS. The operating system used by the first DIS is different from the operating system used by the second DIS. The processor converts the files by being operable to determine differences between the first DIS and the second DIS, determine a set of transformation rules based on the differences, create a conversion algorithm based on the set of transformation rules, and execute the conversion algorithm to convert the files. The system is further operable to execute the second DIS such that the second DIS uses the converted data integration files.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: October 2, 2018
    Assignee: Bank of America Corporation
    Inventors: Brad E. Romano, John Abraham
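The conversion pipeline in this abstract is: determine differences between the two DIS environments, derive transformation rules from those differences, compose them into a conversion algorithm, and execute it. A Python sketch under stated assumptions; the path-separator and line-ending differences are illustrative stand-ins for whatever the two operating systems actually disagree on:

```python
def build_converter(first_dis, second_dis):
    """Derive transformation rules from the differences between two DIS
    descriptions, then return a function (the 'conversion algorithm')
    that applies them in order."""
    rules = []
    if first_dis["path_sep"] != second_dis["path_sep"]:
        rules.append(lambda s: s.replace(first_dis["path_sep"],
                                         second_dis["path_sep"]))
    if first_dis["eol"] != second_dis["eol"]:
        rules.append(lambda s: s.replace(first_dis["eol"],
                                         second_dis["eol"]))

    def convert(text):
        for rule in rules:
            text = rule(text)
        return text
    return convert

# Hypothetical DIS descriptions on two different operating systems.
unix = {"path_sep": "/", "eol": "\n"}
windows = {"path_sep": "\\", "eol": "\r\n"}
convert = build_converter(unix, windows)
```

Building the rule list from observed differences, rather than hard-coding one conversion, is what lets the same machinery handle any pair of systems.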
  • Publication number: 20180089251
    Abstract: A system for communication between two or more computer programs is disclosed. The system includes a memory, an interface, and a processor. The memory stores a first file, expected metadata for the first file, and expected metadata for one or more fields in the first file. The interface receives a file from a computer program. The file comprises fields that each comprise information provided by one or more sources. The processor executes a second computer program which extracts a first set of file metadata from the received file, compares the extracted first set of file metadata to the expected metadata, and determines if the extracted first set of file metadata corresponds to the expected metadata. If the extracted first set of file metadata corresponds to the expected metadata for the first file, then the processor performs analogous comparisons at a field level and stores the first file in the memory.
    Type: Application
    Filed: September 27, 2016
    Publication date: March 29, 2018
    Inventors: Brad E. Romano, Shashi Thanikella
  • Publication number: 20180089579
    Abstract: An adaptive machine learning system for predicting file controls includes a memory, an interface, and a processor. The memory stores a plurality of controls for incoming files and the interface receives a first file and a second file. The first file has a first property and the second file has a second property. The processor determines a type for each of the first property and the second property, wherein the type of each property is related to a first file control. The processor also determines that the first property and the second property each satisfy the first file control. If the value of the first property and the second property are above a first threshold, the processor changes a value of the first control for incoming files.
    Type: Application
    Filed: September 27, 2016
    Publication date: March 29, 2018
    Inventors: Brad E. Romano, John R. Sampson, Shashi Thanikella
  • Publication number: 20180089185
    Abstract: A data structure management system includes a first database, a second database, and a processing engine. The first database includes a first file with a first term and a corresponding first metadata, and a second file with the first term and a corresponding second metadata. The processing engine extracts the first file and the second file from the first database in a first format. It links the first term with the first metadata from the first file and the second metadata from the second file. It transforms the extracted first file and second file from the first format into a second format while maintaining the link between the first term, the first metadata, and the second metadata. It then exports the transformed first file and second file to a second database in the second format with the link between the first term, the first metadata, and the second metadata intact.
    Type: Application
    Filed: September 27, 2016
    Publication date: March 29, 2018
    Inventors: Brad E. Romano, Shashi Thanikella
  • Publication number: 20180089293
Abstract: A system for file management in data structures is disclosed. The system includes a memory, an extraction engine, an enrichment engine, a portal, and a transmission engine. The memory stores a first database. The extraction engine extracts columns and fields and associates them with extracted terms. The enrichment engine determines an end-to-end dataflow of the data from extracted metadata and loads the associated data into the memory. The enrichment engine performs either a full load, comprising loading all the associated data onto the memory, or a delta load, comprising comparing the extracted data with data stored in the memory and loading any differing data from the extracted data onto the memory. The enrichment engine also generates a journal recording metadata associated with the full or delta load. A portal displays the end-to-end dataflow of the associated data, and a transmission engine transmits a communication identifying incomplete associated data.
    Type: Application
    Filed: September 27, 2016
    Publication date: March 29, 2018
    Inventors: Brad E. Romano, Shashi Thanikella
  • Patent number: 9922103
    Abstract: According to one embodiment, a method of copying a dataset associated with a first extract, transform, and load (ETL) job in a first data integration system to a second data integration system comprises copying executable code associated with the first ETL job from the first to the second system. Operating system software, integration system software, and file system structure are substantially identical between the first and second systems. The method further comprises executing the second ETL job to read the dataset from the first data integration system and write the dataset to the second data integration system. The second ETL job is associated with configuration parameters specifying storage resources in the first system associated with the dataset and destination parameters specifying storage resources in the second system. The method further comprises copying metadata generated by the second ETL job from the first to the second data integration system.
    Type: Grant
    Filed: October 21, 2014
    Date of Patent: March 20, 2018
    Assignee: Bank of America Corporation
    Inventors: Jason E. Martens, Brad E. Romano, Sachin M. Nerurkar, Shashi Tanikella
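The copy method in this abstract has three steps: replicate the ETL job's executable code onto the second system, run that job with configuration parameters pointing at the first system's storage and destination parameters pointing at the second's, then copy the job's generated metadata across. A Python sketch with every structure assumed for illustration:

```python
def copy_dataset(first_sys, second_sys, job_name):
    """Sketch of the three-step copy flow from the abstract: copy the job
    code, run it (source = first system, destination = second system),
    then carry the run's metadata over. All field names are assumptions."""
    # Step 1: copy the ETL job's executable code to the second system.
    second_sys["jobs"][job_name] = first_sys["jobs"][job_name]
    # Step 2: "run" the second job -- read from the first system's
    # storage resources, write to the second system's.
    dataset = first_sys["storage"]["dataset"]
    second_sys["storage"]["dataset"] = dataset
    # Step 3: copy the metadata the job generated across as well.
    second_sys["metadata"][job_name] = {"rows": len(dataset)}
    return second_sys

first_sys = {"jobs": {"copy_job": "<job code>"},
             "storage": {"dataset": [1, 2, 3]},
             "metadata": {}}
second_sys = {"jobs": {}, "storage": {}, "metadata": {}}
copy_dataset(first_sys, second_sys, "copy_job")
```

The abstract's precondition that operating system, integration software, and file-system structure be substantially identical is what makes step 1 a plain copy rather than a conversion.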
  • Publication number: 20160246809
    Abstract: According to one embodiment, a system for converting data integration system (“DIS”) files comprises a memory operable to store data associated with at least one DIS and a processor communicatively coupled to the memory and operable to convert files associated with a first DIS to files associated with a second DIS. The operating system used by the first DIS is different from the operating system used by the second DIS. The processor converts the files by being operable to determine differences between the first DIS and the second DIS, determine a set of transformation rules based on the differences, create a conversion algorithm based on the set of transformation rules, and execute the conversion algorithm to convert the files. The system is further operable to execute the second DIS such that the second DIS uses the converted data integration files.
    Type: Application
    Filed: February 19, 2015
    Publication date: August 25, 2016
    Inventors: Brad E. Romano, John Abraham
  • Publication number: 20160110435
    Abstract: According to one embodiment, a method of copying a dataset associated with a first extract, transform, and load (ETL) job in a first data integration system to a second data integration system comprises copying executable code associated with the first ETL job from the first to the second system. Operating system software, integration system software, and file system structure are substantially identical between the first and second systems. The method further comprises executing the second ETL job to read the dataset from the first data integration system and write the dataset to the second data integration system. The second ETL job is associated with configuration parameters specifying storage resources in the first system associated with the dataset and destination parameters specifying storage resources in the second system. The method further comprises copying metadata generated by the second ETL job from the first to the second data integration system.
    Type: Application
    Filed: October 21, 2014
    Publication date: April 21, 2016
    Inventors: Jason E. Martens, Brad E. Romano, Sachin M. Nerurkar, Shashi Tanikella