RULES ENGINE TEST HARNESS

A system is provided. The system comprises a computer system, a builder component, and a test execution component. When executed by the computer system, the builder component promotes defining a plurality of test cases and a plurality of test scenarios. When executed by the computer system, the test execution component simulates at least one service application, invokes execution of a plurality of rules on a rules engine with an input based on one of the test cases defined using the builder component, wherein the rules engine interacts with the simulated service application, and stores the result of the rule execution in a database.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application Ser. No. 61/048,810 filed Apr. 29, 2008, and entitled “MMIS Health Enterprise Solution,” by Jack Devos, et al., which is incorporated herein by reference for all purposes.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

Not applicable.

REFERENCE TO A MICROFICHE APPENDIX

Not applicable.

BACKGROUND

The United States Medicaid program was enacted in 1965 to provide a medical assistance program for individuals and families with low incomes. The Medicaid program comprises three main entities—the patients, the healthcare providers, and the agency administering the plan (i.e., the payer). The Medicaid program is financed through joint federal and state funding. The Medicaid program is administered by each state according to an approved state plan. The specifics of the Medicaid program differ from state to state. Differences may include covered healthcare procedures, allowable procedure costs, and patient eligibility criteria. The state administrators of the Medicaid program are required to have a Medicaid management information system (MMIS) that provides for mechanized and/or computerized Medicaid claims processing. Recently, the Medicaid information technology architecture (MITA) has been promulgated by the U.S. government to provide a blueprint and a set of standards that individual states are to follow in administering the Medicaid program and for developing the next generation MMIS.

Rules engines may be pluggable software components that execute rules that have been externalized from application code. The rules define business rules and/or business logic that may change frequently. Typically, rules may be defined by nonprogrammers and may be provided to the rules engine in the form of data or data files. Using a rules engine to provide business rules to an application may reduce time to market and reduce total cost of ownership, relative to the alternative of encoding the business logic in high level programming language code.

Automation designs, for example business rules to be executed by a rules engine and/or applications encoded in a high level programming language, are typically tested extensively before deployment in customer facing products and/or systems. Testing may take the form of executing a number of test cases on the target computer systems, where each test case comprises a single set of inputs to the subject rules engine and/or application. In addition to defining the inputs, the test cases may define the expected output and/or results of executing the rules engine and/or the application on the inputs. Because testing every possible combination of inputs is typically infeasible, an attempt is often made to identify a representative selection of test cases that may provide sufficient confidence in the automation design under test. Testing automation designs may be an expensive process that consumes substantial business resources including personnel, equipment, and schedule time. Insufficient testing can result in deployment of a flawed automation design that may fail inopportunely, possibly damaging the enterprise brand and/or incurring liability.

SUMMARY

In an embodiment, a system is provided. The system comprises a computer system, a builder component, and a test execution component. When executed by the computer system, the builder component promotes defining a plurality of test cases and a plurality of test scenarios. When executed by the computer system, the test execution component simulates at least one service application, invokes execution of a plurality of rules on a rules engine with an input based on one of the test cases defined using the builder component, wherein the rules engine interacts with the simulated service application, and stores the result of the rule execution in a database.

In another embodiment, a method is provided. The method comprises providing a builder interface for building a plurality of test cases and a plurality of test scenarios, the interface comprising a list of rules and a plurality of input controls for defining a plurality of inputs for the test cases. The method also comprises storing the test cases and the test scenarios in a database, invoking the rules on a rules engine based on at least one of the test cases and the test scenarios, and simulating at least one service application, wherein the rules engine interacts with the service application when executing the rules. The method also comprises collecting the results of the invocation of the rules on the rules engine and storing the results in the database.

In another embodiment, an apparatus is provided. The apparatus comprises a test case editor and a rule test manager. When executed on a first computer system, the test case editor provides an environment for building a plurality of rule test cases by selecting an entry point from a list of available entry points, by defining a plurality of inputs for the rule, and by defining expected results for one of the test cases. When executed on the first computer system, the rule test manager executes the rule test cases by simulating at least one service program, invoking a plurality of rules on a rules engine based on the rule test cases built using the test case editor, receives results from the rules engine, and stores the results, wherein the rules engine interacts with the simulated service program when executing the rules.

These and other features will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.

BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of the present disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.

FIG. 1 is a schematic diagram of a rules engine test harness according to an embodiment of the disclosure.

FIG. 2 is a flowchart of a method of testing rules according to an embodiment of the disclosure.

FIG. 3 is a schematic diagram of an exemplary general-purpose computer system suitable for implementing some aspects of the several embodiments of the disclosure.

DETAILED DESCRIPTION

It should be understood at the outset that although illustrative implementations of one or more embodiments are illustrated below, the disclosed systems and methods may be implemented using any number of techniques, whether currently known or in existence. The disclosure should in no way be limited to the illustrative implementations, drawings, and techniques illustrated below, but may be modified within the scope of the appended claims along with their full scope of equivalents.

A rules engine test harness for testing rules is disclosed. The rules engine test harness may promote testing rules without using a fully configured and/or duplicate execution environment. In an embodiment, a rules engine framework is used to create and optionally to compile rules that may be executed by a rules engine. Further details about a rules engine framework are provided in U.S. patent application Ser. No. 12/257,782 filed Oct. 24, 2008 entitled “Rules Engine Framework” by Uma Kandasamy et al., which is hereby incorporated by reference in its entirety. When deployed as a released product, the rules may be executed by a rules engine on a process server computer system in cooperation and coordination with a business process. In an embodiment, the business process and rules engine cooperatively provide at least a portion of a healthcare management information system. A user of the healthcare management information system may interact with the business process and rules engine by entering information and selecting actions from a user interface presented at a workstation communicatively coupled to the process server computer system. The business process may promote enrolling healthcare providers into the healthcare system, for example enrolling physicians, therapists, hospitals, minor emergency medical centers, and other healthcare providers. The business process may promote enrolling healthcare recipients and/or patients. The business process may promote receiving and processing claims from enrolled healthcare providers for services provided to enrolled healthcare recipients.

A function of the business process may be invoked by the user of the workstation, for example processing of a claim for healthcare services. The business process may process this claim by invoking a series of rules and/or a rule flow on the rules engine. The use of rules to provide automated processing may have the advantage of allowing for the rapid creation, modification, and deployment of rules, relative to processing based on specifically designed computer programs. The processing of the claim may involve the rules engine executing a sequence of rules that validate the enrollment of the healthcare provider, validate the enrollment of the healthcare recipient, determine a coverage status of the subject procedure, determine a maximum coverage amount, and other such processes. In an embodiment, the rules engine may return a rule result object to the business process that invoked the rules engine, wherein the rule result object may identify one or more exceptions if something went wrong, or no exceptions if the rule processed successfully.

Turning now to FIG. 1, a system 100 for rules testing is discussed. The system 100 comprises a computer system 102, a workstation 108, a network 110, and a database 112. The computer system 102 comprises a builder component 104 and a test execution component 106. The workstation 108 may execute a builder interface 126 and a test control interface 128. The database 112 may also be referred to as a test case repository, and may comprise a plurality of test cases 114, a plurality of test scenarios 116, a plurality of test results 122, and a plurality of success/fail reports 124. It is understood that when the system 100 is first placed into service, the database 112 may be substantially empty and the test cases 114, the test scenarios 116, the test results 122, and the success/fail reports 124 may accumulate in number over time. While the computer system 102 in FIG. 1 is shown to be in direct communication with the database 112, in some embodiments the computer system 102 may communicate with the database 112 via the network 110. The computer system 102, the workstation 108, and the database 112, along with their contained components, applications, and data, may be referred to in some contexts as a rules engine test harness. The network 110 may or may not be considered as a part of the rules engine test harness. The system 100 also comprises a server 118 and a rules engine 120. The test execution component 106 then invokes the server 118 to execute the test cases. The computer system 102, the workstation 108, and the server 118 may each be implemented as a general-purpose computer system, which is discussed in greater detail hereinafter.

The builder component 104 may comprise one or more components. The builder component 104 promotes defining a plurality of test cases and a plurality of test scenarios for testing rules, for example using the builder interface 126 on the workstation 108. The builder interface 126 may be a web interface that is provided by the builder component 104, for example as a hypertext mark-up language (HTML) page transmitted from the builder component 104 to the workstation 108. Likewise, the test cases and test scenarios may be stored in the database 112. As used herein, a test case 114 identifies a set of inputs, a rule, and a set of expected results from executing the rule based on the set of inputs. In an embodiment, a rule may comprise a condition portion and an action portion. When the condition evaluates TRUE when executed by the rules engine 120, the action is performed. The condition may comprise a left hand variable associated with a right hand value by a generic template. Worded differently, the condition may comprise a left hand operand associated with a right hand operand by an operator. As an example, a rule may comprise a condition that evaluates the enrollment of an individual in a healthcare management information system and an action that indicates the rule succeeded if the condition evaluates to TRUE and sets an exception code if the condition evaluated to FALSE. As used herein, a test scenario 116 comprises an ordered sequence of test cases 114. Each test scenario 116 may be used to determine the correct operability of one or more rules.
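For purposes of illustration only, the condition-action structure of a rule described above may be sketched as follows. The class name, the operator table, and the exception code are illustrative assumptions and do not represent the actual rules engine API.

```python
import operator
from dataclasses import dataclass
from typing import Any

# Hypothetical mapping of operator tokens to evaluation functions; the
# condition pairs a left-hand operand with a right-hand operand via an
# operator, as described above.
OPERATORS = {"==": operator.eq, "!=": operator.ne, ">": operator.gt, "<": operator.lt}

@dataclass
class Rule:
    left: str       # name of the input parameter (left-hand operand)
    op: str         # operator token, e.g. "=="
    right: Any      # right-hand value
    on_true: str    # action result when the condition evaluates TRUE
    on_false: str   # exception code set when the condition evaluates FALSE

    def execute(self, inputs: dict) -> str:
        # Perform the action only according to the condition outcome.
        if OPERATORS[self.op](inputs[self.left], self.right):
            return self.on_true
        return self.on_false

# An enrollment-style rule: succeed if the individual is enrolled,
# otherwise set a (hypothetical) exception code.
enrollment_rule = Rule("enrolled", "==", True, "SUCCESS", "EXC-1234")
print(enrollment_rule.execute({"enrolled": True}))   # SUCCESS
print(enrollment_rule.execute({"enrolled": False}))  # EXC-1234
```

In a production rules engine the condition and action would be externalized as data rather than coded; the sketch only shows the operand/operator/operand decomposition.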

The set of inputs associated with a test case 114 may comprise a number of different input parameters and the values associated with those input parameters. Similarly, the set of expected results may comprise an output parameter and the values associated with the output parameter. For exemplary purposes, a simple test case 114 may comprise pairs of parameter names and input values and a result as follows (member identity=1234567, age=67, procedure code=2007, provider identity=7654321, expected action=SUCCESS). An alternative simple test case 114 may comprise pairs of parameter names and input values in the form of XML and a result in XML as well (member identity=1234567, age=−2, procedure code=2007, provider identity=7654321, expected action=set exception code 1234). In some embodiments, as many as 6,000 or more rules may be deployed to be executed by a rules engine in an enterprise function, for example a portion of a healthcare management information system. A representative set of test cases 114 adequate to provide an acceptable level of confidence in the rules may exceed 100,000 test cases 114.
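The exemplary test cases above may be sketched as simple data structures. The field names are illustrative assumptions; a real embodiment might carry the same information as XML.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shape of a test case 114: a set of input parameter
# name/value pairs plus an optional expected result.
@dataclass
class TestCase:
    inputs: dict
    expected: Optional[str] = None  # expected rule action, if supplied

simple_case = TestCase(
    inputs={"member_identity": "1234567", "age": 67,
            "procedure_code": 2007, "provider_identity": "7654321"},
    expected="SUCCESS",
)
negative_case = TestCase(
    inputs={"member_identity": "1234567", "age": -2,
            "procedure_code": 2007, "provider_identity": "7654321"},
    expected="set exception code 1234",
)

# A test scenario 116 is an ordered sequence of test cases 114.
scenario = [simple_case, negative_case]
```
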

In an embodiment, the builder component 104 provides drop down menus for composing test cases 114, for example in a web interface provided to the workstation 108. The builder interface 126 may be a web interface that is provided by the builder component 104, for example as an HTML page transmitted from the builder component 104 to the workstation 108. The drop down menus may comprise drop down menus for selecting invocation points and/or entry points, for defining inputs, and for defining results, for example rule return object contents and/or values. In an embodiment, the builder component 104 may provide a way to begin defining a new test case 114 by first cloning or copying a previously defined test case 114. The new test case 114 is then changed in some manner, for example by specifying a different value for an input parameter and a different expected value of the rule return object. The builder component 104 also provides means for composing a test scenario 116 as an ordered sequence of test cases 114. The builder component 104 writes test cases 114 to the test cases 114 in the database 112 and writes test scenarios 116 to the test scenarios 116 in the database 112. In some embodiments, the builder component 104 writes test cases 114 and test scenarios 116 to the database 112 by using a test case data access object (DAO). In some contexts, the builder component 104 may be referred to as a test case editor.

The test execution component 106 simulates one or more service applications that the rules engine 120 interacts with when executing rules. For example, the test execution component 106 may simulate a portal server processor, a plurality of databases, a plurality of database management applications, a messaging application, and other applications and/or services. The test execution component 106 also invokes execution of rules on the rules engine 120 according to the test cases 114 and/or the test scenarios 116 stored in the database 112, for example under the control of a web interface on the workstation 108. The test control interface 128 may be a web interface that is provided by the builder component 104, for example as an HTML page transmitted from the builder component 104 to the workstation 108. The test execution component 106 may read the test cases 114 and/or test scenarios 116 from the database 112 and, based on the content of the test cases 114 and/or test scenarios 116, invoke the execution of specific rules with appropriate input parameter values specified by the test cases 114 and/or test scenarios 116.
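One way a service application might be simulated is with a small in-memory stub that returns canned answers in place of a fully configured production service. The class and method names below are hypothetical illustrations, not the actual service interfaces.

```python
# A minimal stub standing in for a database management application
# that the rules engine would normally query during rule execution.
class SimulatedMemberService:
    def __init__(self, members):
        # member identity -> enrollment flag, seeded with canned data
        self._members = dict(members)

    def is_enrolled(self, member_id):
        # Answer from the canned data instead of a real database.
        return self._members.get(member_id, False)

service = SimulatedMemberService({"1234567": True})
print(service.is_enrolled("1234567"))  # True
print(service.is_enrolled("0000000"))  # False
```

Because the stub holds its own data, rules can be exercised without a duplicate execution environment, which is the point of the test harness.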

Additionally, the test execution component 106 receives the results of the execution of the rules by the rules engine 120, for example receiving a rules return object containing SUCCESS and/or exception code values. The test execution component 106 saves the results of executing the test cases 114 and/or the test scenarios 116 in the test results 122 of the database 112. In an embodiment, the test execution component 106 compares the rule return object received from the rules engine 120 to the expected rule return object defined in the test cases 114 and generates a report based on this comparison. The test execution component 106 may compare the rule return object from the current execution of a test case 114 with the expected result that was optionally provided as input to the test case. If the expected result is not given as an input, the result will be marked as not available (N/A) since there is no data for the comparison.
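The comparison logic described above, including the not-available case, might look like the following sketch; the outcome labels are assumptions.

```python
def grade(actual, expected):
    """Compare a rule return value with the optional expected result.

    When no expected result was provided with the test case, there is
    no data for the comparison, so the outcome is marked "N/A".
    """
    if expected is None:
        return "N/A"
    return "PASS" if actual == expected else "FAIL"

print(grade("SUCCESS", "SUCCESS"))   # PASS
print(grade("EXC-1234", "SUCCESS"))  # FAIL
print(grade("SUCCESS", None))        # N/A
```
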

The report may comprise a summary of numbers of test cases 114 that passed, numbers of test cases 114 that failed, numbers of test scenarios 116 that passed, numbers of test scenarios 116 that failed. The report may comprise trend analysis of pass/fail statistics over a plurality of testing sessions. The test execution component 106 stores the report in the success/fail report 124 in the database 112. In an embodiment, the test execution component 106 may communicate results information to the database 112 at least in part using a reporting data access object. In some contexts, the rule return object may be referred to as results and/or test results.

In an embodiment, the test execution component 106 may optionally perform timing of the execution of one or more of the test cases 114, for example test cases 114 that include an elapsed time of execution expected value. For example, a test case 114 may be directed to testing the timeliness of a response to an operator command input. The test execution component 106 may store the timing results along with other test results in the test results 122 of the database 112. In an embodiment, comparisons among timing results for executing the same or similar test cases 114 stored in the test results 122 of the database 112 may be performed to determine a trend of performance of the rules engine 120.
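The optional timing of a test case might be implemented along the following lines. The function and parameter names are hypothetical, and the callable stands in for the actual rules engine invocation.

```python
import time

def timed_invoke(invoke, inputs, max_elapsed=None):
    """Invoke a rule (or rule flow) and measure elapsed wall time.

    `invoke` is any callable standing in for the rules engine call;
    `max_elapsed`, when given, models a test case that carries an
    elapsed-time-of-execution expected value.
    """
    start = time.perf_counter()
    result = invoke(inputs)
    elapsed = time.perf_counter() - start
    timing_ok = max_elapsed is None or elapsed <= max_elapsed
    return result, elapsed, timing_ok

# A trivial stand-in invocation, used only to exercise the timer.
result, elapsed, timing_ok = timed_invoke(lambda _: "SUCCESS", {}, max_elapsed=5.0)
```

Storing `elapsed` alongside the other results is what enables the trend comparisons across test sessions mentioned above.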

The network 110 may comprise any combination of private networks and public networks. The network 110 may comprise electrical or optical wired links as well as wireless links. In an embodiment, the network 110 may be confined to a single facility or campus, but in another embodiment, the network 110 may be distributed across a wide geographical area, including across the entire globe.

The rules engine 120 may include some portions of a commercial off the shelf (COTS) rules engine tool. The COTS rules engine tool may be customized and/or encapsulated by a wrapper or a rules engine framework to provide functionality and convenience otherwise not provided by the COTS rules engine tool. In an embodiment, a Fair Isaac BLAZE ADVISOR COTS rules engine package may be encapsulated in the rules engine 120. In an embodiment, the rules engine 120 may extend the COTS rules engine tool, for example, by providing automated input value validation between a left hand variable and a right hand input. The rules engine 120 also may feature other extensions of the COTS rules engine tool.

Turning now to FIG. 2, a method 200 is described. At block 204, a test builder interface is provided comprising a list of rules to be tested. The tested rules may be all of the rules or only some of the rules. The test builder interface 126 may be provided as a web interface that may execute on the workstation 108. The list of rules and/or entry points may be provided as one or more drop down menus in the test builder interface 126. At block 208, a plurality of test cases 114 are defined using the test builder interface. Defining test cases 114 may comprise identifying a specific rule, an entry point, one or more input parameter values associated with the rule, and one or more expected output parameter values associated with executing the rule based on the input parameter values. At block 212, the rule test cases 114 and test scenarios 116 defined at block 208 above are stored, e.g., in the database 112. In an embodiment, a plurality of test case designers may be concurrently defining test cases 114 and test scenarios 116 by executing the test builder interface on separate workstations 108, and may be concurrently storing the test cases 114 and the test scenarios 116 in the database 112.

At block 216, the rules are invoked on the rules engine 120 based on the test cases 114 and based on the test scenarios 116. A single invocation point may be invoked at one time, providing appropriate input parameter values based on the test case 114. Alternatively, a plurality of invocation points may be serially invoked by a test scenario 116, thereby providing appropriate input parameter values based on the test cases 114 composing the test scenario 116. The rules engine 120 may access the rule and/or rules out of a rule repository database (not shown), or the rule definition may be provided along with the input parameter values. In an embodiment, the rules are precompiled and/or preprocessed into a binary file format that is stored on the server 118, for example in Fair Isaac binary (ADB) file format.

At block 220, one or more service applications are simulated by the test execution component 106. The rules engine 120 may interact with the service applications during the course of executing the rule under test. The service applications may comprise a portal server processor, a database management system, a messaging service, and other services. At block 224, the elapsed time of execution of the rule processing by the rules engine 120 is optionally timed by the test execution component 106. At block 228, the results of executing the rule and/or rules by the rules engine 120, for example output parameter values and optionally elapsed time information, are collected. At block 232, the results of executing the rule and/or rules are stored in the database 112. The results may also be displayed by an interface, for example a web interface that may execute on the workstation 108.

At block 236, a success/fail report is optionally generated based on the results of executing the rule and/or rules. The expected output parameter values defined in the test cases 114 are compared to the results to determine whether the subject test cases 114 succeeded or failed. Summary statistics, for example total numbers of successes and failures, may be aggregated and placed in the success/fail report 124. The success/fail report 124 may also include information comparing the current test session results with earlier test results to report a test success trend. The success/fail report 124 may be stored in the database 112 and may be displayed by an interface, for example a web interface that may execute on the workstation 108. The method 200 then exits.
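The aggregation step of block 236 may be sketched as follows; the summary keys are illustrative assumptions about what a success/fail report might carry.

```python
from collections import Counter

def summarize(outcomes):
    """Aggregate per-test-case outcomes (e.g. "PASS"/"FAIL"/"N/A")
    into the kind of summary a success/fail report might carry."""
    counts = Counter(outcomes)
    return {
        "passed": counts["PASS"],
        "failed": counts["FAIL"],
        "not_available": counts["N/A"],
        "total": len(outcomes),
    }

report = summarize(["PASS", "PASS", "FAIL", "N/A"])
```

Comparing successive such summaries stored in the database is one simple way to produce the pass/fail trend analysis described above.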

Some aspects of the system described above, for example the computer system 102, the workstation 108, and the server 118, may be implemented on any general-purpose computer with sufficient processing power, memory resources, and network throughput capability to handle the necessary workload placed upon it. FIG. 3 illustrates a typical, general-purpose computer system suitable for implementing one or more embodiments disclosed herein. The computer system 380 includes a processor 382 (which may be referred to as a central processor unit or CPU) that is in communication with memory devices including secondary storage 384, read only memory (ROM) 386, random access memory (RAM) 388, input/output (I/O) devices 390, and network connectivity devices 392. The processor 382 may be implemented as one or more CPU chips.

The secondary storage 384 is typically comprised of one or more disk drives or tape drives and is used for non-volatile storage of data and as an over-flow data storage device if RAM 388 is not large enough to hold all working data. Secondary storage 384 may be used to store programs that are loaded into RAM 388 when such programs are selected for execution. The ROM 386 is used to store instructions and perhaps data that are read during program execution. ROM 386 is a non-volatile memory device, which typically has a small memory capacity relative to the larger memory capacity of secondary storage 384. The RAM 388 is used to store volatile data and perhaps to store instructions. Access to both ROM 386 and RAM 388 is typically faster than to secondary storage 384.

I/O devices 390 may include printers, video monitors, liquid crystal displays (LCDs), touch screen displays, keyboards, keypads, switches, dials, mice, track balls, voice recognizers, card readers, paper tape readers, or other well-known input devices.

The network connectivity devices 392 may take the form of modems, modem banks, Ethernet cards, universal serial bus (USB) interface cards, serial interfaces, token ring cards, fiber distributed data interface (FDDI) cards, wireless local area network (WLAN) cards, radio transceiver cards such as code division multiple access (CDMA), global system for mobile communications (GSM), and/or worldwide interoperability for microwave access (WiMAX) radio transceiver cards, and other well-known network devices. These network connectivity devices 392 may enable the processor 382 to communicate with an Internet or one or more intranets. With such a network connection, it is contemplated that the processor 382 might receive information from the network, or might output information to the network in the course of performing the above-described method steps. Such information, which is often represented as a sequence of instructions to be executed using processor 382, may be received from and outputted to the network, for example, in the form of a computer data signal embodied in a carrier wave.

Such information, which may include data or instructions to be executed using processor 382 for example, may be received from and outputted to the network, for example, in the form of a computer data baseband signal or signal embodied in a carrier wave. The baseband signal or signal embodied in the carrier wave generated by the network connectivity devices 392 may propagate in or on the surface of electrical conductors, in coaxial cables, in waveguides, in optical media, for example optical fiber, or in the air or free space. The information contained in the baseband signal or signal embedded in the carrier wave may be ordered according to different sequences, as may be desirable for either processing or generating the information or transmitting or receiving the information. The baseband signal or signal embedded in the carrier wave, or other types of signals currently used or hereafter developed, referred to herein as the transmission medium, may be generated according to several methods well known to one skilled in the art.

The processor 382 executes instructions, codes, computer programs, scripts which it accesses from hard disk, floppy disk, optical disk (these various disk based systems may all be considered secondary storage 384), ROM 386, RAM 388, or the network connectivity devices 392. While only one processor 382 is shown, multiple processors may be present. Thus, while instructions may be discussed as executed by a processor 382, the instructions may be executed simultaneously, serially, or otherwise executed by one or multiple processors 382.

While several embodiments have been provided in the present disclosure, it should be understood that the disclosed systems and methods may be embodied in many other specific forms without departing from the spirit or scope of the present disclosure. The present examples are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, the various elements or components may be combined or integrated in another system or certain features may be omitted or not implemented.

In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of the present disclosure. Other items shown or discussed as directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component, whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.

Claims

1. A system, comprising:

a computer system;
a builder component that, when executed by the computer system, promotes defining a plurality of test cases and a plurality of test scenarios; and
a test execution component that, when executed by the computer system, simulates at least one service application, invokes execution of a plurality of rules on a rules engine with an input based on one of the test cases defined using the builder component, wherein the rules engine interacts with the simulated service application, and stores the result of the rule execution in a results file.

2. The system of claim 1, wherein the test execution component further times execution of rules, rule sets, and rule flows and stores the execution time results and the result of the rule execution in the results file.

3. The system of claim 1, wherein defining test cases and test scenarios further comprises defining expected results of the test cases and the test scenarios.

4. The system of claim 1, wherein test scenarios are defined using the builder component as an ordered sequence of at least two test cases.

5. The system of claim 1, wherein the service application comprises at least one of a data base management system, a message service, and a portal server.

6. The system of claim 1, further comprising a database, wherein the builder component stores test cases and test scenarios in the database and the test execution component stores the results file in the database.

7. A method, comprising:

providing a builder interface for building a plurality of test cases and a plurality of test scenarios, the interface comprising a list of rules and a plurality of input controls for defining a plurality of inputs for the test cases;
storing the test cases and the test scenarios in a database;
invoking the rules on a rules engine based on at least one of the test cases and the test scenarios;
simulating at least one service application, wherein the rules engine interacts with the service application when executing the rules;
collecting the results of the invocation of the rules on the rules engine; and
storing the results in the database.

8. The method of testing rules of claim 7, further comprising timing the completion of the rules, wherein collecting the results of the invocation of the rules comprises collecting the timing results.

9. The method of testing rules of claim 7, further comprising comparing the results of the invocation of the rules on the rules engine with results of a previous invocation of the rules on the rules engine stored in the database.

10. The method of testing rules of claim 7, wherein the builder interface generates extensible markup language files based on user inputs defining test cases and test scenarios and wherein storing the test cases and test scenarios in the database comprises storing the extensible markup language files in the database.

11. The method of testing rules of claim 7, wherein the builder interface further comprises input controls for defining expected results of test cases.

12. The method of testing rules of claim 7, further comprising generating a success and fail report and storing the success and fail report in the database, wherein revising is further based on the success and fail report.

13. The method of claim 12, further comprising timing the completion of the rules, wherein collecting the results of the invocation of the rules comprises collecting the timing results and revising is further based on the timing results.

14. An apparatus, comprising:

a test case editor, executable on a first computer system, that provides an environment for building a plurality of rule test cases by selecting an entry point from a list of available entry points, by defining a plurality of inputs for the rule, and by defining expected results for one of the test cases; and
a rule test manager, executable on the first computer system, that executes the rule test cases by simulating at least one service program, invoking a plurality of rules on a rules engine based on the rule test cases built using the test case editor, receives results from the rules engine, and stores the results, wherein the rules engine interacts with the simulated service program when executing the rules.

15. The apparatus of claim 14, wherein the rules engine executes on a second computer system and the rules test manager invokes the rules on the rules engine using JAVA remote method invocation.

16. The apparatus of claim 14, wherein the test case editor comprises a web interface.

17. The apparatus of claim 14, further comprising a database wherein the test case editor stores test cases in the database and the rule test manager stores the results in the database.

18. The apparatus of claim 17, wherein the test case editor communicates with the database using a test case data access object and the rule test manager communicates with the database using a reporting data access object.

19. The apparatus of claim 14, wherein the at least one service program is one of a database management system and a message system.

20. The apparatus of claim 14, wherein the rule test cases comprise a plurality of healthcare claims.

Patent History
Publication number: 20090271351
Type: Application
Filed: Nov 26, 2008
Publication Date: Oct 29, 2009
Applicant: Affiliated Computer Services, Inc. (Dallas, TX)
Inventors: Uma Maheswari Kandasamy (Atlanta, GA), Krishnam Raju B. Datla (Atlanta, GA)
Application Number: 12/323,707
Classifications
Current U.S. Class: Ruled-based Reasoning System (706/47)
International Classification: G06N 5/00 (20060101);