Performance Tests


A performance test in irVerify Test Suites will run a rule application for several iterations, collecting performance results from each run. Please note that this is not a load tester, but rather a way to estimate how changes to a rule application or to execution options may affect the running time or other performance aspects of your rule application.
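To make the mechanics concrete, the test boils down to a timed loop: execute the same rule application repeatedly and record the elapsed time of each pass. Below is a minimal C# sketch of that pattern using the standard .NET Stopwatch; RunRuleApplication() is a hypothetical stand-in for whatever executes your rules, not an irSDK call.

    using System.Collections.Generic;
    using System.Diagnostics;

    class PerformanceTestSketch
    {
        // Hypothetical stand-in for one execution of the rule application.
        static void RunRuleApplication() { /* apply rules here */ }

        // Run the work the requested number of times, recording each pass
        // in milliseconds (the unit used throughout the results grid).
        static List<double> MeasureIterations(int iterations)
        {
            var timings = new List<double>(iterations);
            var stopwatch = new Stopwatch();

            for (int i = 0; i < iterations; i++)
            {
                stopwatch.Restart();                 // time this iteration only
                RunRuleApplication();
                stopwatch.Stop();
                timings.Add(stopwatch.Elapsed.TotalMilliseconds);
            }

            return timings;
        }
    }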

 


Performance Test Options

  • Root context - Represents the entity to test from the rule application schema.
  • Execution type - Select "Apply rules" to run all automatic rules for the rule application, or select "Entity rule set" to run an explicit rule set from that entity.
  • Entity rule set - Select the rule set to use when execution type is "Entity rule set."
  • Input data state - Select a desired test state from the list. For detailed information on data states, see Data State in Test Suites.
  • Number of iterations - Specify the number of times to execute the rule application.
  • Perform warm-up - Check this box to run two untimed iterations before the test begins. This eliminates one-time .NET overhead from rule engine startup so it is not charged to the measured results; a sketch of the pattern appears after this list.
  • Execution Options - Enables logging and tracing options so you can simulate the overhead of different logging levels. No actual log is available through the performance test, but you can still configure logging to the Windows event log using settings in the InRule configuration file. See "InRule Logging Config File Settings" in the irSDK help.
     
    If you specify Summary Statistics, Detail Statistics, or both, the InRule performance report will be available for the first 100 iterations.

    For details of the various log and trace settings, see "Rule Engine Execution Log" in the irSDK help.
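The warm-up option corresponds to a common benchmarking pattern: run the work through a couple of untimed passes first, so that one-time startup costs are absorbed before measurement begins. Continuing the hypothetical sketch from above:

    // Two untimed passes absorb one-time .NET startup costs (such as
    // just-in-time compilation) before any iteration is measured.
    static List<double> MeasureWithWarmup(int iterations)
    {
        for (int i = 0; i < 2; i++)
        {
            RunRuleApplication();             // warm-up pass, not timed
        }

        return MeasureIterations(iterations); // timed loop from the sketch above
    }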

Test Results

  • Iteration - The number of the iteration.
  • Stopwatch time - Actual runtime of the iteration, as measured by a stopwatch.
  • Total running time - The total running time of the rule application, as measured by the InRule engine.
  • Rule execution time - The time spent executing rules, as measured by the InRule engine.
  • Metadata compile time - The time required to compile metadata, as measured by the InRule engine.
  • Function compile time - The time needed to compile functions, as measured by the InRule engine.
  • Load XML time - The time required to load XML data into the rule application, as measured by the InRule engine.
  • Load TestScenario time (not shown) - The time required to load RuleSession state data into the rule application, as measured by the InRule engine.
  • Performance report (not shown) - Click the link to show the performance report for the iteration. Only the first 100 iterations have this link.

All times are reported in milliseconds.

NOTE: Not all columns in the test results above will always appear. Columns that require summary or detail statistics are shown only when the corresponding execution option has been selected.
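If you want to analyze the per-iteration numbers outside of irVerify, simple aggregates are usually enough to see whether a change helped. The method below, which could sit in the sketch class above, summarizes a list of stopwatch times in milliseconds:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Basic aggregates over the per-iteration stopwatch times (ms).
    static void Summarize(IReadOnlyList<double> timingsMs)
    {
        Console.WriteLine($"Iterations: {timingsMs.Count}");
        Console.WriteLine($"Mean (ms):  {timingsMs.Average():F2}");
        Console.WriteLine($"Min (ms):   {timingsMs.Min():F2}");
        Console.WriteLine($"Max (ms):   {timingsMs.Max():F2}");
    }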

 

After the performance test completes, two buttons in the test runner ribbon may be useful. "Save to Excel" prompts for a Microsoft Excel file name, then saves the performance results to that file. "Copy to Clipboard" copies the performance results to the Windows clipboard, from which you can paste them into Microsoft Excel or another tool for further analysis.
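If you collect timings programmatically instead, the "Save to Excel" idea is easy to reproduce: write one row per iteration to a CSV file, which Excel opens directly. A minimal sketch (the two columns here are illustrative, not the full results grid):

    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    // Write a header plus one row per iteration; Excel opens .csv directly.
    static void SaveResultsAsCsv(string path, IReadOnlyList<double> timingsMs)
    {
        var header = new[] { "Iteration,Stopwatch time (ms)" };
        var rows = timingsMs.Select((t, i) => $"{i + 1},{t:F2}");
        File.WriteAllLines(path, header.Concat(rows));
    }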
