The ReportStream Integration Test is a framework that adds test coverage for the integration between the Intermediary and ReportStream. It is scheduled to run daily via the `automated-staging-test-run.yml` workflow.
Information on how to set up the sample files evaluated by the tests can be found here
- The output files generated by the framework are stored in an Azure blob storage container. Every time the tests run, a cleanup task moves the files into a `year/month/day` folder structure for better organization. The files are retained in the container for 90 days before being deleted.
- The code that organizes the files uses the EST time zone. This means that if you are in PST, you may run into an issue if you submit a run before 9 PM PST and then run the tests after 9 PM PST: the date will have rolled over in EST, so the files won't be in the folder the tests expect. Make sure to run both tasks either before or after 9 PM PST so the files are where they are expected to be.
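To illustrate the time-zone pitfall, here is a minimal sketch (in Python, not the project's actual code) of date-based folder naming on a US Eastern clock; `archive_folder` is a hypothetical name:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def archive_folder(now: datetime) -> str:
    """Hypothetical sketch: build the year/month/day folder name from the
    US Eastern (EST/EDT) wall clock, as the cleanup task does."""
    eastern = now.astimezone(ZoneInfo("America/New_York"))
    return eastern.strftime("%Y/%m/%d")

# 8:30 PM PST on Jan 10 is 11:30 PM EST -> folder 2025/01/10
before_9pm = datetime(2025, 1, 11, 4, 30, tzinfo=ZoneInfo("UTC"))
# 9:30 PM PST on Jan 10 is 12:30 AM EST on Jan 11 -> folder 2025/01/11
after_9pm = datetime(2025, 1, 11, 5, 30, tzinfo=ZoneInfo("UTC"))
print(archive_folder(before_9pm))  # 2025/01/10
print(archive_folder(after_9pm))   # 2025/01/11
```

A submission at 8:30 PM PST and a test run at 9:30 PM PST land in different EST-day folders, which is exactly the mismatch described above.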
- The HL7 parser and expression evaluator return an empty string when a value is not found. They only throw an exception if the path is not valid HL7 notation.
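As a rough sketch of that contract (hypothetical Python against a toy message format, not the project's actual parser): a syntactically valid path that matches nothing yields an empty string, while malformed notation raises.

```python
import re

# Hypothetical sketch of the evaluator's contract; the real parser lives in
# the rs-e2e module and handles full HL7, not this toy dict representation.
HL7_FIELD = re.compile(r"^[A-Z][A-Z0-9]{2}-\d+$")  # e.g. "MSH-10"

def evaluate(message: dict, path: str) -> str:
    if not HL7_FIELD.match(path):
        raise ValueError(f"invalid HL7 path notation: {path!r}")
    segment, field = path.split("-")
    return message.get(segment, {}).get(int(field), "")

msg = {"MSH": {10: "MSG00001"}}
print(evaluate(msg, "MSH-10"))        # MSG00001
print(repr(evaluate(msg, "MSH-99")))  # '' -- valid path, value not found
# evaluate(msg, "not a path")         # would raise ValueError
```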
There are two scheduled tasks that run every weekday around midnight EST:
- Automated Staging Test - Submit Messages submits the messages in `/examples/Test/Automated`
- Automated Staging Test - Run integration tests triggers a couple of hours later and runs the automated tests on the input files in `/examples/Test/Automated` and the output files in the `automated` container of the `cdctiautomatedstg` Azure storage account.
When running locally, we usually run the tests either from the command line using Gradle or from IntelliJ. Note that even though you are running the tests from your local machine, they still need to connect to the Azure container to pull the output files to assert against. For this reason, you will need to set the `AZURE_STORAGE_CONNECTION_STRING` environment variable to authenticate the connection to Azure. You can find its value in Keybase (keybase://team/cdc_ti/service_keys/TI/staging/azure-storage-connection-string-for-automated-rs-e2e-tests.txt).
From the command line:
- Set the `AZURE_STORAGE_CONNECTION_STRING` environment variable in your shell
- Run: `./gradlew rs-e2e:clean rs-e2e:automatedTest`
From IntelliJ:
- Set the `AZURE_STORAGE_CONNECTION_STRING` environment variable in the IntelliJ test run configuration (instructions on how to do that here)
- Go to `rs-e2e/src/test/groovy/gov/hhs/cdc/trustedintermediary/rse2e/AutomatedTest.groovy` and either run or debug `AutomatedTest` as you normally would from IntelliJ
1. Run the Automated Staging Test - Submit Messages action
2. Wait for RS and TI to finish processing the files
3. Run the Automated Staging Test - Run integration tests action
If you have added new files to `/examples/Test/Automated` in your branch/PR, then before running the Automated Staging Test - Submit Messages action in step 1, select the branch you're working on as shown in the screenshot below. This makes sure your new file(s) are included when submitting the messages.
If you have added new assertion rules to the `assertion_definitions.json` file, do the same for step 3: select your branch when running the Automated Staging Test - Run integration tests action.
Instead of running the Run integration tests action, you can also test from your local machine by following the steps in the previous section.
Note: when testing a branch with new assertions, it's recommended to first make the new assertions fail on purpose as a sanity check that they are actually being evaluated.
The assertions for the integration tests are defined in the `assertion_definitions.json` file, which uses the same rules engine framework as the transformations and validations in the etor project.
The file contains a list of definitions, each of which contains:
- `name`: a descriptive name for the assertion group
- `conditions`: a list of conditions to be met. These determine whether this set of assertions applies to the file being evaluated. When no conditions are included, the definition applies to all files. If conditions are included, all of them must be satisfied for the definition to apply. Conditions are structured the same way as rules
- `rules`: a list of assertions to evaluate
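Putting those pieces together, a definition might look like the sketch below. The field names follow the structure just described, but the top-level `definitions` key, the definition name, and the specific conditions and rules shown are illustrative assumptions, not copied from the real file:

```json
{
  "definitions": [
    {
      "name": "example: CDPH messages keep their control ID",
      "conditions": ["MSH-4 = 'CDPH'"],
      "rules": [
        "output.MSH-10 = input.MSH-10",
        "OBR.count() = 1"
      ]
    }
  ]
}
```

Because a condition is present, these two rules would only be evaluated against files where `MSH-4` equals `CDPH`.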
The rules are the assertions for the integration test. The assertions are HL7 expressions inspired by FHIRPath. The assertions we currently support are: equality, non-equality, membership, and segment count. We can evaluate strings and/or values in HL7 fields. An HL7 field in a rule can refer to either the input file or the output file; if no file is specified, the output file is assumed. Each rule is contained in double quotes, and any string values are contained in single quotes.
Examples:
- Equality between an HL7 field in the output and input
  - `"MSH-10 = input.MSH-10"` - `MSH-10` has the same value in the output and input files
  - `"output.MSH-10 = input.MSH-10"` - same as above
- Equality between an HL7 field and a string
  - `"MSH-4 = 'CDPH'"` - the value of `MSH-4` in the output file equals `CDPH`
  - `"MSH-4 != ''"` - the value of `MSH-4` in the output file doesn't equal an empty string
- Membership
  - `"MSH-6 in ('R797', 'R508')"` - the value of `MSH-6` in the output file is either `R797` or `R508`
- Segment count
  - `"OBR.count() = 1"` - there is exactly one `OBR` segment in the file