Demonstrate SPS Scalability #213
This goal could potentially be combined with the initiators to demonstrate scalability when triggering EDRgen/RDRgen for SRL.
features/sps/scale.feature:

Feature: Scaled SPS Testing
  Test the SPS by scaling to meet various workloads. This should be run occasionally, and not as part of nightly tests.

  @develop @test @scale
  Scenario: The SPS shall be able to run a single day of ASIPS data within X hours
    Given a listing of 1 day of ASIPS input data
    When I request 12 runs of asips_workflow
    Then all workflows are submitted successfully
    And 12 nodes have spun up to process the workflows # (or we can make this an env variable to see what the maximum number of jobs processed at once can be)
    And all workflows successfully complete
    # And the workflow data shows up in the data catalog -- removed, not part of SPS Scalability scope
    And the total test time was less than X hours
  @develop @test @scale
  Scenario: The SPS shall be able to run a single day of SBG Preprocess data within X hours
    Given a listing of 1 day of SBG input data
    When I request X runs of SBG_preprocess_workflow
    Then all workflows are submitted successfully
    And X nodes have spun up to process SBG Workflows # (or we can make this an env variable to see what the maximum number of jobs processed at once can be)
    And all workflows successfully complete
    # And the workflow data shows up in the data catalog -- removed, not part of SPS Scalability scope
    And the total test time was less than X hours

These could be run using:

    behave features/sps/scale.feature -n "The SPS shall be able to run a single day of ASIPS data within X hours"            # runs the ASIPS test
    behave features/sps/scale.feature -n "The SPS shall be able to run a single day of SBG Preprocess data within X hours"   # runs the SBG Preprocess test

This is very similar to https://github.com/unity-sds/unity-monorepo/blob/main/tests/system-tests/features/sps/cwl.feature, and the implementations will largely be the same; only the mechanism used to submit the jobs and the workflows being run will differ. The workflow can be defined in the code itself or in an environment variable. Alternatively, we could specify the workflow in the step itself, so the code is always tied to that particular workflow until we change the test itself. That is not a terrible idea, since then we know exactly what we're testing (a link to Dockstore, GitHub, etc.).
System Validation Criteria:
@LucaCinquini please list the SPS board work tickets in this ticket. Running EMIT jobs in parallel doubles the run-time, possibly due to the large Docker container.
SPS has successfully demonstrated scalability for these 2 use cases. This ticket could either be closed now or moved to the next PI for further scalability testing.
Examples of Scalability:
Risks to scaling: