This feature provides the capability to "deploy" an OGC application package to Airflow. Currently, a single DAG (either cwl_dag.py or cwl_modular.py) runs all packages.
Calling an OGC deploy endpoint with a valid request (CWL?) should trigger the following steps:
The CWL file is loaded within the OGC API.
The docker container specified within the App Package is pulled and registered locally within the ADES (e.g. in an ECR or S3-backed store).
The CWL file is updated to reference the newly (locally) registered container.
The CWL file is stored in a persistent store (S3, database) and made available to the ADES / Pods.
(value add steps follow)
A new DAG is created that calls the k8s operator with the specific, updated CWL (step 4).
Input fields for the DAG are created by parsing the CWL's inputs.
A user can then execute the CWL from the Airflow UI, providing specific inputs in individualized fields.
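The container-localization steps above (pull the image, register it locally, rewrite the CWL to point at it) could be sketched roughly as follows. The registry hostname and function name here are illustrative assumptions, not part of the proposal:

```python
import copy

# Hypothetical in-cluster registry the ADES would push pulled images to.
LOCAL_REGISTRY = "registry.ades.local"

def localize_containers(cwl_doc: dict) -> dict:
    """Return a copy of a parsed CWL document with every DockerRequirement's
    dockerPull rewritten to point at the local registry (the 'update the CWL
    with the newly registered container' step)."""
    doc = copy.deepcopy(cwl_doc)

    def rewrite(node):
        if isinstance(node, dict):
            if node.get("class") == "DockerRequirement" and "dockerPull" in node:
                image = node["dockerPull"]
                # Keep only the trailing name:tag, prefix with the local registry.
                name = image.split("/")[-1]
                node["dockerPull"] = f"{LOCAL_REGISTRY}/{name}"
            for value in node.values():
                rewrite(value)
        elif isinstance(node, list):
            for value in node:
                rewrite(value)

    rewrite(doc)
    return doc
```

The actual image pull/push (docker pull, retag, push to ECR) would happen before this rewrite; this only covers the CWL mutation so the persisted copy references the localized image.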
Thanks Mike. In my view steps 6–8 are MVP. The reason is that it is very difficult to separate the tracking of jobs by algorithm if we don't split up the associated Airflow DAGs.
In an operational scenario I imagine the single-DAG approach would be a huge barrier to acceptance because it impacts anomaly resolution. Unless I'm missing something.
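One rough sketch of what steps 6–8 could look like: a small generator that renders a dedicated DAG module per deployed package, with one Airflow Param per CWL input so the UI exposes individual fields. Module paths, the runner image, and all names here are assumptions, not the actual Unity implementation:

```python
def render_dag(package_id: str, cwl_url: str, cwl_inputs: dict) -> str:
    """Render the source of a per-package Airflow DAG file.

    One Param is emitted per CWL input (name -> CWL type string) so the
    Airflow trigger UI shows an individual field for each input.
    """
    params = ",\n        ".join(
        f'"{name}": Param(None, type=["null", "string"], description="{typ}")'
        for name, typ in cwl_inputs.items()
    )
    return f'''\
"""Auto-generated DAG for OGC application package {package_id}."""
from airflow import DAG
from airflow.models.param import Param
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

with DAG(dag_id="cwl_{package_id}", params={{
        {params},
    }}) as dag:
    KubernetesPodOperator(
        task_id="run_cwl",
        name="cwl-runner-{package_id}",
        image="cwl-runner:latest",  # assumed shared CWL runner image
        arguments=["{cwl_url}", "{{{{ params }}}}"],
    )
'''
```

The deploy endpoint would write the rendered module into the DAGs folder (or a DAG bundle), and each algorithm then gets its own dag_id, so job tracking and anomaly resolution separate cleanly per package.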
App package deploy and docker localization