How to operate pipelines with Dagster deployed on Kubernetes? #3851
-
Hi @adammrozik, thanks for reaching out with these questions!

A) We do not currently support adding a user code deployment automatically without modifying values.yaml, but we have heard this request from other users and are considering it for a future release.

B) You can launch pipelines from other services over GraphQL without needing to interact with the Dagit UI -- example here: https://docs.dagster.io/examples/trigger_pipeline

Let me know if this works for you!
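For reference, a minimal sketch of what triggering a run from another Python service can look like, using the GraphQL client shipped with `dagster-graphql`. The hostname, port, and pipeline/repository names below are placeholders, and the exact client method names may differ across Dagster versions:

```python
# Sketch: launch a pipeline run against Dagit's GraphQL endpoint from another service.
# Hostname/port and all pipeline/repository names are placeholders.
from dagster_graphql import DagsterGraphQLClient, DagsterGraphQLClientError

client = DagsterGraphQLClient("dagit.dagster.svc.cluster.local", port_number=80)

try:
    run_id = client.submit_pipeline_execution(
        pipeline_name="my_pipeline",                          # placeholder
        repository_location_name="my_user_code_deployment",   # placeholder
        repository_name="my_repository",                      # placeholder
        run_config={},                                        # your run config here
        mode="default",
    )
    print(f"Launched run {run_id}")
except DagsterGraphQLClientError as exc:
    print(f"Launch failed: {exc}")
```

This goes through the same GraphQL API that Dagit serves, so no UI interaction is needed.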
-
Hi Catherine,

I would say we want to be adding/updating user code deployments around 50 times a day. For now I am just deploying a new Deployment and Service, adjusting the ConfigMap with workspace.yaml, and doing a rolling restart of Dagit. However, I am afraid that 50 restarts per day would cause issues like running jobs getting stopped, or at least their logs getting lost.

Do you think Dagster is the correct tool for our needs?
On Mon, 22 Mar 2021, 18:15, Catherine Wu ***@***.***> wrote:
… How frequently do you want to add new user code deployments? In the case where adding a new user code deployment is a rare event, we recommend adding the user code deployment to workspace.yaml and doing a re-deploy of the Dagit + Dagster components.
-
Hello, as a workaround with Docker Compose, I created a pipeline that checks for changes (on a schedule) and reloads the appropriate service: https://github.com/slamer59/dagster-central/blob/main/workspaces/update_git_repo/repo.py It's not really clean, but it works... It uses the docker library inside a Docker container to control the Docker daemon running all the services:
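A stripped-down sketch of that approach, assuming the host's Docker socket is mounted into the container running this code and using a placeholder service name (see the linked repo for the full version):

```python
# Sketch: restart a user-code service via the Docker SDK for Python so Dagster
# picks up a new image/repository. Container name is a placeholder.
import docker

def reload_user_code_service(container_name: str = "user_code_repo") -> None:
    # Talks to the Docker daemon through the socket mounted into this container.
    client = docker.from_env()
    container = client.containers.get(container_name)
    # Restart the gRPC user-code container; Dagit sees the new code once the
    # repository location is reloaded (or the connection is re-established).
    container.restart()
```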
-
Hey!
I am looking for ideas on operating multiple User Code Deployments with Dagster installed by Helm. I currently have 2 issues:

A) CI/CD for adding more user code deployments

I see that I can just add another section under `userDeployments` in `values.yaml` whenever I need another user deployment. However, I would like to do it more automatically with some form of CI/CD (e.g. whenever a new image is released, it creates a new User Code Deployment on K8s). Right now the only ways I see of doing this are templating the Helm chart and running `helm upgrade` every time a new image is released (a rough sketch of this is at the end of this post), or looking into the Helm templates and creating these objects myself every time. Or is there a simpler way to prepare my own User Code Deployments?

B) Running pipelines prepared by user deployments via API

I would like to prepare a service where a user can just call `run_pipeline(name="abc", params=...)` and it will automatically launch one of the pipelines that Dagit was able to gather from the user deployments, without needing to interact with the UI. Are there any methods out there that I could use? Ideally it would be something like executing a pipeline while giving a Dagit address and a pipeline name.

I would be grateful for any kind of help! My team thinks Dagster looks great; we are just trying to build a proof of concept of how to operate it in a production environment, and for that I need to prototype a fully working solution.
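For part A, one possible CI/CD sketch is to have the release pipeline patch values.yaml and run `helm upgrade`. This is not an official mechanism; the values key name (`userDeployments` vs. newer chart versions), the image entrypoint, and the Helm release/chart names below are assumptions that depend on your chart version and setup:

```python
# Hypothetical CI step: append a user code deployment entry to values.yaml
# and re-apply the Helm chart. Key names and paths are assumptions.
import subprocess
import yaml  # PyYAML, assumed available in the CI image

def add_user_deployment(values_path: str, name: str, repository: str, tag: str) -> None:
    with open(values_path) as f:
        values = yaml.safe_load(f)

    deployments = values.setdefault("userDeployments", {}).setdefault("deployments", [])
    deployments.append(
        {
            "name": name,
            "image": {"repository": repository, "tag": tag, "pullPolicy": "Always"},
            "dagsterApiGrpcArgs": ["-f", "repo.py"],  # assumed entrypoint inside the image
            "port": 3030,
        }
    )

    with open(values_path, "w") as f:
        yaml.safe_dump(values, f)

    # Re-render and apply the chart so the new gRPC Deployment/Service are created
    # and Dagit's workspace ConfigMap is updated. Release/chart names are placeholders.
    subprocess.run(
        ["helm", "upgrade", "dagster", "dagster/dagster", "-f", values_path],
        check=True,
    )
```

The trade-off is that values.yaml stays the single source of truth for user code deployments, at the cost of a Helm rollout per image release.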