Scaling identical pipelines, different parameters #10414
JSteinbauer asked this question in Q&A · Unanswered
Replies: 1 comment, 3 replies
-
Hi @JSteinbauer - the first way that I would think to accomplish this would be with partitions. Have you looked at those? https://docs.dagster.io/concepts/partitions-schedules-sensors/partitions
-
Hi there,
I'd like to know the best way (in your opinion) to apply Dagster to the following, probably very common, problem:
We are creating customer-specific ML models. That means that, for each customer, we want to
Is it (currently) even possible to combine all of these points at the same time?
What would be the preferred way of launching individual runs (e.g. as Docker containers, via sensors, via GraphQL requests, etc.)? I tried sending different configurations with GraphQL requests to a single Dagit instance, but asset caching didn't seem to work that way.
Is it possible to spawn several pipeline containers and use a single Dagit server to monitor all of them?
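For the GraphQL approach mentioned above, a run can be launched per customer by POSTing Dagster's `launchRun` mutation to Dagit's `/graphql` endpoint, with customer-specific run config in each request. The sketch below only builds the request body; the repository location and repository names are hypothetical placeholders, and the exact `ExecutionParams` fields are an assumption based on Dagster's GraphQL schema.

```python
import json

# launchRun mutation from Dagster's GraphQL API; the ExecutionParams shape
# used below is an assumption based on that schema, not verified here.
LAUNCH_RUN_MUTATION = """
mutation LaunchRun($executionParams: ExecutionParams!) {
  launchRun(executionParams: $executionParams) {
    __typename
    ... on LaunchRunSuccess { run { runId } }
    ... on PythonError { message }
  }
}
"""

def build_launch_run_request(job_name: str, run_config: dict,
                             location: str = "my_location",       # hypothetical
                             repository: str = "my_repository"):  # hypothetical
    """Build the JSON body to POST to Dagit's /graphql endpoint."""
    return {
        "query": LAUNCH_RUN_MUTATION,
        "variables": {
            "executionParams": {
                "selector": {
                    "repositoryLocationName": location,
                    "repositoryName": repository,
                    "jobName": job_name,
                },
                "runConfigData": run_config,
            }
        },
    }

# One request per customer, each carrying customer-specific run config:
body = build_launch_run_request(
    "train_customer_models",
    {"ops": {"train_model": {"config": {"customer": "customer_a"}}}},
)
print(json.dumps(body, indent=2))
```

Each such request would be sent with an HTTP client (e.g. `requests.post(dagit_url + "/graphql", json=body)`) to the single Dagit instance, which then tracks all customer runs in one UI.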
Thanks a lot for your reply,
Jakob