Job Concurrency - When running, block all others. How? #20061
-
All my jobs have a unique tag, limited with the following `tag_concurrency_limits` configuration:

    - key: single_run_per_job
      value:
        applyLimitPerUniqueValue: True
      limit: 1

What I would like to achieve is a job that, while it is running, prevents any other job from running. For example, I would like to have a job that performs optimize and vacuum on my tables, and I don't want any other job running at the same time.
-
This is very handy. I have a job that I would like to have stall all other jobs as well. Is there already a way to do this?
-
Hi @dinis-rodrigues, unfortunately this sort of concurrency limit is not supported in a convenient way at the moment -- all tag keys are treated equivalently, and so assuming there's only a single tag key (e.g. single_run_per_job), there's no way to have an instance of one specific job prevent all other jobs from running while still allowing any other combination of jobs to proceed in parallel.
Any solution here would need to have one distinct tag key per job, e.g. something like each job having `{f"{job_name}_concurrency_tag": True}`, and then the vacuum job would need to have all of those tags. This has obvious drawbacks, but is likely the only workaround at the moment.
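To make that workaround concrete, here is a minimal sketch under those assumptions. The job and op names are hypothetical, tag values are written as strings (rather than the boolean `True`) on the assumption that tag values are treated as strings, and the queued run coordinator would additionally need one `tag_concurrency_limits` entry per `*_concurrency_tag` key, each with `limit: 1`.

```python
from dagster import job, op


@op
def run_etl():
    ...


@op
def optimize_and_vacuum_tables():
    ...


# Each "normal" job carries its own distinct concurrency tag key.
@job(tags={"etl_job_concurrency_tag": "true"})
def etl_job():
    run_etl()


# The vacuum/optimize job carries the concurrency tag key of *every* other job,
# so with a limit of 1 on each key it cannot start while any other job is
# running, and no other job can start while it is running.
@job(
    tags={
        "etl_job_concurrency_tag": "true",
        # ...one entry here for every other job in the deployment...
    }
)
def vacuum_job():
    optimize_and_vacuum_tables()
```

The obvious drawback mentioned above is that every job added to the deployment has to be reflected in its own tags, in the vacuum job's tag dictionary, and in the coordinator's limits, so the lists can silently drift apart.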