Don't share caches between different jobs by default #22
Comments
I actually wonder if it would be best to fully remove cache fallbacks (if it's not possible to prune the cache). Needs some testing and consideration of the tradeoffs to decide, though.
Hmm, definitely needs some testing. Compiling without a cache takes a long time, but maybe it's worth the saved storage space?
For some anecdotal data: My jam game just had a failed CI run because the runner ran out of storage space. I checked, and my caches were ~3.2 GB each. I deleted the saved caches and re-ran the same CI workflow, and the new caches were ~400 MB, ~400 MB, and ~1.2 GB.
hello!
I'm not sure I understand. Could you please elaborate on what behavior you want? (Perhaps create a separate issue?)
Oh, my bad. After testing further to recreate the issue, it works now. I think clearing the cache fixed it 🤷
Consider this scenario: You have one job which runs `cargo test` and one which runs `cargo check`. These jobs will need separate compilations, as they run with different profiles.

Right now it can happen that they share a cache: Maybe `cargo check` runs first and then `cargo test` cannot find a cache, but _falls back_ to the cache generated by `cargo check`. This can bloat the cache size, as now the `target` folder is filled with both. This problem can accumulate, as the cache is never cleaned up.

A quick solution would be to remove the last cache key fallback, which does not depend on the job ID. Then, the `cargo check` and `cargo test` jobs would never share a cache.

This issue has been discovered via TheBevyFlock/bevy_new_2d#212