If my machine has more than two GPUs and I want Gerbil to run on only one of them, how do I set the parameters?
Gerbil automatically uses all CUDA-capable GPUs it detects on the system; it has no option to select specific GPUs.
However, you should be able to hide GPUs from Gerbil by setting the CUDA_VISIBLE_DEVICES environment variable. See: https://stackoverflow.com/questions/39649102/how-do-i-select-which-gpu-to-run-a-job-on
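For example, prefixing the command with the environment variable restricts which devices CUDA (and therefore Gerbil) can see. This is a minimal sketch; the Gerbil arguments shown are placeholders, not its actual option names:

```bash
# Expose only the first GPU (device 0) to CUDA; Gerbil will then
# detect and use just that one device.
CUDA_VISIBLE_DEVICES=0 ./gerbil <input.fastq> <temp-dir> <output>

# Expose only the second GPU instead.
CUDA_VISIBLE_DEVICES=1 ./gerbil <input.fastq> <temp-dir> <output>
```

Note that the device indices may not match the order reported by nvidia-smi unless you also set CUDA_DEVICE_ORDER=PCI_BUS_ID.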
Thank you for your advice; it was very helpful.