
Multi-GPU support #6

Open
valhassan opened this issue Aug 8, 2024 · 2 comments

Comments

@valhassan
Collaborator

Currently, geo-inference only supports inference on a single GPU. I want to add support for multiple GPUs to increase inference speed.
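For illustration only, a minimal sketch (not the geo-inference API) of one way multi-GPU inference could look: one model replica per visible GPU, with tiles dispatched round-robin. `load_model` and `tiles` are placeholder names, not existing functions or variables in this repository.

```python
import torch

def run_multi_gpu(load_model, tiles):
    """Sketch: replicate the model on each visible GPU and split tiles across them."""
    devices = [torch.device(f"cuda:{i}") for i in range(torch.cuda.device_count())]
    models = [load_model().to(d).eval() for d in devices]

    outputs = []
    with torch.no_grad():
        for i, tile in enumerate(tiles):
            k = i % len(devices)                       # round-robin assignment
            outputs.append(models[k](tile.to(devices[k])).cpu())
    return outputs
```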

@MarjanAsgari
Contributor

MarjanAsgari commented Aug 28, 2024

Multi-GPU issue

The current blocker for multi-GPU support is that the model never gets mapped to any other available device; it is always mapped to cuda:0. The model itself may be coded so that whenever CUDA is available it maps to cuda:0, so the model's device handling needs to change to support multi-GPU.
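Assuming the model is shipped as a TorchScript file (an assumption, not confirmed here), a hedged sketch of the suspected behaviour and a possible fix: `torch.jit.load` without `map_location` restores weights onto the device they were saved from (often cuda:0), whereas passing `map_location` pins them to the device we choose. `"model.pt"` is a placeholder path.

```python
import torch

device = torch.device("cuda:1")

# Default behaviour: weights are restored onto the device they were saved
# from (typically cuda:0), regardless of which GPU we intended to use.
# model = torch.jit.load("model.pt")

# Explicit mapping: weights are placed on the requested device instead.
model = torch.jit.load("model.pt", map_location=device)
print(next(model.parameters()).device)  # expected: cuda:1 on a multi-GPU node
```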

How to check:
1. Download https://github.com/MarjanAsgari/geo-inference-dask/blob/local_run/geo_inference/test_model_device.py to your machine.
2. Get an HPC node with multiple GPUs available.
3. Run test_model_device.py.

This script tries to map the model to cuda:1, but what you should see is:

[screenshot of the resulting device mapping]
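For reference, a rough sketch of the kind of check test_model_device.py presumably performs (an assumption about its contents, not a copy of it): request cuda:1 explicitly and report which devices the weights actually end up on. `"model.pt"` is a placeholder path.

```python
import torch

def report_devices(model):
    # Collect the set of devices that parameters and buffers actually live on.
    devices = {str(p.device) for p in model.parameters()}
    devices |= {str(b.device) for b in model.buffers()}
    print("parameters/buffers live on:", devices)

model = torch.jit.load("model.pt")   # placeholder path
model.to(torch.device("cuda:1"))     # request cuda:1 explicitly
report_devices(model)                # {"cuda:0"} here would reproduce the bug
```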

@valhassan @jfbourgon @mpelchat04

@jfbourgon
Collaborator

I created an issue related to this on the Model repository
