CUDA out of memory on 3090 #89

Open
FeiXie8 opened this issue Sep 18, 2022 · 4 comments

FeiXie8 commented Sep 18, 2022

When I export labels on COCO 2014 with the resolution set to 480*640, I get a CUDA out of memory error.
My GPU is an RTX 3090 and it has 24 GB of memory.

ghost commented Sep 30, 2022

Hi, @FeiXie8.
The batch size may be too large when you train the model.
Reducing it to 8 will probably work better.
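
A minimal sketch of what that change looks like in PyTorch: the batch size is the batch_size argument of DataLoader. train_set here is a placeholder; in this repo the real dataset and the actual value likely come from loader.py and the training config (training_params), so check there first.

import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset; in this repo the real dataset is built in loader.py.
train_set = TensorDataset(torch.zeros(32, 1, 480, 640))

# Lowering batch_size is the usual first fix for CUDA OOM during training.
train_loader = DataLoader(train_set, batch_size=8, shuffle=True)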

@20181313zhang

> When I export labels on COCO 2014 with the resolution set to 480*640, I get a CUDA out of memory error. My GPU is an RTX 3090 and it has 24 GB of memory.

Hello, have you solved this issue? I am running into the same problem.

@20181313zhang

> Hi, @FeiXie8. The batch size may be too large when you train the model. Reducing it to 8 will probably work better.

Where can I change the batch size?

@qingyunwudaoletu

This happens in export.py, so the batch size is not the cause.

Maybe you need to set num_workers = 0 and pin_memory = False:

# Leave pin_memory at its default (False) and run with num_workers=0,
# so no extra worker processes or pinned host buffers are allocated.
workers_test = training_params.get('workers_test', 0)  # was 16
test_loader = torch.utils.data.DataLoader(
    test_set, batch_size=1, shuffle=False,
    # pin_memory=True,
    num_workers=workers_test,
    # worker_init_fn=worker_init_fn,
)

This is in loader.py, at line 138.
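
For reference, a self-contained sketch of that memory-conservative loader setup; test_set here is a stand-in for the dataset that loader.py actually builds.

import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for the real test_set from loader.py.
test_set = TensorDataset(torch.zeros(8, 1, 480, 640))

# batch_size=1, no worker processes, and no pinned host memory keep the
# footprint during label export as small as possible.
test_loader = DataLoader(test_set, batch_size=1, shuffle=False,
                         num_workers=0, pin_memory=False)

for (batch,) in test_loader:
    pass  # run the export step on each sample here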
