
Training with two A100s only reaches HOTA 65 #66

Open
xun9167 opened this issue Mar 27, 2024 · 1 comment

Comments

@xun9167

xun9167 commented Mar 27, 2024

I trained on two 40 GB A100s, and the final HOTA metric was only 65. Could this be caused by using too few GPUs during training?

@HELLORPG

HELLORPG commented Apr 9, 2024

The original paper used a batch size of 8. With only two GPUs you most likely cannot reach the same batch size, and that will indeed affect the final results.
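A common way to approximate a larger effective batch size on fewer GPUs is gradient accumulation. The sketch below is a generic PyTorch illustration under that assumption, not this repository's actual training loop; the model, loss, and data are placeholders, and whether accumulation fully reproduces the paper's numbers depends on batch-dependent components in the real model.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and data purely for illustration; the repository's
# real training code is different.
model = nn.Linear(16, 2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loader = DataLoader(
    TensorDataset(torch.randn(32, 16), torch.randint(0, 2, (32,))),
    batch_size=2,  # stands in for "2 GPUs x 1 sample per GPU"
)

# With 2 GPUs and 1 sample per GPU, accumulating over 4 iterations gives an
# effective batch of 2 * 1 * 4 = 8, matching the batch size of 8 in the paper.
accumulation_steps = 4

optimizer.zero_grad()
for step, (samples, targets) in enumerate(loader):
    loss = criterion(model(samples), targets) / accumulation_steps  # scale so summed grads average
    loss.backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```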
