
ValueError: Attempting to unscale FP16 gradients when fully fine-tuning the LLM without LoRA? #200

Open
dragen1860 opened this issue Jul 3, 2024 · 2 comments

Comments

@dragen1860

Dear all:
When I disable LoRA and set the whole Mistral LLM's weights to requires_grad=True, I get the error:

ValueError: Attempting to unscale FP16 gradients

Can anyone give some tips?
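For context, this error is raised by PyTorch's GradScaler (torch.cuda.amp) during unscale_ when the trainable parameters, and therefore their gradients, are stored in FP16. With LoRA the adapter weights are typically kept in FP32, so the scaler never sees FP16 gradients; once every FP16 base weight becomes trainable, the check fires. Below is a minimal sketch of the usual workarounds, assuming the model is loaded with transformers and trained with fp16 mixed precision (e.g. Trainer(fp16=True)); the checkpoint name is only an illustration.

```python
import torch
from transformers import AutoModelForCausalLM

model_name = "mistralai/Mistral-7B-v0.1"  # illustrative checkpoint, adjust to your setup

# Option 1: load the base weights in float32 instead of torch.float16.
# Mixed precision still runs the forward pass in fp16 via autocast,
# but the master weights and gradients stay fp32, which GradScaler accepts.
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float32)

# Option 2: if the model was already loaded in fp16, upcast the trainable parameters.
for param in model.parameters():
    if param.requires_grad and param.dtype == torch.float16:
        param.data = param.data.float()

# Option 3: train in bf16 instead of fp16 (no GradScaler is involved),
# e.g. TrainingArguments(..., bf16=True) on hardware that supports bfloat16.
```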

@Qnancy

Qnancy commented Aug 20, 2024

I'm running into the same problem.

@Qnancy

Qnancy commented Aug 20, 2024

Have you solved it?
