How do you do LoRA on torchtune with LoRA applied only to the MLP? #2870
-
For Llama-3.1-8B LoRA, I want to do the equivalent of applying the adapters only to the MLP layers.
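Something like this, using torchtune's `lora_llama3_1_8b` builder (the rank/alpha values here are just placeholders):

```python
from torchtune.models.llama3_1 import lora_llama3_1_8b

# Intent: LoRA adapters on the MLP only, nothing on the attention projections.
model = lora_llama3_1_8b(
    lora_attn_modules=[],       # no attention projections get LoRA
    apply_lora_to_mlp=True,     # adapt the MLP (gate/up/down projections)
    apply_lora_to_output=False,
    lora_rank=8,                # placeholder rank
    lora_alpha=16,              # placeholder alpha
)
```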
but I get an error when I build the model this way.
Why is this? On HF PEFT, I'm able to do LoRA on just the MLP layers.
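For reference, the PEFT config that works for me looks roughly like this (module names follow the standard Llama layer naming; rank/alpha are placeholders):

```python
from peft import LoraConfig

# Target only the Llama MLP projections; attention modules are left alone.
peft_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
```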
Answered by Sinestro38 on Jul 17, 2025
-
@pbontrager perhaps you're aware? Thanks!
-
Okay, this has been merged following #2875.
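A rough way to sanity-check on a build that includes that change, using torchtune's `get_adapter_params` and `set_trainable_params` helpers (MLP-only settings as in the question above; values are placeholders):

```python
from torchtune.models.llama3_1 import lora_llama3_1_8b
from torchtune.modules.peft import get_adapter_params, set_trainable_params

model = lora_llama3_1_8b(
    lora_attn_modules=[],    # accepted once the #2875 change is in
    apply_lora_to_mlp=True,
)

# Freeze the base weights; leave only the LoRA adapters trainable.
set_trainable_params(model, get_adapter_params(model))

# Every trainable parameter should now sit under an MLP LoRA module.
for name, p in model.named_parameters():
    if p.requires_grad:
        print(name)
```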