
load ltx loras trained with finetrainers #6174

Open · wants to merge 3 commits into base: master

Conversation

@neph1 commented Dec 23, 2024

Hi
I've been playing around with finetuning LTX-Video using finetrainers, and by default the resulting loras won't load in ComfyUI. I noticed the naming of the keys was slightly different, with "transformer." prepended:

transformer.transformer_blocks.0.attn1.to_k.lora_A.weight

This PR makes them recognized by ComfyUI, but I have no idea whether it's a viable solution, whether it applies to loras other than these, or whether it should be fixed elsewhere.
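For anyone hitting the same problem, the mismatch is easy to confirm by inspecting the checkpoint's keys. A minimal sketch (the function name and the prefix check are assumptions based on the key shown above, not code from this PR):

```python
def detect_finetrainers_lora(state_dict):
    """Return True if every lora key carries the 'transformer.' prefix
    that finetrainers/diffusers-style training prepends, e.g.
    'transformer.transformer_blocks.0.attn1.to_k.lora_A.weight'
    instead of the 'transformer_blocks.0...' naming ComfyUI expects."""
    lora_keys = [k for k in state_dict if ".lora_A." in k or ".lora_B." in k]
    return bool(lora_keys) and all(k.startswith("transformer.") for k in lora_keys)
```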

@comfyanonymous (Owner) commented
This isn't the correct way of doing this.

The correct way is adding an entry like: https://github.com/comfyanonymous/ComfyUI/blob/master/comfy/lora.py#L384

The even more correct way is to convert the loras to the ComfyUI format, which is the format used by the lora save node.
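In outline, the approach comfyanonymous points at is a key map: for each of the model's weight keys, the loader registers the alternative spellings a lora file might use. The sketch below is a hypothetical illustration of that idea (names and structure are assumptions, not the actual comfy/lora.py code):

```python
def model_lora_key_map(model_keys):
    """Hypothetical sketch: map each alternative lora key spelling
    (here, the finetrainers/diffusers 'transformer.' prefix) back to
    the model's own weight key, so either spelling resolves."""
    key_map = {}
    for k in model_keys:
        if not k.endswith(".weight"):
            continue
        base = k[: -len(".weight")]
        key_map[base] = k                    # ComfyUI-native spelling
        key_map["transformer." + base] = k   # finetrainers/diffusers spelling
    return key_map
```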

neph1 marked this pull request as draft December 23, 2024 08:34
@neph1 (Author) commented Dec 23, 2024

Currently this doesn't work. I'll keep working on it, but I don't have time right now.

@neph1 (Author) commented Dec 25, 2024

This seems to do the trick. Although, again, I won't claim that this is the right way of doing it, or that all LTX loras look like this. Just close it if it's not admissible.

neph1 marked this pull request as ready for review December 25, 2024 17:10
@linesword commented

@neph1 I have the same problem. I want to use a lora trained by finetrainers with the 0.9.1 model, but it seems that isn't possible at present? How do you deal with it now?

@neph1 (Author) commented Dec 29, 2024

Do you mean you have trained with 0.9.1? They have only released the bf16 version of that, right? It's a whole different model, so any loras will need to be trained from scratch.

@linesword commented

> Do you mean you have trained with 0.9.1? They have only released the bf16 version of that, right? It's a whole different model, so any loras will need to be trained from scratch.

I trained it based on a-r-r-o-w/LTX-Video-diffusers, but I'm not sure whether this lora is suitable for the 0.9.1 model.

@neph1 (Author) commented Dec 29, 2024

I don't think so. They need to release the relevant transformers model, I believe.

@linesword commented

Hi, the lora can be loaded in diffusers, but not in ComfyUI. How do you run it in ComfyUI? I seem to find that even with the modified code added, the trained lora cannot run? @neph1

@neph1 (Author) commented Jan 2, 2025

I have noticed as well that it doesn't seem to work in later versions.
My 'stable' version of ComfyUI is at 4e14032, where loading loras works fine. But I downloaded a second copy to work on this PR, at 4b5bcd8, and there it doesn't seem to work very well.

@neph1 (Author) commented Jan 4, 2025

@linesword fwiw, I've uploaded my script that renames the keys so that the lora is recognized by ComfyUI, here: https://github.com/neph1/finetrainers-ui/tree/main/scripts
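The core of such a conversion is just a key rename; the rest is reading and writing the safetensors file. A hypothetical sketch of that shape (not neph1's actual script):

```python
def convert_keys(state_dict, prefix="transformer."):
    """Strip the finetrainers 'transformer.' prefix from every key,
    leaving keys without the prefix untouched, so the resulting lora
    uses the naming ComfyUI's loader recognizes."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# On a real checkpoint this would be wrapped with safetensors I/O, e.g.:
#   from safetensors.torch import load_file, save_file
#   save_file(convert_keys(load_file(src_path)), dst_path)
```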
