
Permit setting default prompt format and max tokens in models tab #50

Open
dagbdagb opened this issue Apr 24, 2024 · 0 comments

Comments


dagbdagb commented Apr 24, 2024

When loading a model, permit setting the default 'Prompt Format' and 'Max Tokens' for the model in question, so we don't have to set these for every new session. Would it be possible to auto-detect the best template, or pick it with some kind of heuristic?

I notice Llama 3 Instruct doesn't work well with the default 'Chat RP' prompt format. Every time. :-)
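One way the heuristic could work is to match the model's name or filename against known model families and fall back to the current default when nothing matches. A minimal sketch of that idea (all names here — `PROMPT_DEFAULTS`, `pick_defaults`, the format labels and token limits — are illustrative, not this project's actual API or data):

```python
# Hypothetical per-model defaults, matched by substring of the model name.
# The entries below are examples only; a real table would be maintained
# alongside the app or loaded from a user-editable config file.
PROMPT_DEFAULTS = [
    # (name substring, prompt format, max tokens)
    ("llama-3", "Llama 3 Instruct", 8192),
    ("mistral", "Mistral Instruct", 32768),
]

# Fall back to the current app default when the model isn't recognized.
FALLBACK = ("Chat RP", 4096)


def pick_defaults(model_name: str) -> tuple[str, int]:
    """Return (prompt_format, max_tokens) guessed from the model name."""
    name = model_name.lower()
    for needle, fmt, max_tokens in PROMPT_DEFAULTS:
        if needle in name:
            return fmt, max_tokens
    return FALLBACK


print(pick_defaults("Meta-Llama-3-8B-Instruct"))  # matches the llama-3 entry
print(pick_defaults("some-unknown-model"))        # falls back to Chat RP
```

Substring matching on the filename is crude but covers the common case; a user-set override per model (persisted between sessions) would still take priority over the guess.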
