When loading a model, permit setting the default 'Prompt Format' and 'Max Tokens' for the model in question, so we don't have to set these for every new session. Would it be possible to auto-detect the best template, or use some kind of heuristic to find it?
I notice Llama 3 Instruct doesn't work as well with the default 'Chat RP' prompt. Every time. :-)
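One possible heuristic, sketched below, is to match keywords in the model's file name against a table of known prompt formats and fall back to the current default otherwise. This is just an illustration: the format names ("Llama 3", "ChatML", "Alpaca", "Chat RP"), the keyword table, and the function name are all assumptions, not identifiers from the app itself.

```python
def guess_prompt_format(model_name: str, default: str = "Chat RP") -> str:
    """Guess a prompt format from a model file name (hypothetical sketch)."""
    # Keyword -> format table; entries here are illustrative assumptions.
    keywords = [
        ("llama-3", "Llama 3"),
        ("llama3", "Llama 3"),
        ("mistral", "Mistral"),
        ("hermes", "ChatML"),
        ("qwen", "ChatML"),
        ("alpaca", "Alpaca"),
    ]
    name = model_name.lower()
    for key, fmt in keywords:
        if key in name:
            return fmt
    # Nothing matched: keep the user-configured default.
    return default

print(guess_prompt_format("Meta-Llama-3-8B-Instruct.Q4_K_M.gguf"))  # Llama 3
print(guess_prompt_format("some-unknown-model.gguf"))               # Chat RP
```

A more robust variant could read the `chat_template` field embedded in GGUF/tokenizer metadata when present, and only fall back to name matching when it's missing.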