Feature request
It would be nice to be able to include multiple OpenAI API-compatible models, for example, one hosted by OpenAI and another hosted by Azure (GitHub Models). To my knowledge, it is only possible to configure a single OpenAI API-compatible model, though OPENAI.url can be set to non-OpenAI hosts (like Azure).
Why?
The Ollama (local) configuration permits multiple models. There are many diverse OpenAI API-compatible models that may provide different commit messages and users may want to consider those outputs.
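To illustrate the point about OPENAI.url, the sketch below shows how a single OpenAI-compatible client can already be pointed at a non-OpenAI host. This uses the OpenAI Node SDK directly rather than this project's code, and the endpoint, token variable, and model name are placeholders, not settings taken from this project:

```ts
import OpenAI from "openai";

// Placeholder diff; in the real tool this would come from `git diff`.
const diff = "diff --git a/src/index.ts b/src/index.ts\n...";

// One client, one base URL: e.g. GitHub Models' OpenAI-compatible endpoint.
const client = new OpenAI({
  baseURL: "https://models.inference.ai.azure.com", // assumption: any OpenAI API-compatible host
  apiKey: process.env.GITHUB_TOKEN ?? "",
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [
    { role: "system", content: "Write a concise git commit message for the following diff." },
    { role: "user", content: diff },
  ],
});

console.log(completion.choices[0].message.content);
```

The limitation described above is that only one such base URL can be configured at a time, so two OpenAI-compatible hosts cannot both contribute suggestions.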
Alternatives
No response
Additional context
No response
I understand your request. It would be good to support multiple models simultaneously by extending the OpenAI-based code.
I'll consider implementing support for multiple models using custom URL configurations (CUSTOM.url, CUSTOM2.url, etc.). Thank you for the feature request.
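As a rough sketch of what that could look like, the TypeScript below fans a diff out to several OpenAI API-compatible providers and collects one suggestion per provider. The config key names (CUSTOM.url, CUSTOM2.url) and the provider record shape are assumptions about a possible design, not the project's current API:

```ts
import OpenAI from "openai";

// Hypothetical per-provider settings derived from keys like CUSTOM.url, CUSTOM2.url, etc.
interface CustomProvider {
  name: string;  // e.g. "CUSTOM", "CUSTOM2"
  url: string;   // OpenAI API-compatible base URL
  key: string;   // API key or token for that host
  model: string; // model identifier understood by that host
}

// Ask every configured provider for a commit message and return all suggestions.
async function suggestCommitMessages(diff: string, providers: CustomProvider[]): Promise<string[]> {
  const requests = providers.map(async (p) => {
    const client = new OpenAI({ baseURL: p.url, apiKey: p.key });
    const res = await client.chat.completions.create({
      model: p.model,
      messages: [
        { role: "system", content: "Write a concise git commit message for the following diff." },
        { role: "user", content: diff },
      ],
    });
    return `${p.name}: ${res.choices[0].message.content ?? ""}`;
  });
  return Promise.all(requests);
}
```

Querying the providers in parallel keeps the latency close to that of the slowest host, which matches how the Ollama configuration already presents multiple model outputs for the user to choose from.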