[Feature]: Add a configurable backend endpoint that receives the gitDiff for LLM services that rely on module files (system prompt on the LLM end)
#386
Description
The feature adds the ability to communicate with a backend that does not accept system prompts, since it relies on module files (the system prompt lives on the LLM service's side). The extension would only send the gitDiff to a configurable endpoint.
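To make the idea concrete, here is a minimal sketch of the request/response contract such a backend could expose. The field names (`gitDiff`, `message`) are assumptions for illustration, not an agreed API:

```typescript
// Hypothetical contract for a backend that keeps its own system prompt in
// module files and therefore only needs the staged diff from the client.
interface GitDiffRequest {
  gitDiff: string; // output of `git diff --staged`, sent as-is
}

interface CommitMessageResponse {
  message: string; // commit message generated by the backend
}
```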
Suggested Solution
Add a class called llmservice that implements AiEngine and is responsible for communicating with such services, and add the corresponding configuration variables (at minimum the backend endpoint URL, plus optional credentials). A possible sketch is shown below.
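A minimal sketch of what that class could look like, assuming AiEngine exposes a `generateCommitMessage`-style method as the other engines do. The names `LlmService`, `LlmServiceConfig`, and `OCO_LLM_SERVICE_URL` are placeholders, not existing identifiers or options:

```typescript
// Sketch only: an engine that posts just the git diff to a configurable
// endpoint. No system prompt is sent, because the backend derives its
// instructions from module files on its own side.

interface LlmServiceConfig {
  endpoint: string;  // e.g. read from a new OCO_LLM_SERVICE_URL variable (hypothetical)
  apiKey?: string;   // optional bearer token for the backend
  timeoutMs?: number;
}

class LlmService /* implements AiEngine -- exact interface to be aligned with the codebase */ {
  constructor(private readonly config: LlmServiceConfig) {}

  async generateCommitMessage(diff: string): Promise<string> {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), this.config.timeoutMs ?? 30_000);
    try {
      const response = await fetch(this.config.endpoint, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          ...(this.config.apiKey ? { Authorization: `Bearer ${this.config.apiKey}` } : {}),
        },
        // Only the diff is transmitted; the backend supplies its own prompt.
        body: JSON.stringify({ gitDiff: diff }),
        signal: controller.signal,
      });
      if (!response.ok) {
        throw new Error(`LLM service responded with ${response.status}`);
      }
      const data = (await response.json()) as { message: string };
      return data.message;
    } finally {
      clearTimeout(timer);
    }
  }
}
```

The new configuration variables would then only need to select this engine and point it at the backend; everything prompt-related stays on the service side.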
Alternatives
No response
Additional Context
No response