This repository was archived by the owner on Feb 21, 2025. It is now read-only.

BYOK ollama selfhost #584

Open
ukrolelo opened this issue Feb 5, 2025 · 0 comments

Comments


ukrolelo commented Feb 5, 2025

chat_endpoint: "http://localhost:11434/v1/chat/completions"
chat_apikey: "ollama"
chat_model: "qwen2.5-coder:7b"

and also tried with

chat_model: "deepseek-r1:8b-llama-distill-q8_0"

Error: Bad Request
Click to retry
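
For what it's worth, here is a quick sanity check (a sketch, assuming stock Ollama on localhost:11434): it lists the locally pulled model tags via Ollama's /api/tags endpoint, to confirm that the tags in the config match exactly, since a mistyped tag is one plausible cause of a 400.

# Sketch: verify the configured model tags exist in the local Ollama.
# Assumes stock Ollama listening on localhost:11434.
import json
import urllib.request

CONFIGURED = ["qwen2.5-coder:7b", "deepseek-r1:8b-llama-distill-q8_0"]

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    pulled = {m["name"] for m in json.load(resp)["models"]}

for tag in CONFIGURED:
    print(tag, "ok" if tag in pulled else "MISSING -> would explain a 400")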

P.S. I tested Ollama from the console and it works:

curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5-coder:7b",
    "messages": [{"role": "user", "content": "Hello, how are you?"}]
  }'
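
The same request can also be replayed from Python for both configured models, to pin down which one the 400 comes from and what the response body says (a sketch, assuming the same OpenAI-compatible endpoint as the curl above):

# Sketch: replay the chat request for both models, print status + body.
import json
import urllib.error
import urllib.request

URL = "http://localhost:11434/v1/chat/completions"

for model in ["qwen2.5-coder:7b", "deepseek-r1:8b-llama-distill-q8_0"]:
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "Hello, how are you?"}],
    }).encode()
    req = urllib.request.Request(
        URL, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req) as resp:
            reply = json.load(resp)["choices"][0]["message"]["content"]
            print(model, resp.status, reply[:60])
    except urllib.error.HTTPError as e:
        print(model, e.code, e.read().decode())  # a 400 body usually names the problem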
