Tool calling via Ollama #2212
Closed
snscaimito started this conversation in General
Replies: 2 comments · 1 reply
-
I experimented some more and so far found that llama3.3 doesn't "like" to call functions/tools, while gpt-4o does it flawlessly. Any suggestions for a local model that is good at calling functions/tools?
-
Some Ollama models are configured incorrectly when it comes to tool calling. In particular, …
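One way to check whether a locally pulled model is even set up for tool calling is to inspect what Ollama knows about it. This is a hedged sketch: the `ollama show` subcommand and its `--template` flag exist in current Ollama releases, and recent versions print a Capabilities section that lists `tools` for models whose chat template supports function calling.

```shell
# Print the model's metadata; recent Ollama releases include a
# "Capabilities" section that lists "tools" when the model's chat
# template supports function calling.
ollama show llama3.2

# Print the raw chat template itself; if it contains no tool-call
# handling, the model cannot emit structured tool calls even when
# the client sends tool definitions.
ollama show llama3.2 --template
```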
-
Hi!
I've been using OpenAI's models through the Spring AI Starter for OpenAI. Now I want to explore the use of local models.
My understanding is that I can use Ollama as an interface to different models. I've tried DeepSeek but learned that it does not support tools. I've tried `llama3.2` and now `mistral`, like so:

[configuration elided]

With this code:

[code elided]

I'm getting back:

[output elided]

This is not what I expected. I also see in the logs that it tries to do a lot with JSON. My tools don't return JSON in particular, but plain strings produced via `toString()`. That lazy approach worked with OpenAI's models.

Some pointers in the right direction would be helpful. Thanks!
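For reference, pointing the Spring AI Ollama starter at a model trained for tool use is mostly a matter of configuration. A minimal sketch, assuming the starter's `spring.ai.ollama.*` property namespace; the model name `llama3.1` is just an example of a model whose Ollama build advertises tool support — substitute whichever tool-capable model you have pulled:

```properties
# Where the local Ollama daemon listens (Ollama's default port).
spring.ai.ollama.base-url=http://localhost:11434

# Model used for chat completions; it must be one whose chat template
# supports tool calls, otherwise tool definitions sent by the client
# are ignored or answered with malformed output.
spring.ai.ollama.chat.options.model=llama3.1
```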