Ollama Chat Completion not invoking plugin functions #9547
-
Hi, I have been investigating Ollama chat completion and am struggling to get plugin functions to execute when using it. Everything works as expected when using AzureOpenAIChatCompletion. Is invocation of plugin functions supported with OllamaChatCompletion? Thanks, Euan
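(For context, here is a minimal sketch of the kind of setup in question. This is not the attached project; the plugin, model name, and endpoint are illustrative assumptions, and the Ollama registration assumes the Microsoft.SemanticKernel.Connectors.Ollama package.)

```csharp
// Hedged repro sketch: a kernel with one plugin and automatic function choice.
// All names and values here are illustrative assumptions, not the attached project.
using System;
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

public class LightsPlugin
{
    [KernelFunction, Description("Turns the lights on or off.")]
    public string SetLights(bool on) => on ? "Lights are now on." : "Lights are now off.";
}

public static class Program
{
    public static async Task Main()
    {
        var builder = Kernel.CreateBuilder();

        // The registration under test (assumed model/endpoint; requires the
        // Microsoft.SemanticKernel.Connectors.Ollama package):
        builder.AddOllamaChatCompletion("llama3.1", new Uri("http://localhost:11434"));
        builder.Plugins.AddFromType<LightsPlugin>();
        var kernel = builder.Build();

        // Ask the model to do something that should trigger the plugin.
        var settings = new PromptExecutionSettings
        {
            FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(),
        };
        var result = await kernel.InvokePromptAsync("Please turn the lights on.", new(settings));
        Console.WriteLine(result);
    }
}
```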
-
Example project I am testing with:
-
Thanks for reaching out, we are currently working on this. I'm tagging @RogerBarreto @RogerBarret0 to elaborate.
-
Hi, I am having this issue too. Is there any release date for this enhancement?
-
@RogerBarreto I tried with the latest packages (version 1.30) and function calling did not work for me. I am getting responses from Ollama, but functions are not executed.
-
@RogerBarreto Tested with v1.32 and the plugin is still not getting invoked. Do you know which version the PR above is likely to be included in? The attached project shows an example of the issue: with AddAzureOpenAIChatCompletion the plugin is invoked; with AddOllamaChatCompletion it is not. The only change between the two runs is the service registration, as in the sketch below.
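(A sketch of that one-line swap, with placeholder deployment/endpoint/key values; these are assumptions, not values from the attached project.)

```csharp
// Works: plugin is invoked (placeholder deployment/endpoint/key).
builder.AddAzureOpenAIChatCompletion("my-deployment", "https://my-resource.openai.azure.com", "my-api-key");

// Does not invoke the plugin as of v1.32 (placeholder model/endpoint):
// builder.AddOllamaChatCompletion("llama3.1", new Uri("http://localhost:11434"));
```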
-
I am not able to make it work either.
I cannot hit a breakpoint in the plugin.
-
Example of Ollama function calling: https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/Demos/OllamaFunctionCalling/Program.cs. First, you need to add a system prompt to your model that enables function calling ("tools"): for the Qwen 2.5 family see this url, for Llama 3.1 see this example prompt for an Ollama model, and so on. For other model families (Phi-3, etc.) you need to find other prompts that enable function calling. You can add your own model, with your own system prompt that supports function calling, to Ollama; see url.
If the model does not include a function-calling prompt (see "Modelfile"), the Ollama server prints the message "registry.ollama.ai MYMODEL does not support tools". Weak models usually cannot call functions even with this prompt, because they do not reason well; see this url. (Sorry for my English; this is a Google translation.)
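(As a hedged illustration of the Modelfile approach described above: the base model and prompt wording below are made up, and whether a SYSTEM prompt alone is enough depends on the model's template.)

```
# Illustrative Modelfile: derive a custom model and give it a system prompt
# describing tool use. FROM and SYSTEM are standard Modelfile directives.
FROM llama3.1

SYSTEM """
You can call external tools. When a tool is needed, respond with a JSON
object containing the function name and its arguments.
"""
```

Register it with `ollama create my-tools-model -f Modelfile` (model name is a placeholder) and then use that model name from the connector.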
-
@eeadie Currently our dedicated Ollama connector doesn't support function calling.
We have the PR below adding that support to the Ollama connector; stay tuned, it will potentially be available in our next release.