Add print to assistant_chat.rb example #868

Open · wants to merge 2 commits into main

Conversation

mattlindsey
Contributor

Gives the user the ability to 'print' the assistant's messages, which I found useful for better understanding how the Assistant works and for testing a tool.
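
In essence it adds a 'print' branch to the example's input loop, roughly like this (a sketch only; the actual variable names in assistant_chat.rb may differ, and I'm assuming the assistant exposes its history via assistant.messages with role/content accessors):

  if input == "print"
    # Dump the conversation so far to see how the Assistant accumulates messages
    assistant.messages.each do |message|
      puts "#{message.role}: #{message.content}"
    end
    next
  end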

Note that Ollama gives me a 404 error even though there appears to be support for it, and it works from another application.
Shouldn't this work?

  # Use local Ollama. See https://ollama.com/search?c=tools for models that support tools.
  llm = Langchain::LLM::Ollama.new(default_options: {chat_model: "mistral"})
Welcome to your Meteorological assistant!

(multiline input; type 'end' on its own line when done. or 'print' to print messages, or 'exit' to exit)

Query: Boston?
Query: end
D, [2024-11-10T06:27:11.902641 #34124] DEBUG -- [Langchain.rb]: Langchain::Assistant - Sending a call to Langchain::LLM::Ollama
W, [2024-11-10T06:27:11.902826 #34124]  WARN -- [Langchain.rb]: WARNING: `parallel_tool_calls:` is not supported by Ollama currently
W, [2024-11-10T06:27:11.902856 #34124]  WARN -- [Langchain.rb]: WARNING: `tool_choice:` is not supported by Ollama currently
I, [2024-11-10T06:27:11.904720 #34124]  INFO -- [request]: POST http://localhost:11434/api/chat
I, [2024-11-10T06:27:11.904778 #34124]  INFO -- [request]: User-Agent: "Faraday v2.12.0"
Content-Type: "application/json"
I, [2024-11-10T06:27:11.904804 #34124]  INFO -- [request]: {"messages":[{"role":"system","content":"You are a Meteorologist Assistant that is able to report the weather for any city in metric units."},{"role":"user","content":"Boston?"}],"model":"mistral","stream":false,"tools":[{"type":"function","function":{"name":"langchain_tool_weather__get_current_weather","description":"Returns current weather for a city","parameters":{"type":"object","properties":{"city":{"type":"string","description":"City name"},"state_code":{"type":"string","description":"State code"},"country_code":{"type":"string","description":"Country code"},"units":{"type":"string","description":"Units for temperature (imperial or metric). Default: \"imperial\"","enum":["imperial","metric","standard"]}},"required":["city","state_code"]}}}],"temperature":0.0}
I, [2024-11-10T06:27:11.909847 #34124]  INFO -- [response]: Status 404
I, [2024-11-10T06:27:11.909904 #34124]  INFO -- [response]: content-type: "application/json; charset=utf-8"
date: "Sun, 10 Nov 2024 11:27:11 GMT"
content-length: "61"
I, [2024-11-10T06:27:11.909932 #34124]  INFO -- [response]:
/Users/mattlindsey/.rvm/gems/ruby-3.2.1/gems/faraday-2.12.0/lib/faraday/response/raise_error.rb:30:in `on_complete': the server responded with status 404 (Faraday::ResourceNotFound)
	from /Users/mattlindsey/.rvm/gems/ruby-3.2.1/gems/faraday-2.12.0/lib/faraday/middleware.rb:57:in `block in call'

I do see the request hitting Ollama in its log:

[GIN] 2024/11/10 - 06:34:41 | 404 |     1.26275ms |       127.0.0.1 | POST     "/api/chat"

@mattlindsey
Contributor Author

Never mind the Ollama 404 error. I guess that's what you get if you haven't downloaded the model.

This worked:

llm = Langchain::LLM::Ollama.new(default_options: {chat_model: "llama3.1:8b"})
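
(If I understand Ollama's behavior correctly, /api/chat returns 404 when the requested model isn't present locally, so running `ollama pull mistral` first should make the original mistral config work as well.)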

@andreibondarev andreibondarev requested a review from Copilot April 17, 2025 18:21

@Copilot Copilot AI left a comment

Copilot reviewed 1 out of 1 changed files in this pull request and generated no comments.
