
Commit 30e2eb6

Merge branch 'main' into CustomParamsSchema
2 parents: 9e03cd8 + d4d3ed6

178 files changed: +35569 −12328 lines


.github/workflows/cicd.yml (6 additions, 2 deletions)

```diff
@@ -50,6 +50,8 @@ jobs:
 
       - name: Run tests
         run: bundle exec rspec
+        env:
+          OLLAMA_API_BASE: http://localhost:11434/v1 # dummy
 
       - name: Upload coverage to Codecov
         uses: codecov/codecov-action@v5
@@ -118,8 +120,8 @@ jobs:
       - name: Test with real APIs before publishing
         if: steps.check_version.outputs.version_changed == 'true'
         run: |
-          echo "Removing all VCR cassettes to test against real APIs..."
-          rm -rf spec/fixtures/vcr_cassettes
+          echo "Removing VCR cassettes except Ollama ones..."
+          find spec/fixtures/vcr_cassettes -type f ! -name '*ollama*' -delete
 
           echo "Running tests with real API calls..."
           env -u CI bundle exec rspec --fail-fast
@@ -128,6 +130,8 @@ jobs:
           ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
           GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
           DEEPSEEK_API_KEY: ${{ secrets.DEEPSEEK_API_KEY }}
+          OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
+          OLLAMA_API_BASE: http://localhost:11434/v1 # dummy
           AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
           AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```
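The CI change above swaps a blanket `rm -rf` of the cassette directory for a selective `find ... -delete`, so Ollama cassettes survive the pre-publish run. A quick sketch of the filter (hypothetical cassette names in a throwaway directory; `! -name '*ollama*'` negates the match, so only non-Ollama files are deleted):

```shell
# Recreate a tiny cassette directory with hypothetical names
rm -rf /tmp/vcr_demo && mkdir -p /tmp/vcr_demo/vcr_cassettes
touch /tmp/vcr_demo/vcr_cassettes/chat_openai.yml
touch /tmp/vcr_demo/vcr_cassettes/chat_ollama_basic.yml

# Same filter as the workflow: delete every regular file NOT matching '*ollama*'
find /tmp/vcr_demo/vcr_cassettes -type f ! -name '*ollama*' -delete

ls /tmp/vcr_demo/vcr_cassettes  # only chat_ollama_basic.yml remains
```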

.overcommit.yml (5 additions, 0 deletions)

```diff
@@ -18,6 +18,11 @@ PreCommit:
     exclude:
       - '**/db/structure.sql' # Ignore trailing whitespace in generated files
 
+  RakeTarget:
+    enabled: true
+    targets: ['models:update', 'models:docs']
+    on_warn: fail
+
 PostCheckout:
   ALL: # Special hook name that customizes all hooks of this type
     quiet: true # Change all post-checkout hooks to only display output on failure
```

README.md (80 additions, 133 deletions)

````diff
@@ -1,44 +1,44 @@
 <img src="/docs/assets/images/logotype.svg" alt="RubyLLM" height="120" width="250">
 
-A delightful Ruby way to work with AI. No configuration madness, no complex callbacks, no handler hell – just beautiful, expressive Ruby code.
-
-<div style="display: flex; align-items: center; flex-wrap: wrap; margin-bottom: 1em">
-  <img src="https://upload.wikimedia.org/wikipedia/commons/4/4d/OpenAI_Logo.svg" alt="OpenAI" height="40" width="120">
-  &nbsp;&nbsp;
-  <img src="https://upload.wikimedia.org/wikipedia/commons/7/78/Anthropic_logo.svg" alt="Anthropic" height="40" width="120">
-  &nbsp;&nbsp;
-  <img src="https://upload.wikimedia.org/wikipedia/commons/8/8a/Google_Gemini_logo.svg" alt="Google" height="40" width="120">
-  &nbsp;&nbsp;
-  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/bedrock-color.svg" alt="Bedrock" height="40">
-  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/bedrock-text.svg" alt="Bedrock" height="40" width="120">
-  &nbsp;&nbsp;
-  <img src="https://upload.wikimedia.org/wikipedia/commons/e/ec/DeepSeek_logo.svg" alt="DeepSeek" height="40" width="120">
+**A delightful Ruby way to work with AI.** RubyLLM provides **one** beautiful, Ruby-like interface to interact with modern AI models. Chat, generate images, create embeddings, and use tools – all with clean, expressive code that feels like Ruby, not like patching together multiple services.
+
+<div class="provider-icons">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/anthropic-text.svg" alt="Anthropic" class="logo-small">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/bedrock-color.svg" alt="Bedrock" class="logo-medium">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/bedrock-text.svg" alt="Bedrock" class="logo-small">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/deepseek-color.svg" alt="DeepSeek" class="logo-medium">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/deepseek-text.svg" alt="DeepSeek" class="logo-small">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/gemini-brand-color.svg" alt="Gemini" class="logo-large">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/ollama.svg" alt="Ollama" class="logo-medium">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/ollama-text.svg" alt="Ollama" class="logo-medium">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openai.svg" alt="OpenAI" class="logo-medium">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openai-text.svg" alt="OpenAI" class="logo-medium">
+  &nbsp;
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter.svg" alt="OpenRouter" class="logo-medium">
+  <img src="https://registry.npmmirror.com/@lobehub/icons-static-svg/latest/files/icons/openrouter-text.svg" alt="OpenRouter" class="logo-small">
+  &nbsp;
 </div>
 
-<a href="https://badge.fury.io/rb/ruby_llm"><img src="https://badge.fury.io/rb/ruby_llm.svg" alt="Gem Version" /></a>
-<a href="https://github.com/testdouble/standard"><img src="https://img.shields.io/badge/code_style-standard-brightgreen.svg" alt="Ruby Style Guide" /></a>
-<a href="https://rubygems.org/gems/ruby_llm"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm"></a>
-<a href="https://codecov.io/gh/crmne/ruby_llm"><img src="https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg" alt="codecov" /></a>
+<div class="badge-container">
+  <a href="https://badge.fury.io/rb/ruby_llm"><img src="https://badge.fury.io/rb/ruby_llm.svg" alt="Gem Version" /></a>
+  <a href="https://github.com/testdouble/standard"><img src="https://img.shields.io/badge/code_style-standard-brightgreen.svg" alt="Ruby Style Guide" /></a>
+  <a href="https://rubygems.org/gems/ruby_llm"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm"></a>
+  <a href="https://codecov.io/gh/crmne/ruby_llm"><img src="https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg" alt="codecov" /></a>
+</div>
 
-🤺 Battle tested at [💬 Chat with Work](https://chatwithwork.com)
+🤺 Battle tested at [💬 Chat with Work](https://chatwithwork.com)
 
 ## The problem with AI libraries
 
 Every AI provider comes with its own client library, its own response format, its own conventions for streaming, and its own way of handling errors. Want to use multiple providers? Prepare to juggle incompatible APIs and bloated dependencies.
 
 RubyLLM fixes all that. One beautiful API for everything. One consistent format. Minimal dependencies — just Faraday and Zeitwerk. Because working with AI should be a joy, not a chore.
 
-## Features
-
-- 💬 **Chat** with OpenAI, Anthropic, Gemini, AWS Bedrock Anthropic, and DeepSeek models
-- 👁️ **Vision and Audio** understanding
-- 📄 **PDF Analysis** for analyzing documents
-- 🖼️ **Image generation** with DALL-E and other providers
-- 📊 **Embeddings** for vector search and semantic analysis
-- 🔧 **Tools** that let AI use your Ruby code
-- 🚂 **Rails integration** to persist chats and messages with ActiveRecord
-- 🌊 **Streaming** responses with proper Ruby patterns
-
 ## What makes it great
 
 ```ruby
@@ -85,142 +85,89 @@ end
 chat.with_tool(Weather).ask "What's the weather in Berlin? (52.5200, 13.4050)"
 ```
 
+## Core Capabilities
+
+* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
+* 👁️ **Vision:** Analyze images within chats.
+* 🔊 **Audio:** Transcribe and understand audio content.
+* 📄 **PDF Analysis:** Extract information and summarize PDF documents.
+* 🖼️ **Image Generation:** Create images with `RubyLLM.paint`.
+* 📊 **Embeddings:** Generate text embeddings for vector search with `RubyLLM.embed`.
+* 🔧 **Tools (Function Calling):** Let AI models call your Ruby code using `RubyLLM::Tool`.
+* 🚂 **Rails Integration:** Easily persist chats, messages, and tool calls using `acts_as_chat` and `acts_as_message`.
+* 🌊 **Streaming:** Process responses in real-time with idiomatic Ruby blocks.
+
 ## Installation
 
+Add to your Gemfile:
 ```ruby
-# In your Gemfile
 gem 'ruby_llm'
-
-# Then run
-bundle install
-
-# Or install it yourself
-gem install ruby_llm
 ```
+Then `bundle install`.
 
-Configure with your API keys:
-
+Configure your API keys (using environment variables is recommended):
 ```ruby
+# config/initializers/ruby_llm.rb or similar
 RubyLLM.configure do |config|
   config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
-  config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-  config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
-  config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
-
-  # Bedrock
-  config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
-  config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
-  config.bedrock_region = ENV.fetch('AWS_REGION', nil)
-  config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+  # Add keys ONLY for providers you intend to use
+  # config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
+  # ... see Configuration guide for all options ...
 end
 ```
+See the [Installation Guide](https://rubyllm.com/installation) for full details.
 
-## Have great conversations
-
-```ruby
-# Start a chat with the default model (gpt-4.1-nano)
-chat = RubyLLM.chat
-
-# Or specify what you want
-chat = RubyLLM.chat(model: 'claude-3-7-sonnet-20250219')
-
-# Simple questions just work
-chat.ask "What's the difference between attr_reader and attr_accessor?"
-
-# Multi-turn conversations are seamless
-chat.ask "Could you give me an example?"
-
-# Stream responses in real-time
-chat.ask "Tell me a story about a Ruby programmer" do |chunk|
-  print chunk.content
-end
-
-# Set personality or behavior with instructions (aka system prompts)
-chat.with_instructions "You are a friendly Ruby expert who loves to help beginners"
-
-# Understand content in multiple forms
-chat.ask "Compare these diagrams", with: { image: ["diagram1.png", "diagram2.png"] }
-chat.ask "Summarize this document", with: { pdf: "contract.pdf" }
-chat.ask "What's being said?", with: { audio: "meeting.wav" }
-
-# Need a different model mid-conversation? No problem
-chat.with_model('gemini-2.0-flash').ask "What's your favorite algorithm?"
-```
+## Rails Integration
 
-## Rails integration that makes sense
+Add persistence to your chat models effortlessly:
 
 ```ruby
 # app/models/chat.rb
 class Chat < ApplicationRecord
-  acts_as_chat
-
-  # Works great with Turbo
-  broadcasts_to ->(chat) { "chat_#{chat.id}" }
+  acts_as_chat # Automatically saves messages & tool calls
+  # ... your other model logic ...
 end
 
 # app/models/message.rb
 class Message < ApplicationRecord
   acts_as_message
+  # ...
 end
 
-# app/models/tool_call.rb
+# app/models/tool_call.rb (if using tools)
 class ToolCall < ApplicationRecord
   acts_as_tool_call
+  # ...
 end
 
-# In a background job
-chat = Chat.create! model_id: "gpt-4.1-nano"
-
-# Set personality or behavior with instructions (aka system prompts) - they're persisted too!
-chat.with_instructions "You are a friendly Ruby expert who loves to help beginners"
-
-chat.ask("What's your favorite Ruby gem?") do |chunk|
-  Turbo::StreamsChannel.broadcast_append_to(
-    chat,
-    target: "response",
-    partial: "messages/chunk",
-    locals: { chunk: chunk }
-  )
-end
-
-# That's it - chat history is automatically saved
+# Now interacting with a Chat record persists the conversation:
+chat_record = Chat.create!(model_id: "gpt-4.1-nano")
+chat_record.ask("Explain Active Record callbacks.") # User & Assistant messages saved
 ```
-
-## Creating tools is a breeze
-
-```ruby
-class Search < RubyLLM::Tool
-  description "Searches a knowledge base"
-
-  param :query, desc: "The search query"
-  param :limit, type: :integer, desc: "Max results", required: false
-
-  def execute(query:, limit: 5)
-    # Your search logic here
-    Document.search(query).limit(limit).map(&:title)
-  end
-end
-
-# Let the AI use it
-chat.with_tool(Search).ask "Find documents about Ruby 3.3 features"
-```
-
-## Learn more
-
-Check out the guides at https://rubyllm.com for deeper dives into conversations with tools, streaming responses, embedding generations, and more.
+Check the [Rails Integration Guide](https://rubyllm.com/guides/rails) for more.
+
+## Learn More
+
+Dive deeper with the official documentation:
+
+- [Installation](https://rubyllm.com/installation)
+- [Configuration](https://rubyllm.com/configuration)
+- **Guides:**
+  - [Getting Started](https://rubyllm.com/guides/getting-started)
+  - [Chatting with AI Models](https://rubyllm.com/guides/chat)
+  - [Using Tools](https://rubyllm.com/guides/tools)
+  - [Streaming Responses](https://rubyllm.com/guides/streaming)
+  - [Rails Integration](https://rubyllm.com/guides/rails)
+  - [Image Generation](https://rubyllm.com/guides/image-generation)
+  - [Embeddings](https://rubyllm.com/guides/embeddings)
+  - [Working with Models](https://rubyllm.com/guides/models)
+  - [Error Handling](https://rubyllm.com/guides/error-handling)
+  - [Available Models](https://rubyllm.com/guides/available-models)
 
 ## Contributing
 
-We welcome contributions to RubyLLM!
-
-See [CONTRIBUTING.md](CONTRIBUTING.md) for detailed instructions on how to:
-- Run the test suite
-- Add new features
-- Update documentation
-- Re-record VCR cassettes when needed
-
-We appreciate your help making RubyLLM better!
+We welcome contributions! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for details on setup, testing, and contribution guidelines.
 
 ## License
 
-Released under the MIT License.
+Released under the MIT License.
````

bin/console (2 additions, 0 deletions)

```diff
@@ -12,6 +12,8 @@ RubyLLM.configure do |config|
   config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
   config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
   config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+  config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
+  config.ollama_api_base = ENV.fetch('OLLAMA_API_BASE', nil)
   config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
   config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
   config.bedrock_region = ENV.fetch('AWS_REGION', nil)
```
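The console config reads every key with `ENV.fetch(name, nil)` rather than `ENV[name]`. The two behave the same when the variable is unset, but `fetch` makes the fallback explicit, and forgetting the default raises `KeyError` instead of silently returning nil. A small sketch (uses plain `ENV`, no gem required):

```ruby
ENV.delete('OLLAMA_API_BASE') # ensure the variable is unset for the demo

# With an explicit default, a missing variable yields nil instead of raising
value = ENV.fetch('OLLAMA_API_BASE', nil)
puts value.inspect # => nil

# Without a default, fetch raises KeyError on a missing variable
begin
  ENV.fetch('OLLAMA_API_BASE')
rescue KeyError
  puts 'raised KeyError'
end
```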

docs/_sass/custom/custom.scss (39 additions, 0 deletions; new file)

```diff
@@ -0,0 +1,39 @@
+.logo-container {
+  display: flex;
+  align-items: center;
+  flex-wrap: wrap;
+  gap: 1em;
+}
+
+.provider-icons {
+  display: flex;
+  align-items: center;
+  flex-wrap: wrap;
+  gap: 1em;
+  margin-bottom: 1em;
+}
+
+.provider-logo {
+  display: flex;
+  align-items: center;
+  gap: 0.2em;
+}
+
+.logo-small {
+  height: 20px;
+}
+
+.logo-medium {
+  height: 30px;
+}
+
+.logo-large {
+  height: 50px;
+}
+
+.badge-container {
+  display: flex;
+  align-items: center;
+  flex-wrap: wrap;
+  gap: 0.2em;
+}
```
