Commit a4ddf87

Add documentation structure and configuration for RubyLLM; include installation and guides
1 parent 971857d

18 files changed, +2510 −1 lines changed

.github/workflows/docs.yml

+53
@@ -0,0 +1,53 @@
name: Deploy docs

on:
  push:
    branches: [main]
    paths:
      - 'docs/**'
      - '.github/workflows/docs.yml'
  workflow_dispatch:

permissions:
  contents: read
  pages: write
  id-token: write

concurrency:
  group: "pages"
  cancel-in-progress: false

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.3'
          bundler-cache: true
          working-directory: docs
      - name: Setup Pages
        id: pages  # the build step below reads steps.pages.outputs.base_path from this step
        uses: actions/configure-pages@v4
      - name: Build with Jekyll
        working-directory: docs
        run: bundle exec jekyll build --baseurl "${{ steps.pages.outputs.base_path }}"
        env:
          JEKYLL_ENV: production
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: docs/_site

  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4

.rubocop.yml

+3 −1
@@ -2,4 +2,6 @@ require:
   - rubocop-rake

 AllCops:
-  TargetRubyVersion: 3.1
+  TargetRubyVersion: 3.1
+  Exclude:
+    - docs/**/*

.yardopts

+12
@@ -0,0 +1,12 @@
--markup markdown
--markup-provider redcarpet
--title "RubyLLM API Documentation"
--protected
--private
--embed-mixins
--output-dir doc/yard
--readme README.md
lib/**/*.rb
-
LICENSE

docs/.gitignore

+7
@@ -0,0 +1,7 @@
_site/
.sass-cache/
.jekyll-cache/
.jekyll-metadata
# Ignore folders generated by Bundler
.bundle/
vendor/

docs/Gemfile

+11
@@ -0,0 +1,11 @@
source 'https://rubygems.org'

gem 'jekyll', '~> 4.3'
gem 'just-the-docs', '~> 0.7.0'
gem 'webrick', '~> 1.8'

# GitHub Pages plugins
group :jekyll_plugins do
  gem 'jekyll-remote-theme'
  gem 'jekyll-seo-tag'
end

docs/_config.yml

+43
@@ -0,0 +1,43 @@
title: RubyLLM
description: A delightful Ruby way to work with AI
url: https://crmne.github.io/ruby_llm
baseurl: /ruby_llm
remote_theme: just-the-docs/just-the-docs

# Enable search
search_enabled: true
search:
  heading_level: 2
  previews: 3
  preview_words_before: 5
  preview_words_after: 10
  tokenizer_separator: /[\s/]+/
  rel_url: true
  button: false

# Navigation structure
nav_external_links:
  - title: RubyLLM on GitHub
    url: https://github.com/crmne/ruby_llm
    hide_icon: false

# Footer content
footer_content: "Copyright &copy; 2025 <a href='https://paolino.me'>Carmine Paolino</a>. Distributed under an <a href=\"https://github.com/crmne/ruby_llm/tree/main/LICENSE\">MIT license.</a>"

# Enable copy button on code blocks
enable_copy_code_button: true

# Make anchor links show on hover
heading_anchors: true

# Color scheme
color_scheme: light

# Google Analytics
ga_tracking:
ga_tracking_anonymize_ip: true

# Custom plugins (GitHub Pages allows these)
plugins:
  - jekyll-remote-theme
  - jekyll-seo-tag

docs/_data/navigation.yml

+25
@@ -0,0 +1,25 @@
- title: Home
  url: /
- title: Installation
  url: /installation
- title: Guides
  url: /guides/
  subfolderitems:
    - title: Getting Started
      url: /guides/getting-started
    - title: Chat
      url: /guides/chat
    - title: Tools
      url: /guides/tools
    - title: Streaming
      url: /guides/streaming
    - title: Rails Integration
      url: /guides/rails
    - title: Image Generation
      url: /guides/image-generation
    - title: Embeddings
      url: /guides/embeddings
    - title: Error Handling
      url: /guides/error-handling
- title: GitHub
  url: https://github.com/crmne/ruby_llm

docs/guides/chat.md

+206
@@ -0,0 +1,206 @@
---
layout: default
title: Chat
parent: Guides
nav_order: 2
permalink: /guides/chat
---

# Chatting with AI Models

RubyLLM's chat interface provides a natural way to interact with various AI models. This guide covers everything from basic chatting to advanced features like multimodal inputs and streaming responses.

## Basic Chat

Creating a chat and asking questions is straightforward:

```ruby
# Create a chat with the default model
chat = RubyLLM.chat

# Ask a question
response = chat.ask "What's the best way to learn Ruby?"

# The response is a Message object
puts response.content
puts "Role: #{response.role}"
puts "Model: #{response.model_id}"
puts "Tokens: #{response.input_tokens} input, #{response.output_tokens} output"
```
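Before the first `ask`, the provider API keys need to be configured, typically once in an initializer; the Getting Started guide covers this in detail. A minimal sketch, assuming the standard `RubyLLM.configure` block with per-provider key accessors and keys stored in environment variables:

```ruby
require 'ruby_llm'

# Set only the keys for the providers you actually call.
RubyLLM.configure do |config|
  config.openai_api_key    = ENV.fetch('OPENAI_API_KEY', nil)
  config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
end
```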
## Choosing Models

You can specify which model to use when creating a chat:

```ruby
# Create a chat with a specific model
chat = RubyLLM.chat(model: 'gpt-4o-mini')

# Use Claude instead
claude_chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')

# Or change the model for an existing chat
chat.with_model('gemini-2.0-flash')
```
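If you're unsure which model IDs are valid, the model registry (used later in this guide via `RubyLLM.models.find`) can also be browsed. A small sketch, assuming the registry exposes an enumerable `all` collection whose entries respond to `id` and `provider`:

```ruby
# Assumption: RubyLLM.models.all enumerates the registry's model info objects
RubyLLM.models.all.each do |model|
  puts "#{model.provider}: #{model.id}"
end

# Pick one of the printed IDs when creating a chat
chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')
```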
## Multi-turn Conversations

Chats maintain conversation history automatically:

```ruby
chat = RubyLLM.chat

# Start a conversation
chat.ask "What's your favorite programming language?"

# Follow up
chat.ask "Why do you like that language?"

# Continue the conversation
chat.ask "What are its weaknesses?"

# Access the conversation history
chat.messages.each do |message|
  puts "#{message.role}: #{message.content[0..50]}..."
end
```

## Working with Images

Vision-capable models can understand images:

```ruby
chat = RubyLLM.chat

# Ask about an image (local file)
chat.ask "What's in this image?", with: { image: "path/to/image.jpg" }

# Or use an image URL
chat.ask "Describe this picture", with: { image: "https://example.com/image.jpg" }

# Include multiple images
chat.ask "Compare these two charts", with: {
  image: ["chart1.png", "chart2.png"]
}

# Combine text and image
chat.ask "Is this the Ruby logo?", with: { image: "logo.png" }
```

## Working with Audio

Models with audio capabilities can process spoken content:

```ruby
chat = RubyLLM.chat(model: 'gpt-4o-audio-preview')

# Analyze audio content
chat.ask "What's being said in this recording?", with: {
  audio: "meeting.wav"
}

# Ask follow-up questions about the audio
chat.ask "Summarize the key points mentioned"
```

## Streaming Responses

For a more interactive experience, you can stream responses as they're generated:

```ruby
chat = RubyLLM.chat

# Stream the response with a block
chat.ask "Tell me a story about a Ruby programmer" do |chunk|
  # Each chunk is a partial response
  print chunk.content
  $stdout.flush # Ensure output is displayed immediately
end

# Useful for long responses or real-time displays
chat.ask "Write a detailed essay about programming paradigms" do |chunk|
  add_to_ui(chunk.content) # Your method to update UI
end
```
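If you also need the complete text once streaming finishes, one option is to accumulate the chunks yourself. A small sketch, assuming each chunk's `content` is a plain string fragment (and may occasionally be empty):

```ruby
chat = RubyLLM.chat

full_text = +''
chat.ask "Explain Ruby blocks in two short paragraphs" do |chunk|
  print chunk.content             # show the chunk as it arrives
  full_text << chunk.content.to_s # and keep a running copy
end

puts "\n\nReceived #{full_text.length} characters in total"
```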
## Temperature Control

Control the creativity and randomness of AI responses:

```ruby
# Higher temperature (more creative)
creative_chat = RubyLLM.chat.with_temperature(0.9)
creative_chat.ask "Write a poem about Ruby programming"

# Lower temperature (more deterministic)
precise_chat = RubyLLM.chat.with_temperature(0.1)
precise_chat.ask "Explain how Ruby's garbage collector works"
```

## Tracking Token Usage

RubyLLM automatically tracks token usage for billing and quota management:

```ruby
chat = RubyLLM.chat
response = chat.ask "Explain quantum computing"

# Check token usage
puts "Input tokens: #{response.input_tokens}"
puts "Output tokens: #{response.output_tokens}"
puts "Total tokens: #{response.input_tokens + response.output_tokens}"

# Estimate cost (varies by model)
model = RubyLLM.models.find(response.model_id)
input_cost = response.input_tokens * model.input_price_per_million / 1_000_000
output_cost = response.output_tokens * model.output_price_per_million / 1_000_000
puts "Estimated cost: $#{(input_cost + output_cost).round(6)}"
```
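To total usage across a whole conversation rather than a single response, you can walk the message history shown earlier. Assistant messages carry the token counts used above; `to_i` treats messages without counts (such as your own prompts) as zero:

```ruby
chat = RubyLLM.chat
chat.ask "What is a Ruby symbol?"
chat.ask "When would I use one instead of a string?"

# Sum token counts across every message in the history
total_tokens = chat.messages.sum do |message|
  message.input_tokens.to_i + message.output_tokens.to_i
end

puts "Conversation so far: #{total_tokens} tokens"
```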
## Registering Event Handlers

You can register callbacks for chat events:

```ruby
chat = RubyLLM.chat

# Called when a new assistant message starts
chat.on_new_message do
  puts "Assistant is typing..."
end

# Called when a message is complete
chat.on_end_message do |message|
  puts "Response complete!"
  puts "Used #{message.input_tokens + message.output_tokens} tokens"
end

# These callbacks work with both streaming and non-streaming responses
chat.ask "Tell me about Ruby's history"
```
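Because the same callbacks fire around streamed responses, you can combine them with a streaming block, for example to show a typing indicator before the first chunk arrives and a summary after the last one:

```ruby
chat = RubyLLM.chat

chat.on_new_message { puts "Assistant is typing..." }
chat.on_end_message { |message| puts "\nDone (#{message.output_tokens} output tokens)" }

# on_new_message fires before the first chunk, on_end_message after the final one
chat.ask "Give me three Ruby one-liners" do |chunk|
  print chunk.content
end
```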
## Multiple Parallel Chats

You can maintain multiple separate chat instances:

```ruby
# Create multiple chat instances
ruby_chat = RubyLLM.chat
python_chat = RubyLLM.chat

# Each has its own conversation history
ruby_chat.ask "What's great about Ruby?"
python_chat.ask "What's great about Python?"

# Continue separate conversations
ruby_chat.ask "How does Ruby handle metaprogramming?"
python_chat.ask "How does Python handle decorators?"
```
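If you want separate conversations to actually run concurrently rather than just coexist, plain Ruby threads are one option since each chat keeps its own state. A sketch, assuming the underlying HTTP requests are safe to issue from separate chat instances:

```ruby
ruby_chat   = RubyLLM.chat
python_chat = RubyLLM.chat

# Issue both requests at the same time; each chat keeps its own history
threads = {
  ruby:   Thread.new { ruby_chat.ask "Summarize Ruby's object model in one sentence" },
  python: Thread.new { python_chat.ask "Summarize Python's object model in one sentence" }
}

threads.each do |name, thread|
  puts "#{name}: #{thread.value.content}"
end
```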
## Next Steps

Now that you understand chat basics, you might want to explore:

- [Using Tools]({% link guides/tools.md %}) to let AI use your Ruby code
- [Streaming Responses]({% link guides/streaming.md %}) for real-time interactions
- [Rails Integration]({% link guides/rails.md %}) to persist conversations in your apps
