Commit 0b4ae98

Merge branch 'crmne:main' into main
2 parents: 9f0215c + d2f0604

352 files changed (+15417 additions, −35046 deletions)


.overcommit.yml

Lines changed: 0 additions & 5 deletions
@@ -18,11 +18,6 @@ PreCommit:
     exclude:
       - '**/db/structure.sql' # Ignore trailing whitespace in generated files

-  RakeTarget:
-    enabled: true
-    command: ['bundle', 'exec', 'rake']
-    targets: ['models:update', 'models:docs', 'aliases:generate']
-    on_warn: fail

   AppraisalUpdate:
     enabled: true
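
With the `RakeTarget` pre-commit hook removed, the `models:update`, `models:docs`, and `aliases:generate` rake tasks no longer run automatically on commit; presumably they are now invoked manually or in CI, e.g. `bundle exec rake models:update models:docs aliases:generate` (an assumption — the commit itself doesn't show where they moved).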

Gemfile

Lines changed: 1 addition & 0 deletions
@@ -12,6 +12,7 @@ group :development do # rubocop:disable Metrics/BlockLength
   gem 'dotenv'
   gem 'ferrum'
   gem 'flay'
+  gem 'image_processing', '~> 1.2'
   gem 'irb'
   gem 'json-schema'
   gem 'nokogiri'
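
The only change here is the new `image_processing` development dependency. The diff doesn't show how the gem is used, so the following is a hypothetical sketch of its API (file paths are made up), included only to illustrate what the dependency provides:

```ruby
# Hypothetical usage of the image_processing gem added above (not code from this repo).
require "image_processing/vips"

# Shrink an image to fit within 512x512 and convert it to JPEG.
processed = ImageProcessing::Vips
  .source("spec/fixtures/ruby_conf.png") # made-up input path
  .resize_to_limit(512, 512)
  .convert("jpg")
  .call

puts processed.path # => path of a Tempfile holding the processed image
```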

README.md

Lines changed: 73 additions & 91 deletions
@@ -1,99 +1,114 @@
+<div align="center">
+
 <picture>
 <source media="(prefers-color-scheme: dark)" srcset="/docs/assets/images/logotype_dark.svg">
 <img src="/docs/assets/images/logotype.svg" alt="RubyLLM" height="120" width="250">
 </picture>

-**One *beautiful* Ruby API for GPT, Claude, Gemini, and more.** Easily build chatbots, AI agents, RAG applications, and content generators. Features chat (text, images, audio, PDFs), image generation, embeddings, tools (function calling), structured output, Rails integration, and streaming. Works with OpenAI, Anthropic, Google Gemini, AWS Bedrock, DeepSeek, Mistral, Ollama (local models), OpenRouter, Perplexity, GPUStack, and any OpenAI-compatible API.
+<strong>One *beautiful* Ruby API for GPT, Claude, Gemini, and more.</strong>
+
+Battle tested at [<picture><source media="(prefers-color-scheme: dark)" srcset="https://chatwithwork.com/logotype-dark.svg"><img src="https://chatwithwork.com/logotype.svg" alt="Chat with Work" height="30" align="absmiddle"></picture>](https://chatwithwork.com)*Claude Code for your documents*
+
+[![Gem Version](https://badge.fury.io/rb/ruby_llm.svg?a=6)](https://badge.fury.io/rb/ruby_llm)
+[![Ruby Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://github.com/testdouble/standard)
+[![Gem Downloads](https://img.shields.io/gem/dt/ruby_llm)](https://rubygems.org/gems/ruby_llm)
+[![codecov](https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg?a=1)](https://codecov.io/gh/crmne/ruby_llm)

-<div class="badge-container">
-<a href="https://badge.fury.io/rb/ruby_llm"><img src="https://badge.fury.io/rb/ruby_llm.svg?a=5" alt="Gem Version" /></a>
-<a href="https://github.com/testdouble/standard"><img src="https://img.shields.io/badge/code_style-standard-brightgreen.svg" alt="Ruby Style Guide" /></a>
-<a href="https://rubygems.org/gems/ruby_llm"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm"></a>
-<a href="https://codecov.io/gh/crmne/ruby_llm"><img src="https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg" alt="codecov" /></a>
+<a href="https://trendshift.io/repositories/13640" target="_blank"><img src="https://trendshift.io/api/badge/repositories/13640" alt="crmne%2Fruby_llm | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
 </div>

-Battle tested at [<picture><source media="(prefers-color-scheme: dark)" srcset="https://chatwithwork.com/logotype-dark.svg"><img src="https://chatwithwork.com/logotype.svg" alt="Chat with Work" height="30" align="absmiddle"></picture>](https://chatwithwork.com)*Claude Code for your documents*
+---
+
+Build chatbots, AI agents, RAG applications. Works with OpenAI, Anthropic, Google, AWS, local models, and any OpenAI-compatible API.

-## The problem with AI libraries
+## Why RubyLLM?

-Every AI provider comes with its own client library, its own response format, its own conventions for streaming, and its own way of handling errors. Want to use multiple providers? Prepare to juggle incompatible APIs and bloated dependencies.
+Every AI provider ships their own bloated client. Different APIs. Different response formats. Different conventions. It's exhausting.

-RubyLLM fixes all that. One beautiful API for everything. One consistent format. Minimal dependencies — just Faraday, Zeitwerk, and Marcel. Because working with AI should be a joy, not a chore.
+RubyLLM gives you one beautiful API for all of them. Same interface whether you're using GPT, Claude, or your local Ollama. Just three dependencies: Faraday, Zeitwerk, and Marcel. That's it.

-## What makes it great
+## Show me the code

 ```ruby
 # Just ask questions
 chat = RubyLLM.chat
 chat.ask "What's the best way to learn Ruby?"
+```

-# Analyze images, audio, documents, and text files
+```ruby
+# Analyze any file type
 chat.ask "What's in this image?", with: "ruby_conf.jpg"
 chat.ask "Describe this meeting", with: "meeting.wav"
 chat.ask "Summarize this document", with: "contract.pdf"
 chat.ask "Explain this code", with: "app.rb"
+```

-# Multiple files at once - types automatically detected
+```ruby
+# Multiple files at once
 chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]
+```

-# Stream responses in real-time
-chat.ask "Tell me a story about a Ruby programmer" do |chunk|
+```ruby
+# Stream responses
+chat.ask "Tell me a story about Ruby" do |chunk|
   print chunk.content
 end
+```

+```ruby
 # Generate images
 RubyLLM.paint "a sunset over mountains in watercolor style"
+```

-# Create vector embeddings
+```ruby
+# Create embeddings
 RubyLLM.embed "Ruby is elegant and expressive"
+```

+```ruby
 # Let AI use your code
 class Weather < RubyLLM::Tool
-  description "Gets current weather for a location"
-  param :latitude, desc: "Latitude (e.g., 52.5200)"
-  param :longitude, desc: "Longitude (e.g., 13.4050)"
+  description "Get current weather"
+  param :latitude
+  param :longitude

   def execute(latitude:, longitude:)
     url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
-
-    response = Faraday.get(url)
-    data = JSON.parse(response.body)
-  rescue => e
-    { error: e.message }
+    JSON.parse(Faraday.get(url).body)
   end
 end

-chat.with_tool(Weather).ask "What's the weather in Berlin? (52.5200, 13.4050)"
+chat.with_tool(Weather).ask "What's the weather in Berlin?"
+```

-# Get structured output with JSON schemas
+```ruby
+# Get structured output
 class ProductSchema < RubyLLM::Schema
-  string :name, description: "Product name"
-  number :price, description: "Price in USD"
-  array :features, description: "Key features" do
-    string description: "Feature description"
+  string :name
+  number :price
+  array :features do
+    string
   end
 end

-response = chat.with_schema(ProductSchema)
-  .ask "Analyze this product description", with: "product.txt"
-# response.content => { "name" => "...", "price" => 99.99, "features" => [...] }
+response = chat.with_schema(ProductSchema).ask "Analyze this product", with: "product.txt"
 ```

-## Core Capabilities
-
-* 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Perplexity, Mistral, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
-* 👁️ **Vision:** Analyze images within chats.
-* 🔊 **Audio:** Transcribe and understand audio content.
-* 📄 **Document Analysis:** Extract information from PDFs, text files, CSV, JSON, XML, Markdown, and code files.
-* 🖼️ **Image Generation:** Create images with `RubyLLM.paint`.
-* 📊 **Embeddings:** Generate text embeddings for vector search with `RubyLLM.embed`.
-* 🔧 **Tools (Function Calling):** Let AI models call your Ruby code using `RubyLLM::Tool`.
-* 📋 **Structured Output:** Guarantee responses conform to JSON schemas with `RubyLLM::Schema`.
-* 🚂 **Rails Integration:** Easily persist chats, messages, and tool calls using `acts_as_chat` and `acts_as_message`.
-* 🌊 **Streaming:** Process responses in real-time with idiomatic Ruby blocks.
-* **Async Support:** Built-in fiber-based concurrency for high-performance operations.
-* 🎯 **Smart Configuration:** Global and scoped configs with automatic retries and proxy support.
-* 📚 **Model Registry:** Access 500+ models with capability detection and pricing info.
+## Features
+
+* **Chat:** Conversational AI with `RubyLLM.chat`
+* **Vision:** Analyze images and screenshots
+* **Audio:** Transcribe and understand speech
+* **Documents:** Extract from PDFs, CSVs, JSON, any file type
+* **Image generation:** Create images with `RubyLLM.paint`
+* **Embeddings:** Vector search with `RubyLLM.embed`
+* **Tools:** Let AI call your Ruby methods
+* **Structured output:** JSON schemas that just work
+* **Streaming:** Real-time responses with blocks
+* **Rails:** ActiveRecord integration with `acts_as_chat`
+* **Async:** Fiber-based concurrency
+* **Model registry:** 500+ models with capability detection and pricing
+* **Providers:** OpenAI, Anthropic, Gemini, Bedrock, DeepSeek, Mistral, Ollama, OpenRouter, Perplexity, GPUStack, and any OpenAI-compatible API

 ## Installation

@@ -103,69 +118,36 @@ gem 'ruby_llm'
 ```
 Then `bundle install`.

-Configure your API keys (using environment variables is recommended):
+Configure your API keys:
 ```ruby
-# config/initializers/ruby_llm.rb or similar
+# config/initializers/ruby_llm.rb
 RubyLLM.configure do |config|
-  config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
-  # Add keys ONLY for providers you intend to use
-  # config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-  # ... see Configuration guide for all options ...
+  config.openai_api_key = ENV['OPENAI_API_KEY']
 end
 ```
-See the [Installation Guide](https://rubyllm.com/installation) for full details.

-## Rails Integration
-
-Add persistence to your chat models effortlessly:
+## Rails

 ```bash
-# Generate models and migrations
 rails generate ruby_llm:install
 ```

 ```ruby
-# Or add to existing models
 class Chat < ApplicationRecord
-  acts_as_chat # Automatically saves messages & tool calls
-end
-
-class Message < ApplicationRecord
-  acts_as_message
+  acts_as_chat
 end

-class ToolCall < ApplicationRecord
-  acts_as_tool_call
-end
-
-# Now chats persist automatically
-chat = Chat.create!(model_id: "gpt-4.1-nano")
-chat.ask("What's in this file?", with: "report.pdf")
+chat = Chat.create! model_id: "claude-sonnet-4"
+chat.ask "What's in this file?", with: "report.pdf"
 ```

-See the [Rails Integration Guide](https://rubyllm.com/guides/rails) for details.
-
-## Learn More
-
-Dive deeper with the official documentation:
+## Documentation

-- [Installation](https://rubyllm.com/installation)
-- [Configuration](https://rubyllm.com/configuration)
-- **Guides:**
-  - [Getting Started](https://rubyllm.com/guides/getting-started)
-  - [Chatting with AI Models](https://rubyllm.com/guides/chat)
-  - [Using Tools](https://rubyllm.com/guides/tools)
-  - [Streaming Responses](https://rubyllm.com/guides/streaming)
-  - [Rails Integration](https://rubyllm.com/guides/rails)
-  - [Image Generation](https://rubyllm.com/guides/image-generation)
-  - [Embeddings](https://rubyllm.com/guides/embeddings)
-  - [Working with Models](https://rubyllm.com/guides/models)
-  - [Error Handling](https://rubyllm.com/guides/error-handling)
-  - [Available Models](https://rubyllm.com/guides/available-models)
+[rubyllm.com](https://rubyllm.com)

 ## Contributing

-We welcome contributions! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for details on setup, testing, and contribution guidelines.
+See [CONTRIBUTING.md](CONTRIBUTING.md).

 ## License

bin/console

Lines changed: 9 additions & 8 deletions
@@ -8,20 +8,21 @@ require 'dotenv/load'
 require 'irb'

 RubyLLM.configure do |config|
-  config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
-  config.openai_api_base = ENV.fetch('OPENAI_API_BASE', nil)
   config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-  config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
-  config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
-  config.perplexity_api_key = ENV.fetch('PERPLEXITY_API_KEY', nil)
-  config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
-  config.ollama_api_base = ENV.fetch('OLLAMA_API_BASE', nil)
   config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
-  config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
   config.bedrock_region = ENV.fetch('AWS_REGION', nil)
+  config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
   config.bedrock_session_token = ENV.fetch('AWS_SESSION_TOKEN', nil)
+  config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
+  config.gemini_api_key = ENV.fetch('GEMINI_API_KEY', nil)
   config.gpustack_api_base = ENV.fetch('GPUSTACK_API_BASE', nil)
   config.gpustack_api_key = ENV.fetch('GPUSTACK_API_KEY', nil)
+  config.mistral_api_key = ENV.fetch('MISTRAL_API_KEY', nil)
+  config.ollama_api_base = ENV.fetch('OLLAMA_API_BASE', nil)
+  config.openai_api_base = ENV.fetch('OPENAI_API_BASE', nil)
+  config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
+  config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
+  config.perplexity_api_key = ENV.fetch('PERPLEXITY_API_KEY', nil)
 end

 IRB.start(__FILE__)
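
This change only adds `mistral_api_key` and sorts the provider settings alphabetically. Once the relevant environment variables are set, `bin/console` drops into IRB with RubyLLM configured; a quick smoke test there might look like the following (model name and prompt are illustrative, not part of this commit):

```ruby
# Inside the IRB session started by bin/console (illustrative only).
chat = RubyLLM.chat(model: 'claude-sonnet-4') # any configured provider/model works
chat.ask "Say hello in one short sentence" do |chunk|
  print chunk.content # stream the reply as it arrives
end
```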

docs/_getting_started/getting-started.md

Lines changed: 1 addition & 0 deletions
@@ -5,6 +5,7 @@ nav_order: 1
 description: Start building AI apps in Ruby in 5 minutes. Chat, generate images, create embeddings - all with one gem.
 redirect_from:
 - /guides/getting-started
+- /installation
 ---

 # {{ page.title }}
