Installation and Setup
Install the RubyLLM gem by adding it to your Gemfile:
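A minimal Gemfile entry might look like the following (no version constraint is required; pin one if your project prefers it):

```ruby
# Gemfile
gem "ruby_llm"
```

Then run `bundle install` to install the gem.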
Configure RubyLLM for Heroku AI
RubyLLM supports OpenAI-compatible endpoints through its OpenAI provider. Configure it to point to Heroku AI:
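A configuration sketch is below. It assumes the Heroku AI add-on exposes its credentials through the `INFERENCE_KEY` and `INFERENCE_URL` config vars; check your app's actual config var names with `heroku config`.

```ruby
require "ruby_llm"

RubyLLM.configure do |config|
  # Heroku AI credentials, read from the add-on's config vars (assumed names)
  config.openai_api_key  = ENV["INFERENCE_KEY"]
  # Point RubyLLM's OpenAI provider at the Heroku AI endpoint
  config.openai_api_base = ENV["INFERENCE_URL"]
end
```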
Basic Chat Completion

Use RubyLLM’s chat interface with Heroku AI models:
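A basic call might look like this, assuming RubyLLM has already been configured with your Heroku AI credentials. The `provider: :openai` and `assume_model_exists: true` options route the request through RubyLLM's OpenAI-compatible provider and skip validation against its built-in model registry:

```ruby
require "ruby_llm"

chat = RubyLLM.chat(
  model: "claude-4-5-haiku",
  provider: :openai,
  assume_model_exists: true
)

response = chat.ask("Explain Ruby blocks in one paragraph.")
puts response.content
```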
Streaming Responses

Stream responses for faster perceived latency:
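Passing a block to `ask` streams the response; each chunk arrives as the model generates it. A sketch, reusing the chat setup from above:

```ruby
require "ruby_llm"

chat = RubyLLM.chat(
  model: "claude-4-5-haiku",
  provider: :openai,
  assume_model_exists: true
)

# Print each chunk as soon as it arrives instead of waiting
# for the full response
chat.ask("Write a haiku about deployment.") do |chunk|
  print chunk.content
end
```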
Multi-turn Conversations

RubyLLM maintains conversation context automatically:
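Because the chat object carries its own message history, successive `ask` calls on the same object form one conversation. A sketch:

```ruby
require "ruby_llm"

chat = RubyLLM.chat(
  model: "claude-4-5-haiku",
  provider: :openai,
  assume_model_exists: true
)

chat.ask("My favorite language is Ruby.")

# The model can answer this because the previous exchange
# is still in the chat's history
follow_up = chat.ask("What did I just say my favorite language was?")
puts follow_up.content
```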
Using with Rails

Configuration
Create an initializer at config/initializers/ruby_llm.rb:
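The initializer mirrors the configuration shown earlier; the `INFERENCE_KEY` and `INFERENCE_URL` config var names are assumptions based on the Heroku AI add-on:

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key  = ENV["INFERENCE_KEY"]
  config.openai_api_base = ENV["INFERENCE_URL"]
end
```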
Service Object Example
Create a service to encapsulate AI interactions:
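One possible shape for such a service is sketched below; the class name, file path, and default model are illustrative choices, not conventions from RubyLLM itself. In a Rails app the gem is loaded by Bundler, so no explicit require is needed:

```ruby
# app/services/ai_assistant_service.rb (hypothetical name and path)
class AiAssistantService
  DEFAULT_MODEL = "claude-4-5-haiku"

  def initialize(model: DEFAULT_MODEL)
    # Lazily built so the service object is cheap to create in tests
    @model = model
  end

  # Send a single question and return the model's text answer
  def ask(question)
    chat.ask(question).content
  end

  private

  def chat
    @chat ||= RubyLLM.chat(
      model: @model,
      provider: :openai,
      assume_model_exists: true
    )
  end
end
```

Keeping the RubyLLM call behind one small class makes it easy to stub in tests and to swap models in one place.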
Controller Usage
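A controller calling that service might look like the sketch below; the controller name, route, and parameter name are hypothetical:

```ruby
# app/controllers/questions_controller.rb (hypothetical)
class QuestionsController < ApplicationController
  def create
    # Delegate the AI call to the service object
    answer = AiAssistantService.new.ask(params.require(:question))
    render json: { answer: answer }
  end
end
```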
Structured Output

Use RubyLLM’s structured output feature for predictable responses:
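Recent RubyLLM versions accept a JSON schema via `with_schema`, which constrains the model to return data matching that shape. A sketch (the schema and prompt are illustrative; the API call is guarded so it only runs when credentials are present, and `ruby_llm` is assumed to be loaded via Bundler):

```ruby
# JSON schema describing the shape we want back
schema = {
  type: "object",
  properties: {
    sentiment:  { type: "string", enum: %w[positive negative neutral] },
    confidence: { type: "number" }
  },
  required: %w[sentiment confidence]
}

# Only call the API when Heroku AI credentials are configured
if ENV["INFERENCE_KEY"]
  chat = RubyLLM.chat(
    model: "claude-4-5-haiku",
    provider: :openai,
    assume_model_exists: true
  )
  response = chat.with_schema(schema)
                 .ask("Classify the sentiment of: 'This release is fantastic!'")
  puts response.content # parsed data conforming to the schema
end
```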
Advanced Configuration

Temperature and Token Control
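RubyLLM's chat methods are chainable, so sampling and length controls can be attached when the chat is built. A sketch; `with_params` passes provider-specific options through, so the exact parameter names depend on the endpoint:

```ruby
require "ruby_llm"

chat = RubyLLM.chat(
  model: "claude-4-5-haiku",
  provider: :openai,
  assume_model_exists: true
)
  .with_temperature(0.2)          # lower temperature = more deterministic output
  .with_params(max_tokens: 500)   # cap the response length (provider-specific)

puts chat.ask("Summarize the benefits of background jobs.").content
```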
System Prompts
Set a system message to guide model behavior:
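RubyLLM exposes system prompts through `with_instructions`. A sketch:

```ruby
require "ruby_llm"

chat = RubyLLM.chat(
  model: "claude-4-5-haiku",
  provider: :openai,
  assume_model_exists: true
)

# The instructions act as a system message for the whole conversation
chat.with_instructions("You are a concise assistant. Answer in two sentences or fewer.")

puts chat.ask("What is Rack?").content
```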
Available Models

RubyLLM works with any Heroku AI chat model. Popular options include:

- claude-4-5-sonnet: Most capable, best for complex reasoning
- claude-4-5-haiku: Fast and efficient, great for production workloads
- claude-4-sonnet: Latest model with extended context