Releases: crmne/ruby_llm
1.0.1
🚀 Improvements & Bug Fixes
This release has some important fixes and quality-of-life improvements:
- Fixed temperature handling for OpenAI's o1 and o3 models (they require a temperature of 1.0, which we now normalize automatically)
- Better tool naming with proper stripping of unsupported characters for OpenAI
- Improved model capabilities detection for vision and function support
- Added base64 to dependencies for Ruby 3.4.0+ compatibility
- Enhanced Rails integration documentation with system message examples
- Added VCR support for testing (because integration tests shouldn't require real API calls)
- Fixed model refresh bug: no more stale model lists when you call RubyLLM.models.refresh!
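If you cache the model list in a long-running process, the refresh fix means you can now re-fetch it reliably. A minimal sketch of the call (the model name and the follow-up chat are illustrative, not part of this release):

# Re-fetch the model registry so newly released models show up
RubyLLM.models.refresh!

# Then use a model from the refreshed list as usual
chat = RubyLLM.chat(model: 'gpt-4o-mini')
chat.ask "Hello!"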
📄 Documentation Enhancements
We've enhanced our documentation across the board:
- Added a section on system prompts to the chat guide
- Improved the contributing guidelines with model naming conventions
- Better examples of streaming responses in the README
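For context, streaming in RubyLLM means passing a block to ask and handling chunks as they arrive; the prompt below is illustrative, but the block-based pattern matches the README examples:

chat = RubyLLM.chat

# The block is called with partial chunks as the provider streams them
chat.ask "Write a haiku about Ruby" do |chunk|
  print chunk.content
end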
🧪 Testing Improvements
- Added comprehensive specs for model filtering and chat functionalities
- Better CI/CD workflows for version validation
🏆 Shoutouts
Thanks to our contributors for this release: @dalthon, @jaggdl, @Thomascountz, @stevegeek, @jeduardo824, @redox, and @mhmaguire. This is what makes open source great.
New Contributors
- @mhmaguire made their first contribution in #59
- @Thomascountz made their first contribution in #60
- @jaggdl made their first contribution in #56
- @dalthon made their first contribution in #57
- @redox made their first contribution in #38
- @stevegeek made their first contribution in #28
- @jeduardo824 made their first contribution in #54
Full Changelog: 1.0.0...1.0.1
1.0.0
RubyLLM 1.0.0
A beautiful way to work with AI in Ruby. RubyLLM provides a unified interface to modern AI models from multiple providers with an elegant, Ruby-like API.
Features
- Unified API for OpenAI, Anthropic Claude, Google Gemini, and DeepSeek
- Simple conversation interface with automatic history tracking
- Consistent streaming that works the same way across all providers
- Built-in token tracking for cost management
- Tool integration with a clean, Ruby-like interface (a sketch follows after this list)
- Rails integration with ActiveRecord persistence via acts_as_chat (also sketched after this list)
- Multimodal support for images, PDFs, and audio
- Embeddings for vector search and semantic analysis
- Image generation with DALL-E and other providers
- Comprehensive error handling with specific error types
- Minimal dependencies for a lightweight footprint
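To make the tool integration concrete, here is a hedged sketch: the Weather class, its parameter, and the fake result are illustrative, while RubyLLM::Tool and with_tool follow the documented pattern.

# A tool is a plain Ruby class the model can call with structured arguments
class Weather < RubyLLM::Tool
  description "Gets the current temperature for a city"
  param :city, desc: "Name of the city"

  def execute(city:)
    # Illustrative stub; a real tool would query a weather API here
    "It's 22°C in #{city}"
  end
end

chat = RubyLLM.chat.with_tool(Weather)
chat.ask "How warm is it in Lisbon right now?"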
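The Rails persistence feature is similarly small on the application side. A minimal sketch, assuming the conventional Chat, Message, and ToolCall model names and a model_id column as described in the Rails guide:

# app/models/chat.rb
class Chat < ApplicationRecord
  acts_as_chat   # persists messages and tool calls through ActiveRecord
end

# app/models/message.rb
class Message < ApplicationRecord
  acts_as_message
end

# app/models/tool_call.rb
class ToolCall < ApplicationRecord
  acts_as_tool_call
end

# A persisted chat behaves like an in-memory one
Chat.create!(model_id: 'gpt-4o-mini').ask "What's new in Ruby 3.4?"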
Installation
# In your Gemfile
gem 'ruby_llm'
# Or install directly
gem install ruby_llm
Quick Start
# Configure with API keys
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key = ENV['GEMINI_API_KEY']
end
# Start chatting
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"
# Or generate an image
image = RubyLLM.paint("a sunset over mountains")
image.save("sunset.png")
# Or create embeddings
embedding = RubyLLM.embed("Ruby is a programmer's best friend")
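The multimodal support listed above plugs into the same ask call. A hedged sketch, assuming the with: option accepts image and pdf attachments as in the README examples (file paths are illustrative):

# Ask about an image or a PDF by attaching it to the prompt
chat = RubyLLM.chat
chat.ask "What's in this image?", with: { image: "photos/sunset.png" }
chat.ask "Summarize this document", with: { pdf: "reports/q1.pdf" }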
Documentation
Full documentation is available at rubyllm.com
License
Released under the MIT License.