
Rig logo
📑 Docs   •   🌐 Website   •   🤝 Contribute   •   ✍🏽 Blogs

✨ If you would like to help spread the word about Rig, please consider starring the repo!

Warning

Here be dragons! We plan to ship a torrent of features in the coming months, so future updates will contain breaking changes. As Rig evolves, we'll annotate changes and highlight migration paths as we encounter them.

What is Rig?

Rig is a Rust library for building scalable, modular, and ergonomic LLM-powered applications.

More information about this crate can be found in the official documentation (docs.rig.rs) and the crate API reference (docs.rs).

Features

  • Agentic workflows that can handle multi-turn streaming and prompting
  • Full GenAI Semantic Convention compatibility
  • 20+ model providers, all under a single unified interface
  • 10+ vector store integrations, all under a single unified interface
  • Full support for LLM completion and embedding workflows
  • Support for transcription, audio generation and image generation model capabilities
  • Integrate LLMs in your app with minimal boilerplate
  • Full WASM compatibility (core library only)

Who is using Rig in production?

Below is a non-exhaustive list of companies and people who are using Rig in production:

Are you also using Rig in production? Open an issue to have your name added!

Get Started

cargo add rig-core

Simple example:

use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}

Note: using #[tokio::main] requires enabling tokio's macros and rt-multi-thread features, or just full to enable everything (cargo add tokio --features macros,rt-multi-thread).
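Equivalently, the dependencies can be declared directly in Cargo.toml. A minimal sketch; the version numbers below are illustrative, so check crates.io for the latest releases:

```toml
[dependencies]
# Illustrative versions only -- see crates.io for current releases.
rig-core = "0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```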

You can find more examples in each crate's examples directory (e.g. rig-core/examples). More detailed use-case walkthroughs are regularly published on our Dev.to blog and added to Rig's official documentation (docs.rig.rs).

Supported Integrations

Vector stores are available as separate companion crates:

The following providers are available as separate companion crates:

We also have some other associated crates that have additional functionality you may find helpful when using Rig:

  • rig-onchain-kit - the Rig Onchain Kit, intended to make interactions between Rig and Solana/EVM much easier to implement.



Built by Playgrounds
