diff --git a/README.md b/README.md index 7d6b7f4bb1..13aba7843f 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,9 @@ Bifrost is an open-source middleware that serves as a unified gateway to various AI model providers, enabling seamless integration and fallback mechanisms for your AI-powered applications. -## ⚡ Quickstart +![Bifrost](./docs/media/cover.png) + +## ⚡ Quickstart (30 seconds) ### Prerequisites @@ -12,7 +14,7 @@ Bifrost is an open-source middleware that serves as a unified gateway to various - Access to at least one AI model provider (OpenAI, Anthropic, etc.) - API keys for the providers you wish to use -### A. Using Bifrost as an HTTP Server +### Using Bifrost HTTP Transport 1. **Create `config.json`**: This file should contain your provider settings and API keys. @@ -36,7 +38,6 @@ Bifrost is an open-source middleware that serves as a unified gateway to various ```bash export OPENAI_API_KEY=your_openai_api_key - export ANTHROPIC_API_KEY=your_anthropic_api_key ``` Note: Ensure you add all variables stated in your `config.json` file. @@ -73,7 +74,6 @@ Bifrost is an open-source middleware that serves as a unified gateway to various docker run -p 8080:8080 \ -v $(pwd)/config.json:/app/config/config.json \ -e OPENAI_API_KEY \ - -e ANTHROPIC_API_KEY \ maximhq/bifrost ``` @@ -88,93 +88,29 @@ Bifrost is an open-source middleware that serves as a unified gateway to various "provider": "openai", "model": "gpt-4o-mini", "messages": [ - {"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Tell me about Bifrost in Norse mythology."} ] }' ``` -For additional HTTP server configuration options, read [this](https://github.com/maximhq/bifrost/blob/main/transports/README.md). - -### B. Using Bifrost as a Go Package - -1. **Implement Your Account Interface**: First, create an account that follows [Bifrost's account interface](https://github.com/maximhq/bifrost/blob/main/core/schemas/account.go). 
- - ```golang - type BaseAccount struct{} - - func (baseAccount *BaseAccount) GetConfiguredProviders() ([]schemas.ModelProvider, error) { - return []schemas.ModelProvider{schemas.OpenAI}, nil - } - - func (baseAccount *BaseAccount) GetKeysForProvider(providerKey schemas.ModelProvider) ([]schemas.Key, error) { - return []schemas.Key{ - { - Value: os.Getenv("OPENAI_API_KEY"), - Models: []string{"gpt-4o-mini"}, - Weight: 1.0, - }, - }, nil - } - - func (baseAccount *BaseAccount) GetConfigForProvider(providerKey schemas.ModelProvider) (*schemas.ProviderConfig, error) { - return &schemas.ProviderConfig{ - NetworkConfig: schemas.DefaultNetworkConfig, - ConcurrencyAndBufferSize: schemas.DefaultConcurrencyAndBufferSize, - }, nil - } - ``` - - Bifrost uses these methods to get all the keys and configurations it needs to call the providers. See the [Additional Configurations](#additional-configurations) section for additional customization options. - -2. **Initialize Bifrost**: Set up the Bifrost instance by providing your account implementation. - - ```golang - account := BaseAccount{} - - client, err := bifrost.Init(schemas.BifrostConfig{ - Account: &account, - }) - ``` - -3. **Use Bifrost**: Make your First LLM Call! - - ```golang - bifrostResult, bifrostErr := bifrost.ChatCompletionRequest( - context.Background(), - &schemas.BifrostRequest{ - Provider: schemas.OpenAI, - Model: "gpt-4o-mini", // make sure you have configured gpt-4o-mini in your account interface - Input: schemas.RequestInput{ - ChatCompletionInput: bifrost.Ptr([]schemas.BifrostMessage{{ - Role: schemas.ModelChatMessageRoleUser, - Content: schemas.MessageContent{ - ContentStr: bifrost.Ptr("What is a LLM gateway?"), - }, - }}), - }, - }, - ) - ``` + **That's it!** In just a few steps, you can now use Bifrost to make requests to any provider you have configured. - You can add model parameters by including `Params: &schemas.ModelParameters{...yourParams}` in ChatCompletionRequest. 
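Step 1 of the quickstart asks you to create a `config.json`, but its example contents fall outside this diff's context lines. Below is a minimal *hypothetical* sketch — the field names (`providers`, `keys`, the `env.` prefix for reading environment variables) are assumptions that mirror the `Value`/`Models`/`Weight` fields of the Go account interface; consult the transports README for the authoritative schema:

```json
{
  "providers": {
    "openai": {
      "keys": [
        {
          "value": "env.OPENAI_API_KEY",
          "models": ["gpt-4o-mini"],
          "weight": 1.0
        }
      ]
    }
  }
}
```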
+ > For additional HTTP server configuration options, read [this](https://github.com/maximhq/bifrost/blob/main/transports/README.md). ## 📑 Table of Contents - [Bifrost](#bifrost) - - [⚡ Quickstart](#-quickstart) + - [⚡ Quickstart (30 seconds)](#-quickstart-30-seconds) - [Prerequisites](#prerequisites) - - [A. Using Bifrost as an HTTP Server](#a-using-bifrost-as-an-http-server) + - [Using Bifrost HTTP Transport](#using-bifrost-http-transport) - [i) Using Go Binary](#i-using-go-binary) - [ii) OR Using Docker](#ii-or-using-docker) - - [B. Using Bifrost as a Go Package](#b-using-bifrost-as-a-go-package) - [📑 Table of Contents](#-table-of-contents) - - [🔍 Overview](#-overview) - [✨ Features](#-features) - [🏗️ Repository Structure](#️-repository-structure) - [🚀 Getting Started](#-getting-started) - - [Package Structure](#package-structure) - - [Additional Configurations](#additional-configurations) + - [1. As a Go Package (Core Integration)](#1-as-a-go-package-core-integration) + - [2. As an HTTP API (Transport Layer)](#2-as-an-http-api-transport-layer) - [📊 Benchmarks](#-benchmarks) - [Test Environment](#test-environment) - [1. t3.medium(2 vCPUs, 4GB RAM)](#1-t3medium2-vcpus-4gb-ram) @@ -186,20 +122,6 @@ For additional HTTP server configuration options, read [this](https://github.com --- -## 🔍 Overview - -Bifrost acts as a bridge between your applications and multiple AI providers (OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, etc.). It provides a consistent API while handling: - -- Authentication and key management -- Request routing and load balancing -- Fallback mechanisms for reliability -- Unified request and response formatting -- Connection pooling and concurrency control - -With Bifrost, you can focus on building your AI-powered applications without worrying about the underlying provider-specific implementations. 
It handles all the complexities of key and provider management, providing a fixed input and output format so you don't need to modify your codebase for different providers. - ---- - ## ✨ Features - **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API @@ -212,6 +134,8 @@ With Bifrost, you can focus on building your AI-powered applications without wor - **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution - **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations - **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape +- **SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications. +- **Seamless Integration with Generative AI SDKs**: Effortlessly transition to Bifrost by simply updating the `base_url` in your existing SDKs, such as OpenAI, Anthropic, GenAI, and more. Just one line of code is all it takes to make the switch. --- @@ -233,7 +157,7 @@ bifrost/ │ └── ... │ ├── transports/ # Interface layers (HTTP, gRPC, etc.) -│ ├── bifrost-http/ # HTTP transport implementation +│ ├── bifrost-http/ # HTTP transport implementation │ └── ... │ └── plugins/ # Plugin Implementations @@ -247,40 +171,34 @@ The system uses a provider-agnostic approach with well-defined interfaces to eas ## 🚀 Getting Started -If you want to **set up the Bifrost API quickly**, [check the transports documentation](https://github.com/maximhq/bifrost/tree/main/transports/README.md). - -### Package Structure +There are two main ways to use Bifrost: -Bifrost is divided into three Go packages: core, plugins, and transports. +### 1. As a Go Package (Core Integration) -1. **core**: This package contains the core implementation of Bifrost as a Go package. -2. 
**plugins**: This package serves as an extension to core. You can download individual packages using `go get github.com/maximhq/bifrost/plugins/{plugin-name}` and pass the plugins while initializing Bifrost. +For direct integration into your Go applications, use Bifrost as a package. This provides the most flexibility and control over your AI model interactions. -```golang -// go get github.com/maximhq/bifrost/plugins/maxim +> **📖 [Complete Core Package Documentation](./docs/core-package.md)** -maximPlugin, err := maxim.NewMaximLoggerPlugin(os.Getenv("MAXIM_API_KEY"), os.Getenv("MAXIM_LOGGER_ID")) -if err != nil { - return nil, err -} +Quick example: -// Initialize Bifrost -client, err := bifrost.Init(schemas.BifrostConfig{ - Account: &account, - Plugins: []schemas.Plugin{maximPlugin}, -}) +```bash +go get github.com/maximhq/bifrost/core ``` -3. **transports**: This package contains transport clients like HTTP to expose your Bifrost client. You can either `go get` this package or directly use the independent Dockerfile to quickly spin up your [Bifrost API](https://github.com/maximhq/bifrost/tree/main/transports/README.md) (read more on this). +### 2. As an HTTP API (Transport Layer) -### Additional Configurations +For quick setup and language-agnostic integration, use the HTTP transport layer. 
-- [Memory Management](https://github.com/maximhq/bifrost/blob/main/docs/memory-management.md) -- [Logger](https://github.com/maximhq/bifrost/blob/main/docs/logger.md) -- [Plugins](https://github.com/maximhq/bifrost/blob/main/docs/plugins.md) -- [Provider Configurations](https://github.com/maximhq/bifrost/blob/main/docs/providers.md) -- [Fallbacks](https://github.com/maximhq/bifrost/blob/main/docs/fallbacks.md) -- [MCP Integration](https://github.com/maximhq/bifrost/blob/main/docs/mcp.md) +> **📖 [Complete HTTP Transport Documentation](./transports/README.md)** + +Quick example: + +```bash +docker run -p 8080:8080 \ + -v $(pwd)/config.json:/app/config/config.json \ + -e OPENAI_API_KEY \ + maximhq/bifrost +``` --- diff --git a/docs/core-package.md b/docs/core-package.md new file mode 100644 index 0000000000..b9da76dce0 --- /dev/null +++ b/docs/core-package.md @@ -0,0 +1,208 @@ +# Bifrost Core Package Documentation + +This guide covers how to use Bifrost as a Go package in your applications, providing direct integration without the need for external transports. 
+ +[Bifrost Package Demo (video)](./media/package-demo.mp4) + +## 📑 Table of Contents + +- [Bifrost Core Package Documentation](#bifrost-core-package-documentation) + - [📑 Table of Contents](#-table-of-contents) + - [Package Structure](#package-structure) + - [Getting Started](#getting-started) + - [Basic Usage](#basic-usage) + - [Implementing Your Account Interface](#implementing-your-account-interface) + - [Initializing Bifrost](#initializing-bifrost) + - [Making Your First LLM Call](#making-your-first-llm-call) + - [Advanced Configuration](#advanced-configuration) + - [Additional Features](#additional-features) + - [🧠 Memory Management](#-memory-management) + - [📝 Logger](#-logger) + - [🔌 Plugins](#-plugins) + - [⚙️ Provider Configurations](#️-provider-configurations) + - [🔄 Fallbacks](#-fallbacks) + - [🛠️ MCP Integration](#️-mcp-integration) + - [Next Steps](#next-steps) + +--- + +## Package Structure + +Bifrost is built with a modular architecture where the core functionality is separated from transport layers: + +```text +bifrost/ +├── core/ # Core functionality and shared components +│ ├── providers/ # Provider-specific implementations +│ ├── schemas/ # Interfaces and structs used in bifrost +│ ├── bifrost.go # Main Bifrost implementation +│ ├── logger.go # Logging functionality +│ ├── mcp.go # Model Context Protocol support +│ └── utils.go # Utility functions +``` + +All interfaces are defined in `core/schemas/` and can be used as a reference for contributions and custom implementations. 
+ +--- + +## Getting Started + +To use Bifrost as a Go package in your application: + +```bash +go get github.com/maximhq/bifrost/core +``` + +--- + +## Basic Usage + +### Implementing Your Account Interface + +First, create an account that follows [Bifrost's account interface](https://github.com/maximhq/bifrost/blob/main/core/schemas/account.go): + +```golang +package main + +import ( + "os" + "github.com/maximhq/bifrost/core/schemas" +) + +type BaseAccount struct{} + +func (baseAccount *BaseAccount) GetConfiguredProviders() ([]schemas.ModelProvider, error) { + return []schemas.ModelProvider{schemas.OpenAI}, nil +} + +func (baseAccount *BaseAccount) GetKeysForProvider(providerKey schemas.ModelProvider) ([]schemas.Key, error) { + return []schemas.Key{ + { + Value: os.Getenv("OPENAI_API_KEY"), + Models: []string{"gpt-4o-mini"}, + Weight: 1.0, + }, + }, nil +} + +func (baseAccount *BaseAccount) GetConfigForProvider(providerKey schemas.ModelProvider) (*schemas.ProviderConfig, error) { + return &schemas.ProviderConfig{ + NetworkConfig: schemas.DefaultNetworkConfig, + ConcurrencyAndBufferSize: schemas.DefaultConcurrencyAndBufferSize, + }, nil +} +``` + +Bifrost uses these methods to get all the keys and configurations it needs to call the providers. 
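The `Weight` field returned by `GetKeysForProvider` implies that several keys for one provider can share traffic proportionally. As a rough illustration of how weighted selection can work — using local stand-in types, **not** Bifrost's actual `schemas.Key` or its real selection code — a cumulative-weight pick might look like:

```golang
// Illustration only: stand-in types, not Bifrost's internals.
package main

import "fmt"

type key struct {
	value  string
	weight float64
}

// pickKey walks cumulative weights and returns the key whose
// span covers the threshold t, a uniform sample in [0, 1).
func pickKey(keys []key, t float64) key {
	total := 0.0
	for _, k := range keys {
		total += k.weight
	}
	target := t * total
	acc := 0.0
	for _, k := range keys {
		acc += k.weight
		if target < acc {
			return k
		}
	}
	return keys[len(keys)-1] // rounding guard for t close to 1
}

func main() {
	keys := []key{
		{value: "key-A", weight: 3.0}, // ~75% of requests
		{value: "key-B", weight: 1.0}, // ~25% of requests
	}
	fmt.Println(pickKey(keys, 0.5).value) // prints "key-A"
	fmt.Println(pickKey(keys, 0.9).value) // prints "key-B"
}
```

In the account above, a single key with `Weight: 1.0` receives all traffic; registering a second key with a different weight would split requests proportionally, assuming Bifrost's selector behaves along the lines of this sketch.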
+ +### Initializing Bifrost + +Set up the Bifrost instance by providing your account implementation: + +```golang +package main + +import ( + "github.com/maximhq/bifrost/core" + "github.com/maximhq/bifrost/core/schemas" +) + +func main() { + account := BaseAccount{} + + client, err := bifrost.Init(schemas.BifrostConfig{ + Account: &account, + }) + if err != nil { + panic(err) + } + + _ = client // the client is used in the next section to make requests +} +``` + +### Making Your First LLM Call + +```golang +bifrostResult, bifrostErr := client.ChatCompletionRequest( + context.Background(), + &schemas.BifrostRequest{ + Provider: schemas.OpenAI, + Model: "gpt-4o-mini", // make sure you have configured gpt-4o-mini in your account interface + Input: schemas.RequestInput{ + ChatCompletionInput: bifrost.Ptr([]schemas.BifrostMessage{{ + Role: schemas.ModelChatMessageRoleUser, + Content: schemas.MessageContent{ + ContentStr: bifrost.Ptr("What is a LLM gateway?"), + }, + }}), + }, + }, +) + +if bifrostErr != nil { + panic(bifrostErr) +} + +// Handle the response +fmt.Println(bifrostResult.Response) +``` + +You can add model parameters by including `Params: &schemas.ModelParameters{...yourParams}` in `ChatCompletionRequest`. + +--- + +## Advanced Configuration + +Bifrost offers extensive configuration options to customize behavior for your specific needs. You can configure various aspects through the account interface and initialization parameters. + +For detailed configuration options, see the [Provider Configurations](./providers.md) documentation. + +--- + +## Additional Features + +Bifrost provides several advanced features to enhance your AI application development: + +### 🧠 Memory Management + +Optimize memory usage and performance with configurable buffer sizes and connection pooling. + +- **Documentation**: [Memory Management](./memory-management.md) + +### 📝 Logger + +Built-in logging system with configurable levels and output formats. 
+ +- **Documentation**: [Logger](./logger.md) + +### 🔌 Plugins + +Extend Bifrost functionality with custom plugins using the plugin-first architecture. + +- **Documentation**: [Plugins](./plugins.md) + +### ⚙️ Provider Configurations + +Fine-tune provider-specific settings including retry logic, timeouts, and concurrency limits. + +- **Documentation**: [Provider Configurations](./providers.md) + +### 🔄 Fallbacks + +Implement robust fallback mechanisms for high availability across multiple providers and models. + +- **Documentation**: [Fallbacks](./fallbacks.md) + +### 🛠️ MCP Integration + +Leverage Model Context Protocol (MCP) for external tool integration and execution. + +- **Documentation**: [MCP Integration](./mcp.md) + +--- + +## Next Steps + +- Explore the [HTTP Transport](../transports/README.md) for API-based integration +- Check out [example implementations](../tests/core-chatbot/) for real-world usage patterns +- Review the [system architecture](./system-architecture.md) for understanding Bifrost's internal design diff --git a/docs/media/cover.png b/docs/media/cover.png new file mode 100644 index 0000000000..7b5f4c52a4 Binary files /dev/null and b/docs/media/cover.png differ diff --git a/docs/media/package-demo.mp4 b/docs/media/package-demo.mp4 new file mode 100644 index 0000000000..a7651c07cb Binary files /dev/null and b/docs/media/package-demo.mp4 differ