# [Staging] - Ishaan March 17th #23903
@@ -5,11 +5,76 @@ import Image from '@theme/IdealImage';

# Getting Started Tutorial

End-to-End tutorial for LiteLLM Proxy to:

- Add an Azure OpenAI model
- Make a successful /chat/completion call
- Generate a virtual key
- Set RPM limit on virtual key
## Quick Install (Recommended for local / beginners)

New to LiteLLM? This is the easiest way to get started locally. One command installs LiteLLM and walks you through setup interactively — no config files to write by hand.

### 1. Install

```bash
curl -fsSL https://raw.githubusercontent.com/BerriAI/litellm/main/scripts/install.sh | sh
```

This detects your OS, installs `litellm[proxy]`, and drops you straight into the setup wizard.
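The contents of the install script are not part of this diff, but as a rough idea, an installer like this typically finds a working Python, installs the package, and launches the wizard. A hypothetical sketch (the function names are illustrative, not the real script):

```shell
# Hypothetical sketch of what an installer like scripts/install.sh might do.
# The real script's contents are not shown in this diff.
detect_python() {
    # Prefer python3, fall back to python; fail if neither exists.
    if command -v python3 >/dev/null 2>&1; then
        echo python3
    elif command -v python >/dev/null 2>&1; then
        echo python
    else
        return 1
    fi
}

install_litellm() {
    py="$(detect_python)" || { echo "error: Python is required" >&2; return 1; }
    # Install the proxy extra, then drop into the interactive wizard.
    "$py" -m pip install --quiet 'litellm[proxy]'
    litellm --setup
}
```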
### 2. Follow the wizard

```
$ litellm --setup

Welcome to LiteLLM

Choose your LLM providers
○ 1. OpenAI         GPT-4o, GPT-4o-mini, o1
○ 2. Anthropic      Claude Opus, Sonnet, Haiku
○ 3. Azure OpenAI   GPT-4o via Azure
○ 4. Google Gemini  Gemini 2.0 Flash, 1.5 Pro
○ 5. AWS Bedrock    Claude, Llama via AWS
○ 6. Ollama         Local models

❯ Provider(s): 1,2

❯ OpenAI API key: sk-...
❯ Anthropic API key: sk-ant-...

❯ Port [4000]:
❯ Master key [auto-generate]:

✔ Config saved → ./litellm_config.yaml

❯ Start the proxy now? (Y/n):
```
The wizard walks you through:

1. Pick your LLM providers (OpenAI, Anthropic, Azure, Bedrock, Gemini, Ollama)
2. Enter API keys for each provider
3. Set a port and master key (or accept the defaults)
4. Config is saved to `./litellm_config.yaml` and the proxy starts immediately

> **Review comment on lines +30 to +55 (Contributor):** The docs present a detailed interactive wizard invoked via `litellm --setup`. Users who install the package and then run …
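For reference, a saved `litellm_config.yaml` for the provider choices above might look something like the sketch below. This is an assumption about the generated output, not output copied from the wizard; the model names and key references are illustrative:

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY

general_settings:
  master_key: sk-1234  # placeholder; use the key the wizard generated
```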
### 3. Make a call

Your proxy is running on `http://0.0.0.0:4000`. Test it:
```bash
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer <your-master-key>' \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
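If you plan to script this call, the curl command above can be wrapped in a small helper. A sketch, where `build_payload`, `chat`, and the `MASTER_KEY` placeholder are names introduced here for illustration:

```shell
# Placeholder key, same as in the curl example above.
MASTER_KEY="${MASTER_KEY:-<your-master-key>}"

build_payload() {
    # Construct the JSON body the /chat/completions endpoint expects.
    printf '{"model": "%s", "messages": [{"role": "user", "content": "%s"}]}' "$1" "$2"
}

chat() {
    # Thin wrapper around the curl call above.
    curl -sS -X POST 'http://0.0.0.0:4000/chat/completions' \
        -H 'Content-Type: application/json' \
        -H "Authorization: Bearer $MASTER_KEY" \
        -d "$(build_payload "$1" "$2")"
}

# Usage (the proxy must be running):
#   chat gpt-4o "Hello!"
```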
:::tip Already have pip installed?
You can skip the curl install and run `litellm --setup` directly after `pip install 'litellm[proxy]'`.
:::

---

## Pre-Requisites
> **Review comment: `scripts/install.sh` does not exist**
>
> The install command instructs users to run:
>
> ```bash
> curl -fsSL https://raw.githubusercontent.com/BerriAI/litellm/main/scripts/install.sh | sh
> ```
>
> However, `scripts/install.sh` does not exist anywhere in this repository. A user following this documentation will get a `curl` error (the `-f` flag causes a silent failure on HTTP 404), resulting in nothing being installed with no useful error message to diagnose the problem.
>
> This entire "Quick Install" section should only be merged after the referenced `scripts/install.sh` is created and committed.
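The reviewer's point about `-f` failing silently can be mitigated by downloading to a file first and only then executing it, so a 404 produces a visible error. A sketch of that pattern (`fetch_and_run` is a name introduced here, not part of the PR):

```shell
# fetch_and_run: download to a temp file first, so an HTTP 404 (or a
# missing file) surfaces as an error instead of `curl -f` silently
# piping an empty body into sh.
fetch_and_run() {
    url="$1"
    tmp="$(mktemp)"
    if curl -fsSL "$url" -o "$tmp"; then
        sh "$tmp"
        status=$?
        rm -f "$tmp"
        return $status
    else
        rm -f "$tmp"
        echo "download failed (HTTP error or missing file): $url" >&2
        return 1
    fi
}
```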