
Commit 8a84f97

Merge pull request #1 from ServiceStack/claude/incorporate-du-01KDhm65HHQaoR5vpU2HWP7i

Incorporate the DU component

2 parents faeaa5d + 4a0d225

46 files changed: +9404 −23 lines

content/docs/cli/index.mdx

Lines changed: 408 additions & 0 deletions
Large diffs are not rendered by default.

content/docs/cli/meta.json

Lines changed: 6 additions & 0 deletions

```json
{
  "title": "CLI Reference",
  "pages": [
    "index"
  ]
}
```
Lines changed: 216 additions & 0 deletions
---
title: Configuration
description: Configure llms.py providers, models, and settings
---

# Configuration

llms.py uses a simple JSON configuration file to manage providers, models, and defaults.

## Configuration File

The main configuration file is `~/.llms/llms.json`. It's automatically created on first run with sensible defaults.

### Initialize Configuration

```bash
llms --init
```

This creates `~/.llms/llms.json` with the latest default configuration.

### Configuration Structure

```json
{
  "defaults": {
    "headers": { ... },
    "text": { ... },
    "image": { ... },
    "audio": { ... },
    "file": { ... }
  },
  "providers": {
    "groq": { ... },
    "openai": { ... },
    ...
  }
}
```
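To illustrate the shape above, here's a minimal sketch of loading the config and listing enabled providers. It uses a trimmed inline example, not the real default file:

```python
import json

# A trimmed, hypothetical llms.json -- the real default file has many more fields.
llms_json = """
{
  "defaults": {"headers": {}, "text": {}},
  "providers": {
    "groq": {"enabled": true},
    "openai": {"enabled": false}
  }
}
"""

config = json.loads(llms_json)

# Providers whose "enabled" flag is true
enabled = [name for name, p in config["providers"].items() if p.get("enabled")]
print(enabled)
```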
## Defaults Section

### Headers

Common HTTP headers for all requests:

```json
{
  "defaults": {
    "headers": {
      "Content-Type": "application/json"
    }
  }
}
```

### Chat Templates

Default request templates for different modalities:

```json
{
  "defaults": {
    "text": {
      "model": "grok-4-fast",
      "messages": [
        {"role": "user", "content": ""}
      ],
      "temperature": 0.7
    },
    "image": {
      "model": "gemini-2.5-flash",
      "messages": [ ... ]
    },
    "audio": {
      "model": "gpt-4o-audio-preview",
      "messages": [ ... ]
    },
    "file": {
      "model": "gpt-5",
      "messages": [ ... ]
    }
  }
}
```
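Conceptually, a template like `text` gets copied and its empty user message filled with your prompt before the request is sent. The `build_request` helper below is a hypothetical illustration, not llms.py's actual API:

```python
import copy

# The default "text" template shown above
text_template = {
    "model": "grok-4-fast",
    "messages": [{"role": "user", "content": ""}],
    "temperature": 0.7,
}

def build_request(template: dict, prompt: str) -> dict:
    """Copy the template and fill the empty user message with the prompt."""
    request = copy.deepcopy(template)
    request["messages"][-1]["content"] = prompt
    return request

req = build_request(text_template, "Hello")
print(req["messages"][0]["content"])
```

Deep-copying keeps the template itself pristine, so each request starts from the same defaults.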
### Conversion Settings

Image conversion and limits:

```json
{
  "convert": {
    "max_image_size": 2048,
    "max_image_length": 20971520,
    "webp_quality": 90
  }
}
```
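As a rough sketch of how such limits could be applied (the `needs_conversion` helper is illustrative, not llms.py's actual code), an image would be converted when it exceeds either the dimension or byte-size limit:

```python
import json

# The "convert" settings shown above
settings = json.loads("""
{
  "convert": {
    "max_image_size": 2048,
    "max_image_length": 20971520,
    "webp_quality": 90
  }
}
""")["convert"]

def needs_conversion(width: int, height: int, byte_length: int) -> bool:
    """True when an image exceeds the configured dimension or byte limits."""
    oversized = max(width, height) > settings["max_image_size"]
    too_long = byte_length > settings["max_image_length"]
    return oversized or too_long

print(needs_conversion(4032, 3024, 8_000_000))  # camera photo, oversized dimensions
print(needs_conversion(1024, 768, 500_000))     # within both limits
```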
## Providers Section

Each provider requires specific configuration. See the [Providers](/docs/features/providers) page for details.

### Basic Provider

```json
{
  "providers": {
    "groq": {
      "enabled": true,
      "type": "OpenAiProvider",
      "base_url": "https://api.groq.com/openai",
      "api_key": "$GROQ_API_KEY",
      "models": {
        "llama3.3:70b": "llama-3.3-70b-versatile"
      },
      "pricing": {
        "llama3.3:70b": {
          "input": 0.40,
          "output": 1.20
        }
      }
    }
  }
}
```
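The `pricing` map lets you estimate request costs. A hedged sketch, assuming the rates are USD per million tokens (the unit isn't stated here):

```python
# The pricing map from the provider config above
pricing = {
    "llama3.3:70b": {"input": 0.40, "output": 1.20}
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in USD, assuming per-million-token rates."""
    rates = pricing[model]
    return (input_tokens * rates["input"] + output_tokens * rates["output"]) / 1_000_000

print(f"${estimate_cost('llama3.3:70b', 10_000, 2_000):.4f}")
```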
## UI Configuration

The UI configuration is stored in `~/.llms/ui.json`:

```json
{
  "prompts": [
    {
      "id": "it-expert",
      "name": "Act as an IT Expert",
      "value": "I want you to act as an IT Expert..."
    }
  ],
  "defaultModel": "grok-4-fast",
  "theme": "auto"
}
```

## Environment Variables

### API Keys

Reference environment variables in config with `$` prefix:

```json
{
  "api_key": "$GROQ_API_KEY"
}
```
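A plausible sketch of how the `$` prefix might be resolved at load time; llms.py's actual implementation may differ, and `resolve_value` is our name for illustration:

```python
import os

def resolve_value(value: str) -> str:
    """Expand "$NAME" values from the environment; pass literals through."""
    if value.startswith("$"):
        return os.environ.get(value[1:], "")
    return value

os.environ["GROQ_API_KEY"] = "gsk_example"
print(resolve_value("$GROQ_API_KEY"))
print(resolve_value("literal-key"))
```

Keeping only the `$NAME` reference in `llms.json` means the file can be shared or committed without leaking the key itself.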
Set in your shell:

```bash
export GROQ_API_KEY="gsk_..."
```

### Other Settings

```bash
# Enable verbose logging
export VERBOSE=1
```

## Custom Configuration Path

Use a custom configuration file:

```bash
llms --config /path/to/custom-config.json "Hello"
```

## Configuration Management

### View Configuration

```bash
# List providers and models
llms --list

# List specific providers
llms ls groq anthropic
```

### Enable/Disable Providers

```bash
# Enable providers
llms --enable groq openai

# Disable providers
llms --disable ollama
```

### Set Default Model

```bash
llms --default grok-4-fast
```

This updates `defaults.text.model` in the config file.
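In effect this is a small JSON edit. An illustrative sketch of the same operation (using a temporary file; the real CLI edits `~/.llms/llms.json` in place):

```python
import json
import os
import tempfile

# Write a minimal config to a temporary file for the demo
path = os.path.join(tempfile.mkdtemp(), "llms.json")
with open(path, "w") as f:
    json.dump({"defaults": {"text": {"model": "grok-4-fast"}}}, f, indent=2)

def set_default_model(config_path: str, model: str) -> None:
    """Load the config, update defaults.text.model, and write it back."""
    with open(config_path) as f:
        cfg = json.load(f)
    cfg["defaults"]["text"]["model"] = model
    with open(config_path, "w") as f:
        json.dump(cfg, f, indent=2)

set_default_model(path, "gpt-5")
with open(path) as f:
    print(json.load(f)["defaults"]["text"]["model"])
```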
## Next Steps

<Cards>
  <Card title="Providers" href="/docs/features/providers" />
  <Card title="CLI Reference" href="/docs/cli" />
  <Card title="Features" href="/docs/features" />
</Cards>
Lines changed: 6 additions & 0 deletions

```json
{
  "title": "Configuration",
  "pages": [
    "index"
  ]
}
```

0 commit comments
