8 changes: 8 additions & 0 deletions apps/web/.env.example
@@ -85,6 +85,14 @@ LOG_ZOD_ERRORS=true
# ECONOMY_LLM_MODEL=anthropic/claude-haiku-4.5
# AI_GATEWAY_API_KEY=

# --- Azure OpenAI ---
# DEFAULT_LLM_PROVIDER=azure
# DEFAULT_LLM_MODEL=gpt-5.1
# ECONOMY_LLM_PROVIDER=azure
# ECONOMY_LLM_MODEL=gpt-5-mini
# AZURE_BASE_URL=
# AZURE_API_KEY=

# --- Groq ---
# DEFAULT_LLM_PROVIDER=groq
# DEFAULT_LLM_MODEL=llama-3.3-70b-versatile
3 changes: 3 additions & 0 deletions apps/web/env.ts
@@ -11,6 +11,7 @@ const llmProviderEnum = z.enum([
"openrouter",
"groq",
"aigateway",
"azure",
"ollama",
]);

@@ -58,6 +59,8 @@ export const env = createEnv({
GROQ_API_KEY: z.string().optional(),
OPENROUTER_API_KEY: z.string().optional(),
AI_GATEWAY_API_KEY: z.string().optional(),
AZURE_BASE_URL: z.string().optional(),
AZURE_API_KEY: z.string().optional(),
OLLAMA_BASE_URL: z.string().optional(),

OPENAI_ZERO_DATA_RETENTION: z.coerce.boolean().optional().default(false),
1 change: 1 addition & 0 deletions apps/web/package.json
@@ -18,6 +18,7 @@
"dependencies": {
"@ai-sdk/amazon-bedrock": "3.0.67",
"@ai-sdk/anthropic": "2.0.53",
"@ai-sdk/azure": "2.0.82",
"@ai-sdk/gateway": "2.0.18",
"@ai-sdk/google": "2.0.44",
"@ai-sdk/groq": "2.0.32",
1 change: 1 addition & 0 deletions apps/web/utils/llms/config.ts
@@ -12,6 +12,7 @@ export const Provider = {
GROQ: "groq",
OPENROUTER: "openrouter",
AI_GATEWAY: "aigateway",
AZURE: "azure",
⚠️ Potential issue | 🟠 Major

🧩 Analysis chain (repository: elie222/inbox-zero)

🏁 Script executed:

```bash
#!/bin/bash
# First, look at the config file to understand the full context
echo "=== Content of config.ts ==="
cat -n apps/web/utils/llms/config.ts | head -50

echo ""
echo "=== Search for providerOptions usage ==="
# Note: rg has no built-in "tsx" file type; the "ts" type already covers *.ts and *.tsx
rg -nP --type=ts -C3 'providerOptions' -g '!node_modules' -g '!.next'
```

🏁 Script executed:

```bash
#!/bin/bash
echo "=== Search for providerOptions usage in TypeScript/React files ==="
rg -nP 'providerOptions' -g '!node_modules' -g '!.next' -C 3

echo ""
echo "=== Search for Azure environment variables or config ==="
rg -nP 'AZURE' -g '!node_modules' -g '!.next' -C 2

echo ""
echo "=== Search for Bedrock in config or usage (to compare pattern) ==="
rg -nP 'BEDROCK|bedrock' -g '!node_modules' -g '!.next' -C 2
```


Add Azure to the providerOptions array.

Azure is missing from the providerOptions UI dropdown (lines 19-30) despite having full backend support in model.ts (lines 142-151) and being selectable in the CLI. Users cannot currently select Azure as a provider in the web settings UI. Add Azure to providerOptions to match the pattern of other implemented providers like OpenAI, Anthropic, and Google.

🤖 Prompt for AI Agents
In apps/web/utils/llms/config.ts around lines 15–30, the providerOptions array
is missing the Azure entry even though backend support exists; add an Azure
entry to providerOptions following the same shape as the other providers (i.e.,
the same object keys used for OpenAI/Anthropic/Google such as provider
key/value/label and optional icon/meta fields) so "azure" appears in the UI
dropdown and matches backend model selection.
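The prompt above describes the fix only loosely. As a rough sketch — the `{ label, value }` object shape and the existing entries are assumptions, not the actual contents of `config.ts` — the change could look like:

```typescript
// Hypothetical sketch: adding Azure to the providerOptions dropdown list.
// The object shape and sibling entries are assumed; the real file may differ.
type ProviderOption = { label: string; value: string };

const providerOptions: ProviderOption[] = [
  { label: "Anthropic", value: "anthropic" },
  { label: "Google", value: "google" },
  { label: "OpenAI", value: "openai" },
  // ...other providers...
  { label: "Azure OpenAI", value: "azure" }, // new entry for Azure
];
```

The dropdown `value` must match the `Provider.AZURE` constant (`"azure"`) so the selection round-trips to the backend model selection.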

...(supportsOllama ? { OLLAMA: "ollama" } : {}),
};

14 changes: 14 additions & 0 deletions apps/web/utils/llms/model.ts
@@ -6,6 +6,7 @@ import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { createGroq } from "@ai-sdk/groq";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";
import { createGateway } from "@ai-sdk/gateway";
import { createAzure } from "@ai-sdk/azure";

⚠️ Potential issue | 🟡 Minor

Azure provider integration follows established patterns.

The createAzure import, Provider.AZURE case, and getProviderApiKey mapping align with how other providers (OpenAI, Google, Groq) are integrated. Defaulting to "gpt-5.1" is consistent with the OpenAI approach.

However, unlike other providers, the Azure configuration relies on two environment variables (AZURE_BASE_URL and AZURE_API_KEY) rather than a single API key. If AZURE_BASE_URL is missing or misconfigured, errors surface only at runtime. Consider adding an explicit validation check early in the provider selection logic to fail fast with a clear error message when Azure is selected but AZURE_BASE_URL is not configured.

🤖 Prompt for AI Agents
In apps/web/utils/llms/model.ts, when Provider.AZURE is selected there is no early validation that AZURE_BASE_URL is set, which leads to runtime failures; update the provider selection logic to check env.AZURE_BASE_URL (and optionally AZURE_API_KEY) immediately when Provider.AZURE is chosen and throw a clear, descriptive error if it is missing, so the app fails fast with guidance to set the environment variable(s).
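A minimal sketch of such a fail-fast check, assuming the variables are AZURE_BASE_URL and AZURE_API_KEY as in this PR's env.ts (the helper name and its exact placement inside selectModel are up to the implementer):

```typescript
// Hypothetical helper: validate Azure configuration before constructing the model.
// In the real code this would read from @/env rather than a plain object.
function assertAzureConfig(env: {
  AZURE_BASE_URL?: string;
  AZURE_API_KEY?: string;
}): void {
  if (!env.AZURE_BASE_URL) {
    throw new Error(
      "Azure provider selected but AZURE_BASE_URL is not set. " +
        "Set it to e.g. https://your-resource.openai.azure.com/openai/v1/",
    );
  }
  if (!env.AZURE_API_KEY) {
    throw new Error("Azure provider selected but AZURE_API_KEY is not set.");
  }
}
```

Calling this at the top of the `Provider.AZURE` case surfaces a configuration mistake immediately instead of on the first LLM request.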

// import { createOllama } from "ollama-ai-provider";
import { env } from "@/env";
import { Provider } from "@/utils/llms/config";
@@ -138,6 +139,18 @@ function selectModel(
backupModel: getBackupModel(aiApiKey),
};
}
case Provider.AZURE: {
const modelName = aiModel || "gpt-5.1";
return {
provider: Provider.AZURE,
modelName,
model: createAzure({
baseURL: env.AZURE_BASE_URL,
apiKey: aiApiKey || env.AZURE_API_KEY,
})(modelName),
backupModel: getBackupModel(aiApiKey),
};
}
case Provider.OLLAMA: {
throw new Error(
"Ollama is not supported. Revert to version v1.7.28 or older to use it.",
@@ -343,6 +356,7 @@ function getProviderApiKey(provider: string) {
[Provider.GROQ]: env.GROQ_API_KEY,
[Provider.OPENROUTER]: env.OPENROUTER_API_KEY,
[Provider.AI_GATEWAY]: env.AI_GATEWAY_API_KEY,
[Provider.AZURE]: env.AZURE_API_KEY,
};

return providerApiKeys[provider];
5 changes: 4 additions & 1 deletion docs/hosting/environment-variables.md
@@ -36,7 +36,7 @@ cp apps/web/.env.example apps/web/.env
| `UPSTASH_REDIS_TOKEN` | No* | Upstash Redis token (*required if not using Docker Compose) | — |
| `REDIS_URL` | No | Alternative Redis URL (for subscriptions) | — |
| **LLM Provider Selection** ||||
| `DEFAULT_LLM_PROVIDER` | No | Primary LLM provider (`anthropic`, `google`, `openai`, `bedrock`, `openrouter`, `groq`, `aigateway`, `ollama`) | `anthropic` |
| `DEFAULT_LLM_PROVIDER` | No | Primary LLM provider (`anthropic`, `google`, `openai`, `bedrock`, `openrouter`, `groq`, `aigateway`, `azure`, `ollama`) | `anthropic` |
| `DEFAULT_LLM_MODEL` | No | Model to use with default provider | Provider default |
| `DEFAULT_OPENROUTER_PROVIDERS` | No | Comma-separated list of OpenRouter providers | — |
| `ECONOMY_LLM_PROVIDER` | No | Provider for cheaper operations | — |
@@ -54,6 +54,9 @@ cp apps/web/.env.example apps/web/.env
| `GROQ_API_KEY` | No | Groq API key | — |
| `OPENROUTER_API_KEY` | No | OpenRouter API key | — |
| `AI_GATEWAY_API_KEY` | No | AI Gateway API key | — |
| **Azure OpenAI** ||||
| `AZURE_BASE_URL` | No | Azure OpenAI base URL (e.g., `https://your-resource.openai.azure.com/openai/v1/`) | — |
| `AZURE_API_KEY` | No | Azure OpenAI API key | — |
| **AWS Bedrock** ||||
| `BEDROCK_ACCESS_KEY` | No | AWS access key for Bedrock. See [AI SDK Bedrock documentation](https://ai-sdk.dev/providers/ai-sdk-providers/amazon-bedrock). | — |
| `BEDROCK_SECRET_KEY` | No | AWS secret key for Bedrock | — |
33 changes: 33 additions & 0 deletions packages/cli/src/main.ts
Issue in packages/cli/src/main.ts:

azure creds collected in runSetup (AZURE_BASE_URL, AZURE_API_KEY) aren’t written to the .env. generateEnvFile only handles bedrock or generic API keys and never sets the Azure vars, so Azure will fail at runtime. Consider adding an azure branch that sets AZURE_BASE_URL and AZURE_API_KEY.
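A rough sketch of the suggested fix, with hypothetical names throughout (generateEnvFile's real signature and the collected-answers object are not shown in this diff):

```typescript
// Hypothetical sketch of an azure branch for generateEnvFile:
// emit the Azure variables collected during setup into the .env output.
type SetupAnswers = {
  llmProvider: string;
  azureBaseUrl?: string;
  azureApiKey?: string;
};

function azureEnvLines(answers: SetupAnswers): string[] {
  if (answers.llmProvider !== "azure") return [];
  return [
    `AZURE_BASE_URL=${answers.azureBaseUrl ?? ""}`,
    `AZURE_API_KEY=${answers.azureApiKey ?? ""}`,
  ];
}
```

These lines would be appended alongside the existing bedrock/generic-key output so the credentials gathered in runSetup actually reach the generated .env.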


@@ -388,6 +388,7 @@ Full guide: https://docs.getinboxzero.com/self-hosting/microsoft-oauth`,
label: "Vercel AI Gateway",
hint: "access multiple models",
},
{ value: "azure", label: "Azure OpenAI" },
{ value: "bedrock", label: "AWS Bedrock" },
{ value: "groq", label: "Groq", hint: "fast inference" },
],
@@ -415,6 +416,7 @@
default: "anthropic/claude-sonnet-4.5",
economy: "anthropic/claude-haiku-4.5",
},
azure: { default: "gpt-5.1", economy: "gpt-5.1-mini" },
bedrock: {
default: "global.anthropic.claude-sonnet-4-5-20250929-v1:0",
economy: "global.anthropic.claude-haiku-4-5-20251001-v1:0",
@@ -467,6 +469,37 @@ Full guide: https://docs.getinboxzero.com/self-hosting/microsoft-oauth`,
env.BEDROCK_ACCESS_KEY = bedrockCreds.accessKey;
env.BEDROCK_SECRET_KEY = bedrockCreds.secretKey;
env.BEDROCK_REGION = bedrockCreds.region || "us-west-2";
} else if (llmProvider === "azure") {
// Handle Azure separately (needs BASE_URL + API_KEY)
p.log.info(
"Get your Azure OpenAI credentials from the Azure Portal:\nhttps://portal.azure.com/",
);

const azureCreds = await p.group(
{
baseUrl: () =>
p.text({
message: "Azure Base URL",
placeholder: "https://your-resource.openai.azure.com/openai/v1/",
validate: (v) => (!v ? "Base URL is required" : undefined),
}),
apiKey: () =>
p.text({
message: "Azure API Key",
placeholder: "your-api-key",
validate: (v) => (!v ? "API key is required" : undefined),
}),
},
{
onCancel: () => {
p.cancel("Setup cancelled.");
process.exit(0);
},
},
);

env.AZURE_BASE_URL = azureCreds.baseUrl;
env.AZURE_API_KEY = azureCreds.apiKey;
} else {
const llmLinks: Record<string, string> = {
anthropic: "https://console.anthropic.com/settings/keys",
28 changes: 28 additions & 0 deletions pnpm-lock.yaml


2 changes: 1 addition & 1 deletion version.txt
@@ -1 +1 @@
- v2.21.59
+ v2.21.60