t1014: ai-research OAuth auth from auth.json #1475
Conversation
Read OAuth tokens from `~/.local/share/opencode/auth.json` (primary path), with `ANTHROPIC_API_KEY` as fallback. Uses the `anthropic-beta: oauth-2025-04-20` header for direct Anthropic API access. Handles token refresh automatically. Removes the OpenCode local API approach (port scanning, session creation), which required `opencode serve` to be running separately.
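For orientation, here is a minimal sketch of the credential shape this description relies on; the field names (`type`, `refresh`, `access`, `expires`, `key`) are taken from the code snippets quoted in the reviews below, and the example values are placeholders rather than real tokens.

```ts
// Hedged sketch of the auth.json entry this tool reads. Field names come from the
// refreshOAuthToken/resolveAuth snippets quoted later in this thread; values are placeholders.
type AuthEntry =
  | { type: "oauth"; refresh: string; access?: string; expires: number } // expires = epoch ms
  | { type: "api"; key: string }

// Example ~/.local/share/opencode/auth.json (illustrative only):
// {
//   "anthropic": {
//     "type": "oauth",
//     "refresh": "<refresh-token>",
//     "access": "<access-token>",
//     "expires": 1760000000000
//   }
// }
```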
Walkthrough
The pull request replaces direct Anthropic API access in …
Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Caller
    participant readAuthFile as readAuthFile()
    participant refreshOAuth as refreshOAuthToken()
    participant resolveAuth as resolveAuth()
    participant callAnthropic as callAnthropic()
    participant AnthropicAPI as Anthropic API
    Caller->>resolveAuth: research(prompt, model)
    resolveAuth->>readAuthFile: Load auth credentials
    readAuthFile-->>resolveAuth: {oauthToken, apiKey} or env vars
    alt OAuth Token Path
        resolveAuth->>refreshOAuth: Check & refresh token if needed
        refreshOAuth->>AnthropicAPI: POST /refresh (if expired)
        AnthropicAPI-->>refreshOAuth: {accessToken, expiresIn}
        refreshOAuth->>readAuthFile: Persist refreshed token
        readAuthFile-->>refreshOAuth: auth.json saved
        refreshOAuth-->>resolveAuth: accessToken
    else API Key Fallback
        resolveAuth-->>resolveAuth: Use apiKey from auth.json or env
    end
    resolveAuth->>callAnthropic: Call with resolved credentials
    callAnthropic->>callAnthropic: Build headers (Bearer or x-api-key)
    callAnthropic->>AnthropicAPI: POST /messages (with auth header)
    AnthropicAPI-->>callAnthropic: Response
    callAnthropic-->>Caller: Result
```
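To tie the diagram to code, here is a rough TypeScript sketch of the resolution order it depicts. `readAuthFile` and `refreshOAuthToken` are declared as stubs standing in for the real helpers in `.opencode/lib/ai-research.ts`, and the fallback handling is inferred from the diagram, so treat this as an illustration of the flow rather than the actual implementation.

```ts
// Rough sketch of the flow shown above, assuming the AuthEntry shape sketched earlier.
type AuthEntry =
  | { type: "oauth"; refresh: string; access?: string; expires: number }
  | { type: "api"; key: string }

declare function readAuthFile(): Promise<AuthEntry | null>
declare function refreshOAuthToken(auth: Extract<AuthEntry, { type: "oauth" }>): Promise<string>

type ResolvedAuth = { method: "oauth" | "api-key"; token: string }

async function resolveAuth(): Promise<ResolvedAuth> {
  const auth = await readAuthFile()
  if (auth?.type === "oauth") {
    let token = auth.access
    if (!token || auth.expires < Date.now()) {
      token = await refreshOAuthToken(auth) // refresh when missing or already expired
    }
    return { method: "oauth", token }
  }
  // Fallback order: API key stored in auth.json, then the ANTHROPIC_API_KEY env var
  const key = (auth?.type === "api" ? auth.key : undefined) ?? process.env.ANTHROPIC_API_KEY
  if (!key) throw new Error("No Anthropic credentials in auth.json or ANTHROPIC_API_KEY")
  return { method: "api-key", token: key }
}
```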
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 3 passed | ❌ 1 failed
❌ Failed checks (1 warning)
✅ Passed checks (3 passed)
Summary of Changes
Hello @marcusquinn, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request significantly refactors the Anthropic API authentication mechanism, shifting from a reliance on environment variables or a local OpenCode API to a more robust system prioritizing OAuth tokens stored in auth.json.
Highlights
🔍 Code Quality Report
[MONITOR] Code Review Monitoring Report
[INFO] Latest Quality Status:
[INFO] Recent monitoring activity:
📈 Current Quality Metrics
Generated on: Sat Feb 14 13:54:29 UTC 2026 · Generated by AI DevOps Framework Code Review Monitoring
Code Review
This pull request introduces a significant improvement by refactoring the authentication mechanism to support OAuth tokens from auth.json, with a fallback to the ANTHROPIC_API_KEY environment variable. The changes are well-structured, separating authentication logic into dedicated functions, which enhances code clarity and maintainability. I have a couple of suggestions to further improve error handling and code maintainability.
```ts
const key = process.env.ANTHROPIC_API_KEY
if (key) return key

const AUTH_FILE = `${process.env.HOME || "~"}/.local/share/opencode/auth.json`
const OAUTH_CLIENT_ID = "9d1c250a-e61b-44d9-88ed-5944d1962f5e"
```
To improve maintainability and avoid magic strings, it's a good practice to define all API endpoint URLs as constants. You can define the OAuth token URL and the Anthropic API URL here, and then use these constants in refreshOAuthToken and callAnthropic.
```ts
const OAUTH_CLIENT_ID = "9d1c250a-e61b-44d9-88ed-5944d1962f5e"
const OAUTH_TOKEN_URL = "https://console.anthropic.com/v1/oauth/token"
const ANTHROPIC_API_URL = "https://api.anthropic.com/v1/messages"
```

```ts
} catch {
  // Non-fatal: token still works for this request even if we can't persist
}
```
This empty catch block silently ignores errors when persisting the refreshed OAuth token. While this is non-fatal for the current request, it can lead to repeated token refreshes on every subsequent execution if there's a persistent file write issue (e.g., incorrect permissions). This is inefficient and could lead to rate-limiting. Consider logging a warning to inform the user about the persistence failure.
```ts
} catch (e) {
  // Non-fatal: token still works for this request even if we can't persist
  console.warn(`[ai-research] Failed to persist refreshed OAuth token: ${e instanceof Error ? e.message : String(e)}`)
}
```
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.opencode/lib/ai-research.ts (1)
443-457: ⚠️ Potential issue | 🟠 Major
API fetch call has no timeout — can hang indefinitely.
Same pattern as the token refresh: no `AbortSignal.timeout()` on the Anthropic Messages API call. For a sub-worker that's meant to be lightweight, an unbounded hang is particularly problematic as it stalls the caller's workflow.
🛡️ Suggested fix

```diff
 const response = await fetch(url, {
   method: "POST",
   headers,
   body: JSON.stringify({
     model: modelId,
     max_tokens: maxTokens,
     system: systemPrompt,
     messages: [{ role: "user", content: req.prompt }],
   }),
+  signal: AbortSignal.timeout(30_000),
 })
```
🧹 Nitpick comments (1)
.opencode/lib/ai-research.ts (1)
310-319: No shape validation on the auth.json `anthropic` entry.
`readAuthFile` trusts the JSON structure blindly. If auth.json is malformed (e.g., `type: "oauth"` but missing `refresh` or `expires`), this will silently pass a broken object to `refreshOAuthToken`, which will POST an `undefined` refresh token to the OAuth endpoint. Consider a lightweight guard:
🛡️ Suggested validation

```diff
   const data = await file.json()
-  return data.anthropic || null
+  const entry = data.anthropic
+  if (!entry || typeof entry !== "object" || !entry.type) return null
+  if (entry.type === "oauth" && (!entry.refresh || typeof entry.expires !== "number")) return null
+  if (entry.type === "api" && !entry.key) return null
+  return entry as AuthEntry
```
```diff
 const MODEL_MAP: Record<string, string> = {
-  haiku: "claude-3-5-haiku-20241022",
+  haiku: "claude-3-haiku-20240307",
```
🧩 Analysis chain
🏁 Script executed:
```bash
#!/bin/bash
# First, verify the file exists and read the MODEL_MAP section
if [ -f ".opencode/lib/ai-research.ts" ]; then
  echo "=== File found. Reading MODEL_MAP section (lines 40-55) ==="
  head -n 55 ".opencode/lib/ai-research.ts" | tail -n 20
else
  echo "File not found at .opencode/lib/ai-research.ts"
  find . -name "ai-research.ts" -type f 2>/dev/null
fi
```

Repository: marcusquinn/aidevops
Length of output: 740
🏁 Script executed:
```bash
#!/bin/bash
# Search for any comments or context around the model choice
rg -A 5 -B 5 "claude-3.*haiku" .opencode/lib/ai-research.ts
```

Repository: marcusquinn/aidevops
Length of output: 398
🏁 Script executed:
```bash
#!/bin/bash
# Check git history for this specific line if .git exists
if [ -d ".git" ]; then
  echo "=== Git blame for the haiku line ==="
  git blame -L 40,50 ".opencode/lib/ai-research.ts" 2>/dev/null || echo "Git blame not available"
  echo ""
  echo "=== Recent commits to this file ==="
  git log --oneline -n 10 -- ".opencode/lib/ai-research.ts" 2>/dev/null || echo "Git log not available"
else
  echo "No .git directory found"
fi
```

Repository: marcusquinn/aidevops
Length of output: 1139
🏁 Script executed:
```bash
#!/bin/bash
# Search for any comments mentioning OAuth, model selection, or versioning decisions
rg -i "oauth|model.*version|3\.5|3\.0|haiku.*reason|claude.*downgrade" ".opencode/lib/ai-research.ts" -A 2 -B 2
```

Repository: marcusquinn/aidevops
Length of output: 2673
🏁 Script executed:
```bash
#!/bin/bash
# Check the entire MODEL_MAP to understand the pattern
rg -A 20 "MODEL_MAP\s*=" ".opencode/lib/ai-research.ts"
```

Repository: marcusquinn/aidevops
Length of output: 46
Clarify rationale for haiku model downgrade — version pattern inconsistency
The haiku model is pinned to Claude 3.0 (claude-3-haiku-20240307), while sonnet and opus use the latest versions (4-20250514). If this downgrade is required for OAuth/beta compatibility, document it. Otherwise, Claude 3.5 Haiku offers better quality at the same tier. This version inconsistency should be intentional and explained.
🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts at line 45, The haiku model entry (haiku:
"claude-3-haiku-20240307") is pinned to an older Claude 3.0 version while other
models (sonnet, opus) use 4-20250514; either update the haiku value to the
intended current model (e.g., claude-3.5-haiku or the matching 4-2025..
identifier) or add a clarifying comment/docstring explaining why the downgrade
is intentional for OAuth/beta compatibility. Locate the model mapping where the
haiku key is defined in ai-research.ts (the haiku property) and change the
string to the correct model ID or insert a one-line comment above the haiku
entry stating the compatibility reason and date/version, so the inconsistency is
explicit.
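For illustration only, a comment along these lines would make the pin explicit. The sonnet/opus IDs are assumed from the "4-20250514" pattern mentioned above, and the stated rationale is a placeholder rather than a confirmed reason.

```ts
// Illustrative only: hypothetical annotation making the haiku pin explicit.
const MODEL_MAP: Record<string, string> = {
  // Pinned to Claude 3 Haiku for OAuth/beta compatibility (placeholder rationale;
  // replace with the real reason, or bump to claude-3-5-haiku-20241022)
  haiku: "claude-3-haiku-20240307",
  sonnet: "claude-sonnet-4-20250514", // full ID assumed from the "4-20250514" pattern noted above
  opus: "claude-opus-4-20250514", // full ID assumed from the "4-20250514" pattern noted above
}
```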
```ts
async function refreshOAuthToken(auth: OAuthAuth): Promise<string> {
  const response = await fetch(
    "https://console.anthropic.com/v1/oauth/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        grant_type: "refresh_token",
        refresh_token: auth.refresh,
        client_id: OAUTH_CLIENT_ID,
      }),
    }
  )

  if (!response.ok) {
    throw new Error(`OAuth token refresh failed (${response.status})`)
  }

  const json = (await response.json()) as {
    access_token: string
    refresh_token: string
    expires_in: number
  }

  // Update auth.json with new tokens
  try {
    const file = Bun.file(AUTH_FILE)
    const data = await file.json()
    data.anthropic = {
      type: "oauth",
      refresh: json.refresh_token,
      access: json.access_token,
      expires: Date.now() + json.expires_in * 1000,
    }
    await Bun.write(AUTH_FILE, JSON.stringify(data, null, 2))
  } catch {
    // Non-fatal: token still works for this request even if we can't persist
  }

  return json.access_token
}
```
Token refresh lacks a timeout and response validation.
Two defensive gaps in `refreshOAuthToken`:
- No fetch timeout — if `console.anthropic.com` is unresponsive, this hangs indefinitely, blocking the caller.
- No validation of the response body — if the response lacks `access_token` or `refresh_token`, the function returns `undefined` as the access token and persists broken state to auth.json.
🛡️ Suggested hardening

```diff
   const response = await fetch(
     "https://console.anthropic.com/v1/oauth/token",
     {
       method: "POST",
       headers: { "Content-Type": "application/json" },
       body: JSON.stringify({
         grant_type: "refresh_token",
         refresh_token: auth.refresh,
         client_id: OAUTH_CLIENT_ID,
       }),
+      signal: AbortSignal.timeout(15_000),
     }
   )
   if (!response.ok) {
     throw new Error(`OAuth token refresh failed (${response.status})`)
   }
   const json = (await response.json()) as {
     access_token: string
     refresh_token: string
     expires_in: number
   }
+  if (!json.access_token || !json.refresh_token) {
+    throw new Error("OAuth token refresh returned incomplete response")
+  }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
async function refreshOAuthToken(auth: OAuthAuth): Promise<string> {
  const response = await fetch(
    "https://console.anthropic.com/v1/oauth/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        grant_type: "refresh_token",
        refresh_token: auth.refresh,
        client_id: OAUTH_CLIENT_ID,
      }),
      signal: AbortSignal.timeout(15_000),
    }
  )
  if (!response.ok) {
    throw new Error(`OAuth token refresh failed (${response.status})`)
  }
  const json = (await response.json()) as {
    access_token: string
    refresh_token: string
    expires_in: number
  }
  if (!json.access_token || !json.refresh_token) {
    throw new Error("OAuth token refresh returned incomplete response")
  }
  // Update auth.json with new tokens
  try {
    const file = Bun.file(AUTH_FILE)
    const data = await file.json()
    data.anthropic = {
      type: "oauth",
      refresh: json.refresh_token,
      access: json.access_token,
      expires: Date.now() + json.expires_in * 1000,
    }
    await Bun.write(AUTH_FILE, JSON.stringify(data, null, 2))
  } catch {
    // Non-fatal: token still works for this request even if we can't persist
  }
  return json.access_token
}
```
🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts around lines 325 - 365, The refreshOAuthToken
function currently can hang and may persist invalid tokens; add a fetch timeout
(use AbortController with a reasonable timeout) when calling
fetch("https://console.anthropic.com/v1/oauth/token") and abort the request on
timeout, and after parsing the JSON validate that json.access_token and
json.refresh_token (and expires_in) are present and of the expected types; if
validation fails, throw a clear Error (do not return undefined) and avoid
writing to AUTH_FILE; only update Bun.file(AUTH_FILE) / Bun.write(AUTH_FILE,
...) when validated fields exist, and keep the existing try/catch around the
write so write failures remain non-fatal but log or include the underlying error
in the thrown or logged message where appropriate (referencing
refreshOAuthToken, AUTH_FILE, OAUTH_CLIENT_ID, Bun.file, Bun.write).
```ts
if (auth?.type === "oauth") {
  let accessToken = auth.access
  if (!accessToken || auth.expires < Date.now()) {
    accessToken = await refreshOAuthToken(auth)
  }
  return { method: "oauth", token: accessToken }
}
```
Add an expiry buffer to avoid using near-expired tokens.
Line 387 checks auth.expires < Date.now() with zero margin. If the token expires in a few seconds, the subsequent API call will fail mid-flight. Standard practice is to add a buffer (e.g., 60 seconds) to proactively refresh before expiry.
🛡️ Suggested fix

```diff
+const EXPIRY_BUFFER_MS = 60_000 // refresh 60s before expiry
+
 if (auth?.type === "oauth") {
   let accessToken = auth.access
-  if (!accessToken || auth.expires < Date.now()) {
+  if (!accessToken || auth.expires < Date.now() + EXPIRY_BUFFER_MS) {
     accessToken = await refreshOAuthToken(auth)
   }
   return { method: "oauth", token: accessToken }
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const EXPIRY_BUFFER_MS = 60_000 // refresh 60s before expiry

if (auth?.type === "oauth") {
  let accessToken = auth.access
  if (!accessToken || auth.expires < Date.now() + EXPIRY_BUFFER_MS) {
    accessToken = await refreshOAuthToken(auth)
  }
  return { method: "oauth", token: accessToken }
}
```
🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts around lines 385 - 391, The token expiry check
in the oauth branch (the block that returns { method: "oauth", token:
accessToken }) compares auth.expires to Date.now() with no margin; change the
condition to proactively refresh when the token is near expiry by using a buffer
(e.g., 60_000 ms). Specifically, in the code that reads auth?.type === "oauth",
update the expiry check from auth.expires < Date.now() to something like
auth.expires < Date.now() + EXPIRY_BUFFER (or auth.expires - EXPIRY_BUFFER <
Date.now()) and define EXPIRY_BUFFER = 60_000 (or similar) in scope so
refreshOAuthToken(auth) is called before the token is about to expire.



Summary
- Reads OAuth tokens from `~/.local/share/opencode/auth.json` as the primary auth path (no `ANTHROPIC_API_KEY` needed)
- Uses the `anthropic-beta: oauth-2025-04-20` header + `?beta=true` query param for direct Anthropic API access with OAuth tokens
- Falls back to the `ANTHROPIC_API_KEY` env var if no OAuth auth is available
- No longer requires `opencode serve` to be running

How it works
The `opencode-anthropic-auth` plugin (built into OpenCode) uses the OAuth PKCE flow with Anthropic's console. The access token works directly with the Anthropic Messages API when the `anthropic-beta: oauth-2025-04-20` header is included. This tool now reads those same tokens from `auth.json` and makes direct API calls.
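As a hedged illustration of that header handling, the sketch below maps the two auth methods to request headers. `buildAnthropicRequest` and the `ResolvedAuth` shape are hypothetical helpers, and the `anthropic-version` value is the standard Messages API version rather than something stated in this PR; the Bearer/x-api-key split, the `anthropic-beta: oauth-2025-04-20` header, and the `?beta=true` query param come from the PR description.

```ts
// Sketch of header selection per auth method. ResolvedAuth is an assumed shape;
// the oauth beta header and ?beta=true param are taken from the PR description above.
type ResolvedAuth = { method: "oauth" | "api-key"; token: string }

function buildAnthropicRequest(auth: ResolvedAuth): { url: string; headers: Record<string, string> } {
  const base = "https://api.anthropic.com/v1/messages"
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
    "anthropic-version": "2023-06-01", // standard Messages API version header
  }
  if (auth.method === "oauth") {
    // OAuth access tokens are sent as a Bearer token together with the oauth beta header
    headers.Authorization = `Bearer ${auth.token}`
    headers["anthropic-beta"] = "oauth-2025-04-20"
    return { url: `${base}?beta=true`, headers }
  }
  // API-key fallback uses the standard x-api-key header
  headers["x-api-key"] = auth.token
  return { url: base, headers }
}
```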
Testing
- Direct `api.anthropic.com` call with Bearer token
- `opencode run` with a simple query (2+2) — tool returned the correct answer