t1014: ai-research OAuth auth from auth.json #1475

Merged
marcusquinn merged 1 commit into main from bugfix/ai-research-oauth-fallback on Feb 14, 2026

Conversation

@marcusquinn marcusquinn commented Feb 14, 2026

Summary

  • Reads OAuth tokens from ~/.local/share/opencode/auth.json as the primary auth path (no ANTHROPIC_API_KEY needed)
  • Uses anthropic-beta: oauth-2025-04-20 header + ?beta=true query param for direct Anthropic API access with OAuth tokens
  • Handles automatic token refresh when access token expires
  • Falls back to ANTHROPIC_API_KEY env var if no OAuth auth is available
  • Removes the previous OpenCode local API approach (port scanning, session creation) which required opencode serve to be running

How it works

The opencode-anthropic-auth plugin (built into OpenCode) uses an OAuth PKCE flow with Anthropic's console. The access token works directly with the Anthropic Messages API when the anthropic-beta: oauth-2025-04-20 header is included. This tool now reads those same tokens from auth.json and makes direct API calls.
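
As an illustration only, here is a minimal sketch of that flow, assuming Bun, a valid and non-expired OAuth entry in auth.json, and an illustrative function name quickResearch. It is not the PR's actual code; the model choice, max_tokens, and response handling are assumptions, and token refresh is omitted (the real refreshOAuthToken is quoted in the review comments below).

// Sketch, not the PR's code: direct Messages API call with the OAuth token OpenCode stores.
// Assumes the access token is present and not expired; refresh handling is omitted here.
const AUTH_FILE = `${process.env.HOME || "~"}/.local/share/opencode/auth.json`

async function quickResearch(prompt: string): Promise<string> {
  const auth = (await Bun.file(AUTH_FILE).json()).anthropic
  if (auth?.type !== "oauth" || !auth.access) {
    throw new Error("No OAuth entry in auth.json; fall back to ANTHROPIC_API_KEY")
  }

  // ?beta=true plus the oauth beta header enable OAuth access to the Messages API
  const response = await fetch("https://api.anthropic.com/v1/messages?beta=true", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "anthropic-version": "2023-06-01",
      "anthropic-beta": "oauth-2025-04-20",
      Authorization: `Bearer ${auth.access}`,
    },
    body: JSON.stringify({
      model: "claude-3-haiku-20240307",
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    }),
  })
  if (!response.ok) throw new Error(`Anthropic API error (${response.status})`)

  const json = (await response.json()) as { content: Array<{ type: string; text?: string }> }
  return json.content.map((block) => block.text ?? "").join("")
}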

Testing

  • Verified OAuth API call works with curl (direct api.anthropic.com call with Bearer token)
  • Tested via opencode run with simple query (2+2) — tool returned correct answer
  • Tested with domain context loading (git domain) — agent files loaded and contextual answer returned
  • Build compiles cleanly with esbuild

Summary by CodeRabbit

  • New Features

    • Introduced OAuth token authentication with automatic refresh capability
    • Added API key fallback authentication option
  • Updates

    • Enhanced authentication error handling with clearer setup guidance
    • Updated Claude model version for improved performance

…ANTHROPIC_API_KEY

Read OAuth tokens from ~/.local/share/opencode/auth.json (primary path),
with ANTHROPIC_API_KEY as fallback. Uses anthropic-beta: oauth-2025-04-20
header for direct Anthropic API access. Handles token refresh automatically.

Removes the OpenCode local API approach (port scanning, session creation)
which required opencode serve to be running separately.

coderabbitai bot commented Feb 14, 2026

Walkthrough

The pull request replaces direct Anthropic API access in .opencode/lib/ai-research.ts with a dual authentication system supporting OAuth tokens (read from auth.json, with automatic refresh and persistence) and API-key fallback (from auth.json or ANTHROPIC_API_KEY environment variable). The haiku model mapping is also updated.
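
For orientation, the auth.json entry shape this flow appears to rely on, inferred from the snippets quoted in the review comments below. This is an assumption, not an official OpenCode schema.

// Assumed shape of auth.json's "anthropic" entry, inferred from the review snippets below.
type OAuthAuth = {
  type: "oauth"
  access?: string // current access token (may be missing or expired)
  refresh: string // refresh token sent to the OAuth token endpoint
  expires: number // epoch milliseconds at which the access token expires
}

type ApiKeyAuth = {
  type: "api"
  key: string // plain Anthropic API key stored by OpenCode
}

type AuthEntry = OAuthAuth | ApiKeyAuth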

Changes

  • Cohort: Authentication System Overhaul
    File(s): .opencode/lib/ai-research.ts
    Summary: Introduced OAuth and API-key authentication flows with functions for reading auth tokens, refreshing OAuth tokens, resolving auth credentials, and making authenticated API calls. Added dynamic header construction (Bearer token + anthropic-beta for OAuth; x-api-key for API-key mode) with enhanced error handling for token refresh failures and missing credentials. Updated the haiku model mapping from claude-3-5-haiku-20241022 to claude-3-haiku-20240307.
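
A condensed sketch of how the credential resolution and header branching fit together, based on this summary and the snippets quoted later in the review. The API-key return shape, the error text, and the buildHeaders helper are assumptions rather than the file's exact code.

// Sketch only: composes the pieces named above. Local types mirror the auth.json sketch earlier.
type OAuthAuth = { type: "oauth"; access?: string; refresh: string; expires: number }
type AuthEntry = OAuthAuth | { type: "api"; key: string }

// Declared here so the sketch type-checks; the real implementations live in ai-research.ts.
declare function readAuthFile(): Promise<AuthEntry | null>
declare function refreshOAuthToken(auth: OAuthAuth): Promise<string>

type ResolvedAuth =
  | { method: "oauth"; token: string }
  | { method: "api"; key: string } // assumed shape for the API-key branch

async function resolveAuth(): Promise<ResolvedAuth> {
  const auth = await readAuthFile() // auth.json "anthropic" entry, if any
  if (auth?.type === "oauth") {
    let accessToken = auth.access
    if (!accessToken || auth.expires < Date.now()) {
      accessToken = await refreshOAuthToken(auth) // POSTs the refresh token, persists the result
    }
    return { method: "oauth", token: accessToken }
  }
  const key = (auth?.type === "api" ? auth.key : undefined) ?? process.env.ANTHROPIC_API_KEY
  if (key) return { method: "api", key }
  throw new Error("No Anthropic credentials found: set up OpenCode OAuth or export ANTHROPIC_API_KEY")
}

function buildHeaders(auth: ResolvedAuth): Record<string, string> {
  const common = { "Content-Type": "application/json", "anthropic-version": "2023-06-01" }
  return auth.method === "oauth"
    ? { ...common, Authorization: `Bearer ${auth.token}`, "anthropic-beta": "oauth-2025-04-20" }
    : { ...common, "x-api-key": auth.key }
}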

Sequence Diagram(s)

sequenceDiagram
    participant Caller
    participant readAuthFile as readAuthFile()
    participant refreshOAuth as refreshOAuthToken()
    participant resolveAuth as resolveAuth()
    participant callAnthropic as callAnthropic()
    participant AnthropicAPI as Anthropic API

    Caller->>resolveAuth: research(prompt, model)
    resolveAuth->>readAuthFile: Load auth credentials
    readAuthFile-->>resolveAuth: {oauthToken, apiKey} or env vars
    
    alt OAuth Token Path
        resolveAuth->>refreshOAuth: Check & refresh token if needed
        refreshOAuth->>AnthropicAPI: POST /refresh (if expired)
        AnthropicAPI-->>refreshOAuth: {accessToken, expiresIn}
        refreshOAuth->>readAuthFile: Persist refreshed token
        readAuthFile-->>refreshOAuth: auth.json saved
        refreshOAuth-->>resolveAuth: accessToken
    else API Key Fallback
        resolveAuth-->>resolveAuth: Use apiKey from auth.json or env
    end
    
    resolveAuth->>callAnthropic: Call with resolved credentials
    callAnthropic->>callAnthropic: Build headers (Bearer or x-api-key)
    callAnthropic->>AnthropicAPI: POST /messages (with auth header)
    AnthropicAPI-->>callAnthropic: Response
    callAnthropic-->>Caller: Result

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🔐 Tokens dance, OAuth flows,
API keys as backup repose,
Headers switch with grace,
Auth finds its place,
Haiku speaks in newer prose! 🎭

🚥 Pre-merge checks | ✅ 3 | ❌ 1

❌ Failed checks (1 warning)

  • Merge Conflict Detection: ⚠️ Warning. Merge conflicts detected (2 files):
    ⚔️ .opencode/lib/ai-research.ts (content)
    ⚔️ MODELS.md (content)
    These conflicts must be resolved before merging into main. Resolution: resolve conflicts locally and push changes to this branch.

✅ Passed checks (3 passed)

  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title accurately describes the main change: implementing OAuth authentication for ai-research that reads tokens from auth.json, which is the primary objective of the pull request.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which meets the required threshold of 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@gemini-code-assist

Summary of Changes

Hello @marcusquinn, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly refactors the Anthropic API authentication mechanism, shifting from a reliance on environment variables or a local OpenCode API to a more robust system prioritizing OAuth tokens stored in auth.json. This change streamlines the authentication process, includes automatic token refreshing, and removes an external dependency, enhancing the tool's independence and user experience.

Highlights

  • OAuth Integration: Implemented primary authentication using OAuth tokens read from ~/.local/share/opencode/auth.json, providing a more streamlined and secure method.
  • Automatic Token Refresh: Added functionality for automatic refreshing of expired OAuth access tokens, ensuring continuous API access without manual intervention.
  • API Key Fallback: Maintained support for ANTHROPIC_API_KEY environment variable as a fallback authentication method if OAuth tokens are not available.
  • Direct Anthropic API Access: Utilized the anthropic-beta: oauth-2025-04-20 header and ?beta=true query parameter for direct Anthropic API calls when using OAuth.
  • Removed OpenCode Local API Dependency: Eliminated the previous authentication approach that required opencode serve to be running, simplifying the tool's architecture and reducing external dependencies.
Changelog
  • .opencode/lib/ai-research.ts
    • Updated authentication mechanism to prioritize OAuth from auth.json.
    • Implemented OAuth token refresh logic.
    • Refactored Anthropic API call to support both OAuth and API key authentication.
    • Removed direct ANTHROPIC_API_KEY environment variable usage in favor of a new resolveAuth function.
    • Updated the haiku model ID.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review: /gemini review. Performs a code review for the current pull request in its current state.
  • Pull Request Summary: /gemini summary. Provides a summary of the current pull request in its current state.
  • Comment: @gemini-code-assist. Responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help: /gemini help. Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@github-actions

🔍 Code Quality Report

[MONITOR] Code Review Monitoring Report

[INFO] Latest Quality Status:
SonarCloud: 0 bugs, 0 vulnerabilities, 5 code smells

[INFO] Recent monitoring activity:
Sat Feb 14 13:54:26 UTC 2026: Code review monitoring started
Sat Feb 14 13:54:26 UTC 2026: SonarCloud - Bugs: 0, Vulnerabilities: 0, Code Smells: 5

📈 Current Quality Metrics

  • BUGS: 0
  • CODE SMELLS: 5
  • VULNERABILITIES: 0

Generated on: Sat Feb 14 13:54:29 UTC 2026


Generated by AI DevOps Framework Code Review Monitoring

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a significant improvement by refactoring the authentication mechanism to support OAuth tokens from auth.json, with a fallback to the ANTHROPIC_API_KEY environment variable. The changes are well-structured, separating authentication logic into dedicated functions, which enhances code clarity and maintainability. I have a couple of suggestions to further improve error handling and code maintainability.

const key = process.env.ANTHROPIC_API_KEY
if (key) return key
const AUTH_FILE = `${process.env.HOME || "~"}/.local/share/opencode/auth.json`
const OAUTH_CLIENT_ID = "9d1c250a-e61b-44d9-88ed-5944d1962f5e"

medium

To improve maintainability and avoid magic strings, it's a good practice to define all API endpoint URLs as constants. You can define the OAuth token URL and the Anthropic API URL here, and then use these constants in refreshOAuthToken and callAnthropic.

const OAUTH_CLIENT_ID = "9d1c250a-e61b-44d9-88ed-5944d1962f5e"
const OAUTH_TOKEN_URL = "https://console.anthropic.com/v1/oauth/token"
const ANTHROPIC_API_URL = "https://api.anthropic.com/v1/messages"

Comment on lines +360 to +362
  } catch {
    // Non-fatal: token still works for this request even if we can't persist
  }

medium

This empty catch block silently ignores errors when persisting the refreshed OAuth token. While this is non-fatal for the current request, it can lead to repeated token refreshes on every subsequent execution if there's a persistent file write issue (e.g., incorrect permissions). This is inefficient and could lead to rate-limiting. Consider logging a warning to inform the user about the persistence failure.

  } catch (e) {
    // Non-fatal: token still works for this request even if we can't persist
    console.warn(`[ai-research] Failed to persist refreshed OAuth token: ${e instanceof Error ? e.message : String(e)}`)
  }

@marcusquinn marcusquinn merged commit b73457e into main Feb 14, 2026
17 of 19 checks passed

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
.opencode/lib/ai-research.ts (1)

443-457: ⚠️ Potential issue | 🟠 Major

API fetch call has no timeout — can hang indefinitely.

Same pattern as the token refresh: no AbortSignal.timeout() on the Anthropic Messages API call. For a sub-worker that's meant to be lightweight, an unbounded hang is particularly problematic as it stalls the caller's workflow.

🛡️ Suggested fix
  const response = await fetch(url, {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: modelId,
      max_tokens: maxTokens,
      system: systemPrompt,
      messages: [{ role: "user", content: req.prompt }],
    }),
+   signal: AbortSignal.timeout(30_000),
  })
🧹 Nitpick comments (1)
.opencode/lib/ai-research.ts (1)

310-319: No shape validation on the auth.json anthropic entry.

readAuthFile trusts the JSON structure blindly. If auth.json is malformed (e.g., type: "oauth" but missing refresh or expires), this will silently pass a broken object to refreshOAuthToken, which will POST an undefined refresh token to the OAuth endpoint. Consider a lightweight guard:

🛡️ Suggested validation
   const data = await file.json()
-  return data.anthropic || null
+  const entry = data.anthropic
+  if (!entry || typeof entry !== "object" || !entry.type) return null
+  if (entry.type === "oauth" && (!entry.refresh || typeof entry.expires !== "number")) return null
+  if (entry.type === "api" && !entry.key) return null
+  return entry as AuthEntry


 const MODEL_MAP: Record<string, string> = {
-  haiku: "claude-3-5-haiku-20241022",
+  haiku: "claude-3-haiku-20240307",

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# First, verify the file exists and read the MODEL_MAP section
if [ -f ".opencode/lib/ai-research.ts" ]; then
  echo "=== File found. Reading MODEL_MAP section (lines 40-55) ==="
  head -n 55 ".opencode/lib/ai-research.ts" | tail -n 20
else
  echo "File not found at .opencode/lib/ai-research.ts"
  find . -name "ai-research.ts" -type f 2>/dev/null
fi

Repository: marcusquinn/aidevops

Length of output: 740


🏁 Script executed:

#!/bin/bash
# Search for any comments or context around the model choice
rg -A 5 -B 5 "claude-3.*haiku" .opencode/lib/ai-research.ts

Repository: marcusquinn/aidevops

Length of output: 398


🏁 Script executed:

#!/bin/bash
# Check git history for this specific line if .git exists
if [ -d ".git" ]; then
  echo "=== Git blame for the haiku line ==="
  git blame -L 40,50 ".opencode/lib/ai-research.ts" 2>/dev/null || echo "Git blame not available"
  echo ""
  echo "=== Recent commits to this file ==="
  git log --oneline -n 10 -- ".opencode/lib/ai-research.ts" 2>/dev/null || echo "Git log not available"
else
  echo "No .git directory found"
fi

Repository: marcusquinn/aidevops

Length of output: 1139


🏁 Script executed:

#!/bin/bash
# Search for any comments mentioning OAuth, model selection, or versioning decisions
rg -i "oauth|model.*version|3\.5|3\.0|haiku.*reason|claude.*downgrade|claude.*downgrade" ".opencode/lib/ai-research.ts" -A 2 -B 2

Repository: marcusquinn/aidevops

Length of output: 2673


🏁 Script executed:

#!/bin/bash
# Check the entire MODEL_MAP to understand the pattern
rg -A 20 "MODEL_MAP\s*=" ".opencode/lib/ai-research.ts"

Repository: marcusquinn/aidevops

Length of output: 46


Clarify rationale for haiku model downgrade — version pattern inconsistency

The haiku model is pinned to Claude 3.0 (claude-3-haiku-20240307), while sonnet and opus use the latest versions (4-20250514). If this downgrade is required for OAuth/beta compatibility, document it; otherwise, Claude 3.5 Haiku offers better quality at the same tier. Either way, the version inconsistency should be explicit and explained.

🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts at line 45, The haiku model entry (haiku:
"claude-3-haiku-20240307") is pinned to an older Claude 3.0 version while other
models (sonnet, opus) use 4-20250514; either update the haiku value to the
intended current model (e.g., claude-3.5-haiku or the matching 4-2025..
identifier) or add a clarifying comment/docstring explaining why the downgrade
is intentional for OAuth/beta compatibility. Locate the model mapping where the
haiku key is defined in ai-research.ts (the haiku property) and change the
string to the correct model ID or insert a one-line comment above the haiku
entry stating the compatibility reason and date/version, so the inconsistency is
explicit.
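
If the pin turns out to be intentional, the remedy this comment asks for could look like the sketch below. The stated reason is a placeholder to be confirmed by the author, not an established fact.

const MODEL_MAP: Record<string, string> = {
  // Pinned to Claude 3.0 Haiku (20240307) on purpose: <confirm reason, e.g. OAuth/beta
  // compatibility>. Revisit once claude-3-5-haiku-20241022 is verified to work on this path.
  haiku: "claude-3-haiku-20240307",
  // ...sonnet and opus entries unchanged
}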

Comment on lines +325 to 365
async function refreshOAuthToken(auth: OAuthAuth): Promise<string> {
  const response = await fetch(
    "https://console.anthropic.com/v1/oauth/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        grant_type: "refresh_token",
        refresh_token: auth.refresh,
        client_id: OAUTH_CLIENT_ID,
      }),
    }
  )

  if (!response.ok) {
    throw new Error(`OAuth token refresh failed (${response.status})`)
  }

  const json = (await response.json()) as {
    access_token: string
    refresh_token: string
    expires_in: number
  }

  // Update auth.json with new tokens
  try {
    const file = Bun.file(AUTH_FILE)
    const data = await file.json()
    data.anthropic = {
      type: "oauth",
      refresh: json.refresh_token,
      access: json.access_token,
      expires: Date.now() + json.expires_in * 1000,
    }
    await Bun.write(AUTH_FILE, JSON.stringify(data, null, 2))
  } catch {
    // Non-fatal: token still works for this request even if we can't persist
  }

  return json.access_token
}

⚠️ Potential issue | 🟠 Major

Token refresh lacks a timeout and response validation.

Two defensive gaps in refreshOAuthToken:

  1. No fetch timeout — if console.anthropic.com is unresponsive, this hangs indefinitely, blocking the caller.
  2. No validation of the response body — if the response lacks access_token or refresh_token, the function returns undefined as the access token and persists broken state to auth.json.
🛡️ Suggested hardening
  const response = await fetch(
    "https://console.anthropic.com/v1/oauth/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        grant_type: "refresh_token",
        refresh_token: auth.refresh,
        client_id: OAUTH_CLIENT_ID,
      }),
+     signal: AbortSignal.timeout(15_000),
    }
  )

  if (!response.ok) {
    throw new Error(`OAuth token refresh failed (${response.status})`)
  }

  const json = (await response.json()) as {
    access_token: string
    refresh_token: string
    expires_in: number
  }

+ if (!json.access_token || !json.refresh_token) {
+   throw new Error("OAuth token refresh returned incomplete response")
+ }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change (full function with the hardening applied):

async function refreshOAuthToken(auth: OAuthAuth): Promise<string> {
  const response = await fetch(
    "https://console.anthropic.com/v1/oauth/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        grant_type: "refresh_token",
        refresh_token: auth.refresh,
        client_id: OAUTH_CLIENT_ID,
      }),
      signal: AbortSignal.timeout(15_000),
    }
  )

  if (!response.ok) {
    throw new Error(`OAuth token refresh failed (${response.status})`)
  }

  const json = (await response.json()) as {
    access_token: string
    refresh_token: string
    expires_in: number
  }

  if (!json.access_token || !json.refresh_token) {
    throw new Error("OAuth token refresh returned incomplete response")
  }

  // Update auth.json with new tokens
  try {
    const file = Bun.file(AUTH_FILE)
    const data = await file.json()
    data.anthropic = {
      type: "oauth",
      refresh: json.refresh_token,
      access: json.access_token,
      expires: Date.now() + json.expires_in * 1000,
    }
    await Bun.write(AUTH_FILE, JSON.stringify(data, null, 2))
  } catch {
    // Non-fatal: token still works for this request even if we can't persist
  }

  return json.access_token
}
🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts around lines 325 - 365, The refreshOAuthToken
function currently can hang and may persist invalid tokens; add a fetch timeout
(use AbortController with a reasonable timeout) when calling
fetch("https://console.anthropic.com/v1/oauth/token") and abort the request on
timeout, and after parsing the JSON validate that json.access_token and
json.refresh_token (and expires_in) are present and of the expected types; if
validation fails, throw a clear Error (do not return undefined) and avoid
writing to AUTH_FILE; only update Bun.file(AUTH_FILE) / Bun.write(AUTH_FILE,
...) when validated fields exist, and keep the existing try/catch around the
write so write failures remain non-fatal but log or include the underlying error
in the thrown or logged message where appropriate (referencing
refreshOAuthToken, AUTH_FILE, OAUTH_CLIENT_ID, Bun.file, Bun.write).

Comment on lines +385 to 391
  if (auth?.type === "oauth") {
    let accessToken = auth.access
    if (!accessToken || auth.expires < Date.now()) {
      accessToken = await refreshOAuthToken(auth)
    }
    return { method: "oauth", token: accessToken }
  }

⚠️ Potential issue | 🟡 Minor

Add an expiry buffer to avoid using near-expired tokens.

Line 387 checks auth.expires < Date.now() with zero margin. If the token expires in a few seconds, the subsequent API call will fail mid-flight. Standard practice is to add a buffer (e.g., 60 seconds) to proactively refresh before expiry.

🛡️ Suggested fix
+ const EXPIRY_BUFFER_MS = 60_000 // refresh 60s before expiry
+
  if (auth?.type === "oauth") {
    let accessToken = auth.access
-   if (!accessToken || auth.expires < Date.now()) {
+   if (!accessToken || auth.expires < Date.now() + EXPIRY_BUFFER_MS) {
      accessToken = await refreshOAuthToken(auth)
    }
    return { method: "oauth", token: accessToken }
  }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change (with the expiry buffer applied):

  const EXPIRY_BUFFER_MS = 60_000 // refresh 60s before expiry

  if (auth?.type === "oauth") {
    let accessToken = auth.access
    if (!accessToken || auth.expires < Date.now() + EXPIRY_BUFFER_MS) {
      accessToken = await refreshOAuthToken(auth)
    }
    return { method: "oauth", token: accessToken }
  }
🤖 Prompt for AI Agents
In @.opencode/lib/ai-research.ts around lines 385 - 391, The token expiry check
in the oauth branch (the block that returns { method: "oauth", token:
accessToken }) compares auth.expires to Date.now() with no margin; change the
condition to proactively refresh when the token is near expiry by using a buffer
(e.g., 60_000 ms). Specifically, in the code that reads auth?.type === "oauth",
update the expiry check from auth.expires < Date.now() to something like
auth.expires < Date.now() + EXPIRY_BUFFER (or auth.expires - EXPIRY_BUFFER <
Date.now()) and define EXPIRY_BUFFER = 60_000 (or similar) in scope so
refreshOAuthToken(auth) is called before the token is about to expire.
