fix(desktop): PR #313 small-model regression (OAuth + provider fallback) #318
Conversation
…regression) PR #313 (mastracode 0.14 + small-model refactor) shipped three regressions for fork users:

1. Anthropic OAuth / AUTH_TOKEN broken. get-small-model routed the token through createAnthropic({ apiKey }), but the fork's OAuth / managed-proxy credentials require authToken + anthropic-beta / user-agent / x-app headers (see getAnthropicProviderOptions). Users connected via Claude Code OAuth, or through a gateway that only emits ANTHROPIC_AUTH_TOKEN, got 401s on every small-model task.

2. Provider fallback dropped. The old callSmallModel iterated Anthropic then OpenAI. The post-superset-sh#3517 shim collapsed to a single attempt, so a failing Anthropic account never fell through to OpenAI.

3. OAuth-only users saw "Account not connected". When getSmallModel returned null (because auth.json only had OAuth entries), the shim fabricated two "missing-credentials" attempts regardless of what was actually connected.

Fix: replace the simple getSmallModel() with a fork-maintained getSmallModelCandidates() that returns the full priority list (Anthropic env -> keychain/config -> auth-storage -> managed env config -> OpenAI env -> OpenAI auth-storage), with OAuth / API key / AUTH_TOKEN each routed through the correct AI-SDK provider options (getAnthropicProviderOptions for Anthropic, createOpenAICodexOAuthModel for OpenAI Codex OAuth). getSmallModel() stays as an upstream-compatible wrapper over the first viable candidate. The callSmallModel shim now iterates candidates and surfaces per-attempt outcomes so describeEnhanceFailure keeps its user messages intact.

Restored behaviors:

- Anthropic OAuth / AUTH_TOKEN via the fork-standard header set.
- Anthropic managed env config (~/.superset/chat-anthropic-env.json) with both ANTHROPIC_API_KEY (api-key path) and ANTHROPIC_AUTH_TOKEN (OAuth path).
- OpenAI Codex OAuth (rewrites to the Codex backend, refreshes the access token via mastracode authStorage).
- OpenAI stock / openai-codex API-key slot.
- Provider fallback: an Anthropic failure now tries OpenAI.
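The restored fallback loop can be sketched as follows. This is a minimal illustration, not the fork's actual code: `SmallModelCandidate` and `SmallModelAttempt` are named after the PR description, but their shapes here are assumptions, and the real shim additionally wires in provider-specific model construction.

```typescript
// Illustrative sketch of the candidate-iteration shim: try each candidate
// in priority order, record a per-attempt outcome, and fall through to the
// next provider on failure. Type shapes are hypothetical.

type SmallModelCandidate = {
  provider: "anthropic" | "openai";
  source: string; // e.g. "env", "auth-storage", "managed-env-config"
  createModel: () => Promise<unknown>;
};

type SmallModelAttempt = {
  provider: string;
  source: string;
  outcome: "succeeded" | "failed" | "empty-result";
  error?: unknown;
};

async function callSmallModel(
  candidates: SmallModelCandidate[],
  invoke: (model: unknown) => Promise<string | null>,
): Promise<{ result: string | null; attempts: SmallModelAttempt[] }> {
  const attempts: SmallModelAttempt[] = [];
  for (const candidate of candidates) {
    const { provider, source } = candidate;
    try {
      const model = await candidate.createModel();
      const result = await invoke(model);
      if (result === null) {
        // Model responded but produced nothing usable: try the next candidate.
        attempts.push({ provider, source, outcome: "empty-result" });
        continue;
      }
      attempts.push({ provider, source, outcome: "succeeded" });
      return { result, attempts };
    } catch (error) {
      // createModel or invoke failed (e.g. a 401): fall through.
      attempts.push({ provider, source, outcome: "failed", error });
    }
  }
  return { result: null, attempts };
}
```

Surfacing the full `attempts` array is what lets `describeEnhanceFailure` keep producing accurate user-facing messages instead of a generic "not connected" error.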
…egression" This reverts commit 83eca3d.
Summary
Hotfix for three regressions introduced by PR #313 (mastracode 0.14 + small-model refactor, merged 2026-04-18) that broke fork-specific auth paths. Approved ("Yes") after three rounds of Codex review.
Regressions fixed (bugs introduced during the merge)
1. Anthropic OAuth / AUTH_TOKEN mis-routed to `apiKey` (High)

PR #313's get-small-model.ts sent every credential, including `ANTHROPIC_AUTH_TOKEN`, through `createAnthropic({ apiKey })`. The fork's OAuth / gateway-proxy setups require `authToken` + headers (`anthropic-beta: "claude-code-20250219,oauth-2025-04-20"`, `user-agent: "claude-cli/..."`, `x-app: "cli"`), so requests returned 401.

Fix: route through `getAnthropicProviderOptions(credentials)` so that `kind: "oauth"` takes the OAuth header path. `AUTH_TOKEN` is packed into `ClaudeCredentials` as `kind: "oauth"` before being passed along.

2. Provider fallback lost (Medium)
The old `callSmallModel` fell back to OpenAI when Anthropic failed, but the PR #313 shim collapsed to a single attempt.

Fix: restore the shim to iterate `getSmallModelCandidates()`, attempting createModel + invoke for each candidate and falling through to the next on failure. `getSmallModel()` now also iterates all candidates.

3. OAuth-only users wrongly shown "Account not connected" (High)
When auth.json contains only OAuth entries, there is no `apikey:<provider>` slot, so the old shim returned two `missing-credentials` attempts. Fix: extend the credential resolver to also enqueue OAuth credentials as candidates.
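The credential routing restored for regression 1 can be sketched like this. `getAnthropicProviderOptions` and `ClaudeCredentials` are real fork names per the PR text, but the type shapes, the `credentialsFromEnv` helper, and the return shape below are illustrative assumptions; the `user-agent` value is elided in the PR and kept as a placeholder.

```typescript
// Illustrative sketch: OAuth / AUTH_TOKEN credentials must take the
// authToken + header path; only plain API keys may use { apiKey }.
// Shapes are hypothetical, modeled on the PR description.

type ClaudeCredentials =
  | { kind: "api-key"; apiKey: string }
  | { kind: "oauth"; accessToken: string };

type AnthropicProviderOptions =
  | { apiKey: string }
  | { authToken: string; headers: Record<string, string> };

function getAnthropicProviderOptions(credentials: ClaudeCredentials): AnthropicProviderOptions {
  if (credentials.kind === "oauth") {
    // OAuth / AUTH_TOKEN path: fork-standard header set.
    return {
      authToken: credentials.accessToken,
      headers: {
        "anthropic-beta": "claude-code-20250219,oauth-2025-04-20",
        "user-agent": "claude-cli/...", // placeholder; elided in the PR text
        "x-app": "cli",
      },
    };
  }
  // Plain API key: createAnthropic({ apiKey }) is correct only here.
  return { apiKey: credentials.apiKey };
}

// ANTHROPIC_AUTH_TOKEN is wrapped as kind: "oauth" so it takes the header
// path instead of being mis-routed through apiKey (the regression).
function credentialsFromEnv(env: Record<string, string | undefined>): ClaudeCredentials | null {
  if (env.ANTHROPIC_API_KEY) return { kind: "api-key", apiKey: env.ANTHROPIC_API_KEY };
  if (env.ANTHROPIC_AUTH_TOKEN) return { kind: "oauth", accessToken: env.ANTHROPIC_AUTH_TOKEN };
  return null;
}
```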
Implementation overview
- Rebuilt `packages/chat/src/server/shared/small-model/get-small-model.ts`
  - New `getSmallModelCandidates()`: returns Anthropic/OpenAI candidates in priority order: env `ANTHROPIC_API_KEY` → `~/.claude*` config / keychain (fork sync helpers) → auth-storage `anthropic` slot (sync, no refresh) → managed env config `~/.superset/chat-anthropic-env.json` (parsed with the fork's canonical `parseAnthropicEnvText`; accepts `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN`) → env `OPENAI_API_KEY` → `getOpenAICredentialsFromAnySource` (both of the fork's `openai` / `openai-codex` slots)
  - Credentials are converted to the correct AI-SDK options via `getAnthropicProviderOptions`
  - `createOpenAICodexOAuthModel` (token refresh via mastracode authStorage + Codex backend endpoint rewrite; same implementation as the old small-model.ts)
  - `getSmallModel()` iterates all candidates; if `createModel()` throws, it moves on to the next
- Reimplemented the `apps/desktop/src/lib/ai/call-small-model.ts` shim
  - Iterates `getSmallModelCandidates()`
  - Attempts `createModel + invoke`; on failure records a `SmallModelAttempt` and moves to the next candidate
  - Records `empty-result` / `failed` / `succeeded` as appropriate
  - Returns the two `missing-credentials` attempts only when no candidates exist
Verification
Codex pre-review
Final "Yes" after three rounds:
- Added iteration to `getSmallModel()`; expired candidates are skipped
Pre-existing separate issue (out of scope for this hotfix)
Direct callers of `getSmallModel()` (runtime.ts etc.) still do not fall through on a 401 at invoke time. Filtering at candidate-construction time mitigates most of this, but the case of an invalid token whose expired flag is not set remains unhandled. This limitation predates PR "upstream merge: mastracode 0.14 + small-model refactor (#3517)" #313 and is not made worse by this hotfix.
Test checklist
- Works via proxy with `ANTHROPIC_AUTH_TOKEN`
- Works with `ANTHROPIC_API_KEY`
- Works with OpenAI Codex OAuth (`openai-codex` slot)
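As a cross-check of the candidate priority described in the implementation overview, here is a minimal sketch. The resolver names are stand-ins for the fork helpers (env reads, keychain sync, auth-storage, managed env config); only the ordering itself is taken from the PR text.

```typescript
// Illustrative sketch of the fixed priority list: a candidate source is
// included only if its resolver actually yields credentials, and order
// is preserved so Anthropic sources always precede OpenAI ones.

type CandidateSource =
  | "anthropic-env"
  | "anthropic-keychain-config"
  | "anthropic-auth-storage"
  | "anthropic-managed-env-config"
  | "openai-env"
  | "openai-auth-storage";

type Resolver = { source: CandidateSource; resolve: () => string | null };

// Returns the sources that yield credentials, in priority order.
function resolveCandidateSources(resolvers: Resolver[]): CandidateSource[] {
  return resolvers.filter((r) => r.resolve() !== null).map((r) => r.source);
}

// The priority order from the PR; a real implementation would wire each
// entry to the corresponding fork helper (hypothetical here).
const PRIORITY: CandidateSource[] = [
  "anthropic-env",
  "anthropic-keychain-config",
  "anthropic-auth-storage",
  "anthropic-managed-env-config",
  "openai-env",
  "openai-auth-storage",
];
```

Because the list is filtered rather than truncated at the first hit, the same structure serves both `getSmallModel()` (take the first viable candidate) and the shim (iterate them all with fallback).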