feat(provider): add 1M context window support for Anthropic models #14375

Open

DusKing1 wants to merge 1 commit into anomalyco:dev from DusKing1:feat/1m-context-support

Conversation

@DusKing1 DusKing1 commented Feb 20, 2026

Issue for this PR

Closes #12323

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

Enables the 1M token context window for Anthropic models that support it (Opus 4.6, Sonnet 4.6, Sonnet 4.5, Sonnet 4.0).

Three changes:

  1. Beta header (provider.ts): Appends context-1m-2025-08-07 to the anthropic-beta header. This is the only thing the Anthropic API requires to unlock 1M context.

  2. Context limit override (provider.ts): models.dev currently reports context: 200000 for these models, which causes compaction to trigger at ~168K tokens. The code now detects 1M-capable models by API ID pattern and overrides limit.context to 1_000_000. Guarded by < 1_000_000 so it becomes a no-op once models.dev updates.

  3. Compaction fix (compaction.ts): When model.limit.input is set, the overflow check now counts only input tokens instead of total tokens. Output/thinking tokens don't consume the input context window.
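For illustration, the beta-header logic in change 1 might look roughly like the sketch below. The function name, the model-ID regex, and the header-merging details are assumptions for this example, not the PR's actual code; only the `context-1m-2025-08-07` flag and the `anthropic-beta` header name come from the description above.

```typescript
// Hypothetical sketch: append the 1M-context beta flag for capable models.
// The model-ID regex and helper name are assumed for illustration.
const CONTEXT_1M_BETA = "context-1m-2025-08-07";
const SUPPORTS_1M = /claude-(opus-4-6|sonnet-4-[056])/;

function withAnthropicBeta(
  modelID: string,
  headers: Record<string, string> = {},
): Record<string, string> {
  if (!SUPPORTS_1M.test(modelID)) return headers;
  const prior = headers["anthropic-beta"];
  return {
    ...headers,
    // Append rather than overwrite so other beta flags are preserved.
    "anthropic-beta": prior ? `${prior},${CONTEXT_1M_BETA}` : CONTEXT_1M_BETA,
  };
}
```

Appending to any existing `anthropic-beta` value (rather than replacing it) matters if the provider already sends other beta flags.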

This is a minimal alternative to #12342 — same 1M result without the AI SDK v5-v6 upgrade.
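A minimal sketch of the context-limit override in change 2 (the interface and function names are hypothetical; the guard mirrors the `< 1_000_000` check the description mentions):

```typescript
// Hypothetical sketch of the context-limit override; the real logic lives
// in provider.ts. Types and names here are illustrative only.
interface ModelLimit {
  context: number;
  output: number;
}

function override1MContext(limit: ModelLimit, supports1M: boolean): ModelLimit {
  // Guarded so this becomes a no-op once models.dev itself reports >= 1M.
  if (supports1M && limit.context < 1_000_000) {
    return { ...limit, context: 1_000_000 };
  }
  return limit;
}
```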

How did you verify your code works?

  • All 24 compaction tests pass (the previously-documented bug test now asserts the corrected behavior)
  • LSP diagnostics clean on both changed files
  • CI typecheck and unit tests pass

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

@github-actions github-actions bot added the needs:compliance This means the issue will auto-close after 2 hours. label Feb 20, 2026
@github-actions
Contributor

The following comment was made by an LLM; it may be inaccurate:

Based on my search, I found one related PR:

PR #12342: "feat(provider): add adaptive thinking and 1M context support for Claude Opus 4.6"
#12342

This PR is related because it addresses the same feature (1M context window support for Anthropic models, specifically Claude Opus 4.6). According to the current PR description, PR #14375 is "a minimal alternative to #12342 — no AI SDK version upgrade required," indicating these PRs take different approaches to solving the same problem.

@github-actions github-actions bot removed the needs:compliance This means the issue will auto-close after 2 hours. label Feb 20, 2026
@github-actions
Contributor

Thanks for updating your PR! It now meets our contributing guidelines. 👍

@sergical
Contributor

+1 on this one, a bit annoying to have opencode keep hitting the 200k limit while thinking it's at 20% of the context rn 😓

@jasgeo75

any plans to merge this?

Add context-1m-2025-08-07 beta header for Anthropic provider, enabling
1M token context window for Claude Opus 4.6, Sonnet 4.6, Sonnet 4.5,
and Sonnet 4.0.

Override context limit from 200K to 1M for supported models so compaction
triggers at the correct threshold instead of prematurely at 200K.

Fix compaction token counting: when model.limit.input is set, only count
input tokens (not output/thinking) against the input limit. Output tokens
do not consume the input context window.
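The input-only counting described in the commit message could be sketched as follows. The token-field names, the `shouldCompact` helper, and the 0.9 threshold are assumptions for illustration; the PR's actual compaction.ts may differ.

```typescript
// Hypothetical sketch of the overflow check in compaction.ts.
interface Usage {
  input: number;
  output: number;
  reasoning: number;
}

function shouldCompact(
  usage: Usage,
  limit: { context: number; input?: number },
  threshold = 0.9, // assumed safety margin before the hard limit
): boolean {
  if (limit.input !== undefined) {
    // Only input-side tokens consume the input context window;
    // output/thinking tokens are excluded from the check.
    return usage.input > limit.input * threshold;
  }
  // Fallback: no separate input limit, so count everything against context.
  const total = usage.input + usage.output + usage.reasoning;
  return total > limit.context * threshold;
}
```

With total-token counting, a session with 800K input plus 200K output/thinking tokens would trigger compaction even though the input window still has headroom; the input-only check avoids that.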
@DusKing1 DusKing1 force-pushed the feat/1m-context-support branch from 5103e1e to d471249 Compare March 2, 2026 15:57
vbuccigrossi pushed a commit to vbuccigrossi/opencode that referenced this pull request Mar 7, 2026
Tier 1 bug fixes:
- Fix O(n²) bash output concatenation with StreamingOutput class (anomalyco#9693)
- Fix memory leaks in Bus.once, Format, Plugin, ShareNext, Bootstrap (anomalyco#13514)
- Fix FileTime race condition using actual file mtime instead of JS clock
- Free memory on compaction prune: clear output/attachments/metadata (anomalyco#7049)
- Throttle reasoning-delta storage writes to 50ms intervals (anomalyco#11328)
- Handle SIGHUP/SIGTERM to prevent orphaned processes (anomalyco#12718)
- Add process.once("close") handler for bash tool reliability

Tier 2 features:
- Support 1M context window for Anthropic models via beta header (anomalyco#14375)
- Input-only token counting for compaction with limit.input models
- MCP lazy loading: on-demand tool discovery via mcp_search tool (anomalyco#8771)
- MCP servers listed in system prompt when lazy mode enabled
- StreamingOutput: output_filter regex for build diagnostics
- LSP server cleanup callback for temp directory removal
- Extract formatSize utility from uninstall to shared util/format
- GitHub CI: fix Bus subscription leak in session event handler

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

[FEATURE]: Add support of Claude Opus 4.6

3 participants