@processtrader
Contributor

Summary

Fix token extraction for the ZAI/Anthropic API, improve auto-compaction timing for idle sessions, and add a dedicated system prompt for the GLM-4.6 model.

Changes

Token Extraction Fix (session/index.ts)

  • Problem: ZAI (Anthropic endpoint) returns token counts in metadata.anthropic.usage instead of the top-level usage object, causing the sidebar to show 0 tokens.
  • Solution: Extract tokens from metadata.anthropic.usage when the top-level values are missing or zero (see the sketch after this list).
  • Supports both camelCase (inputTokens) and snake_case (input_tokens) field names.
  • Also supports the Bedrock metadata format for consistency.
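
A minimal sketch of the fallback logic, assuming illustrative type and field names rather than the actual opencode types:

```ts
interface UsageFields {
  inputTokens?: number
  input_tokens?: number
  outputTokens?: number
  output_tokens?: number
}

interface ResultMetadata {
  anthropic?: { usage?: UsageFields }
  bedrock?: { usage?: UsageFields }
}

function extractTokens(usage: UsageFields | undefined, metadata?: ResultMetadata) {
  // Read a usage object, accepting camelCase or snake_case field names.
  const read = (u?: UsageFields) => ({
    input: u?.inputTokens ?? u?.input_tokens ?? 0,
    output: u?.outputTokens ?? u?.output_tokens ?? 0,
  })

  // Prefer the top-level usage object when it carries real counts.
  const top = read(usage)
  if (top.input > 0 || top.output > 0) return top

  // Fall back to ZAI's Anthropic-style metadata, then Bedrock's.
  const anthropic = read(metadata?.anthropic?.usage)
  if (anthropic.input > 0 || anthropic.output > 0) return anthropic
  return read(metadata?.bedrock?.usage)
}
```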

Compaction Timing Fix (session/prompt.ts)

  • Problem: The auto-compaction check ran after the exit condition in the main loop, so idle sessions at high context usage would exit before compaction could trigger.
  • Solution: Move the compaction check before the exit condition (see the sketch after this list).
  • Added a hasPendingCompaction guard to prevent infinite loops when a compaction task is already queued.
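
A minimal sketch of the reordered loop; Session, contextUsage, queueCompaction, isIdle, and processNextStep are hypothetical stand-ins for the prompt-loop internals:

```ts
interface Session {} // opaque handle; fields omitted

declare function contextUsage(s: Session): number // fraction of context window used
declare function queueCompaction(s: Session): Promise<void>
declare function isIdle(s: Session): boolean
declare function processNextStep(s: Session): Promise<void>

const COMPACTION_THRESHOLD = 0.84 // trigger point observed in testing

let hasPendingCompaction = false

async function promptLoop(session: Session) {
  while (true) {
    // Compaction check now runs BEFORE the exit condition, so an idle
    // session sitting at high context usage still gets compacted.
    if (contextUsage(session) >= COMPACTION_THRESHOLD && !hasPendingCompaction) {
      hasPendingCompaction = true // guard: don't queue a second compaction
      await queueCompaction(session)
      hasPendingCompaction = false
    }

    // Previously this check ran first, and idle sessions exited here
    // before compaction ever triggered.
    if (isIdle(session)) break

    await processNextStep(session)
  }
}
```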

GLM System Prompt (session/prompt/glm.txt)

Added a dedicated system prompt optimized for GLM-4.6's capabilities:

  • Why? GLM-4.6 has distinct strengths (200K context, advanced reasoning, 30% better token efficiency, stronger agentic performance) that benefit from tailored instructions.
  • Concise structure - A leaner prompt (~57 lines) leverages GLM's token efficiency, leaving more context for actual work.
  • Reasoning-first approach - Adds a "Reasoning Approach" section to encourage systematic problem breakdown.
  • Tool-agnostic language - Avoids explicit tool names to prevent agents (like Plan) from attempting unauthorized operations.
  • Consistent style - Follows a structure similar to anthropic.txt for maintainability.

Testing

  • Verified token counts display correctly in the sidebar with the ZAI endpoint
  • Confirmed auto-compaction triggers at ~84% context usage
  • Tested that compaction works on both active and idle sessions

@processtrader force-pushed the zai-coding-plan-glm-anthropic branch from 841cdbf to 2f05145 on November 27, 2025 at 20:56
IgorWarzocha added a commit to IgorWarzocha/opencode that referenced this pull request on Dec 31, 2025:
Implements comprehensive GLM (Zhipu AI) model integration:

Core Features:
- Add GLM model detection and routing in session system
- Extract token counts from ZAI's Anthropic-compatible metadata structure
- Add ZAI provider to auth menu (zai-coding-plan, GLM-4.7)
- Create comprehensive GLM system prompt with strict engineering constraints

Technical Changes:
- packages/opencode/src/session/index.ts: Extract tokens from metadata when top-level usage is 0
- packages/opencode/src/session/system.ts: Route GLM models to the specialized system prompt (sketched below)
- packages/opencode/src/session/prompt/glm.txt: New rigorous system prompt with XML-structured constraints
- packages/opencode/src/provider/provider.ts: Add zai-coding-plan loader
- packages/opencode/src/provider/models.ts: Inject default ZAI provider definition
- packages/opencode/src/cli/cmd/auth.ts: Add ZAI to provider priority list (1.5)

Refs: Adapted from upstream PR anomalyco#4710 with enhanced system prompt engineering
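
A minimal sketch of the routing described in the commit above; loading the prompts with readFileSync and matching on the model-id prefix are assumptions, not the exact logic in session/system.ts:

```ts
import fs from "node:fs"

// Prompt files named in the commit message above.
const PROMPT_GLM = fs.readFileSync("packages/opencode/src/session/prompt/glm.txt", "utf8")
const PROMPT_ANTHROPIC = fs.readFileSync("packages/opencode/src/session/prompt/anthropic.txt", "utf8")

function systemPrompt(modelID: string): string {
  // Route GLM models (e.g. "glm-4.6") to the dedicated prompt; everything
  // else keeps the Anthropic-style default.
  if (/\bglm-/i.test(modelID)) return PROMPT_GLM
  return PROMPT_ANTHROPIC
}
```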
@thdxr force-pushed the dev branch 3 times, most recently from f1ae801 to 08fa7f7 on January 30, 2026 at 14:37