Add GLM 5 #5805

Merged
kevinvandijk merged 2 commits into Kilo-Org:main from Neonsy:feat/add-glm-5
Feb 11, 2026
Conversation


Neonsy (Contributor) commented Feb 11, 2026

Context

This PR updates Z.ai support to make glm-5 the default model and keep model selection consistent across all Z.ai API lines (international_coding, international_api, china_coding, china_api) in the extension/webview flow.

Implementation

  • Switched Z.ai default model IDs from glm-4.7 to glm-5 in the Z.ai provider model catalogs.
  • Fixed mainland line handling so both china_coding and china_api resolve to the mainland model catalog in:
    • provider model resolution
    • webview model list hooks
    • selected-model resolution
    • settings default-model reset behavior
  • Added/updated tests for:
    • GLM-5 selection on coding and non-coding Z.ai lines
    • china_api mapping to mainland defaults/model sets
    • settings behavior when provider/line changes
  • Added a dedicated changeset for this work.
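
As a rough sketch of the catalog mapping described above (the type, catalog, and function names here are illustrative assumptions, not the extension's actual code):

```typescript
// Illustrative sketch only: the real provider code differs, and these
// catalog/type names are assumptions for the sake of the example.
type ZaiApiLine = "international_coding" | "international_api" | "china_coding" | "china_api"

// Minimal stand-ins for the international and mainland model catalogs.
const internationalModels = { "glm-5": {}, "glm-4.7": {} }
const mainlandModels = { "glm-5": {}, "glm-4.7": {} }

// Default switched from "glm-4.7" to "glm-5".
const zaiDefaultModelId = "glm-5"

// Both china_coding and china_api must resolve to the mainland catalog;
// the fix in this PR makes all four lines resolve consistently.
function resolveModelCatalog(line: ZaiApiLine): Record<string, object> {
	return line.startsWith("china") ? mainlandModels : internationalModels
}
```

Under this sketch, the same line predicate would back provider model resolution, the webview model-list hooks, and the settings default-model reset.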

Where the values come from

  • Primary source: official Z.ai GLM-5 docs (https://docs.z.ai/guides/llm/glm-5).
  • Validation source: live /models and /chat/completions probes against Z.ai endpoints.
  • OpenRouter model metadata was used only as a cross-check, not as the canonical source for Z.ai endpoint behavior.

How to Test

  • Run provider tests:
    • cd src && pnpm exec vitest run api/providers/__tests__/zai.spec.ts
  • Run webview tests:
    • cd webview-ui && pnpm exec vitest run src/components/kilocode/hooks/__tests__/getModelsByProvider.spec.ts
    • cd webview-ui && pnpm exec vitest run src/components/settings/__tests__/ApiOptions.spec.tsx
  • Manual settings check:
    • Open settings, choose provider zai.
    • Set zaiApiLine to china_api.
    • Switch providers away/back if needed.
    • Confirm default model resolves from mainland Z.ai defaults and is now glm-5.
  • Optional live sanity checks:
    • Verify glm-5 appears on Z.ai /models responses for each supported line.
    • Verify glm-5 chat completion succeeds on coding lines (non-coding line behavior may depend on account/package state).
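
The optional /models sanity check could be scripted along these lines (the OpenAI-style `{ data: [{ id }] }` response shape and the fetch usage are assumptions, not confirmed Z.ai endpoint behavior):

```typescript
// Assumed OpenAI-compatible /models response shape.
interface ModelsResponse {
	data: Array<{ id: string }>
}

// Returns true if glm-5 is listed in a parsed /models response.
function listsGlm5(resp: ModelsResponse): boolean {
	return resp.data.some((model) => model.id === "glm-5")
}

// Hypothetical live usage (base URL and auth header are assumptions):
// const res = await fetch(`${baseUrl}/models`, {
// 	headers: { Authorization: `Bearer ${apiKey}` },
// })
// console.log(listsGlm5(await res.json()))
```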

Note

I have not tested the provider end-to-end myself.

changeset-bot bot commented Feb 11, 2026

🦋 Changeset detected

Latest commit: 117481d

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  • kilo-code (patch)


Neonsy changed the title from "Initial assumptions based on probing" to "Add GLM 5" Feb 11, 2026
Neonsy marked this pull request as ready for review February 11, 2026 18:37

Neonsy commented Feb 11, 2026

@mikij @bernaferrari @kevinvandijk


Neonsy commented Feb 11, 2026

Why are the workflows waiting for approval 👀


bernaferrari commented Feb 11, 2026

Works well, I tested it here. The only possible problem is that GLM 5 is still not available for non-max plans, but it will be next week. So I guess we can leave it as the default; people with an API key know what they are doing.

kevinvandijk (Collaborator) left a comment

Thanks @Neonsy! I made a slight update to the changeset note, but everything else looks good!

kevinvandijk merged commit 95c633e into Kilo-Org:main Feb 11, 2026
12 checks passed
@github-actions github-actions bot mentioned this pull request Feb 11, 2026
@Neonsy Neonsy deleted the feat/add-glm-5 branch February 11, 2026 22:31


3 participants