
Cherry pick Kimi AI SDK changes from upstream #5662

Merged
kevinvandijk merged 4 commits into main from cherry-pick-kimi-sdk
Feb 5, 2026
Conversation

@kevinvandijk
Collaborator

No description provided.

@changeset-bot

changeset-bot bot commented Feb 4, 2026

🦋 Changeset detected

Latest commit: 228745b

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package
Name Type
kilo-code Patch


input: unknown
}> = []

for (const part of message.content) {

WARNING: Assistant content ordering is not preserved when converting tool calls

In convertToAiSdkMessages(), assistant text parts are accumulated separately and then emitted before all tool-call parts. This can change semantics vs. the original Anthropic message where text/tool_use blocks may be interleaved (e.g., text → tool call → more text).

Consider building the AI SDK content array in a single pass over message.content to preserve the original ordering.
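A single-pass conversion might look like the sketch below. The part and message shapes here are hypothetical stand-ins for the Anthropic/AI SDK types, not the project's actual definitions:

```typescript
// Minimal stand-ins for the Anthropic and AI SDK part shapes (illustrative only)
type AnthropicPart =
	| { type: "text"; text: string }
	| { type: "tool_use"; id: string; name: string; input: unknown }

type AiSdkPart =
	| { type: "text"; text: string }
	| { type: "tool-call"; toolCallId: string; toolName: string; input: unknown }

// Walk the parts once, emitting each in place, so text → tool call → text
// keeps its original interleaving instead of grouping all text first
function convertAssistantContent(parts: AnthropicPart[]): AiSdkPart[] {
	const content: AiSdkPart[] = []
	for (const part of parts) {
		if (part.type === "text") {
			content.push({ type: "text", text: part.text })
		} else {
			content.push({
				type: "tool-call",
				toolCallId: part.id,
				toolName: part.name,
				input: part.input,
			})
		}
	}
	return content
}
```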

model: languageModel,
prompt,
maxOutputTokens: this.getMaxOutputTokens(),
temperature: this.config.temperature ?? 0,

SUGGESTION: completePrompt() ignores model-derived temperature

createMessage() uses model.temperature ?? this.config.temperature ?? 0, but completePrompt() uses only this.config.temperature ?? 0. If getModel() supplies a non-zero default temperature, completePrompt() will behave differently than streaming.

Consider using the same precedence as createMessage() (model-derived temperature first) for consistency.
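A small sketch of the shared precedence, with hypothetical config shapes (the actual handler fields may differ):

```typescript
// Illustrative shape only; mirrors the fields named in the comment above
interface TemperatureSource {
	temperature?: number
}

// Same precedence for both createMessage() and completePrompt():
// model-derived temperature first, then user config, then 0
function resolveTemperature(model: TemperatureSource, config: TemperatureSource): number {
	return model.temperature ?? config.temperature ?? 0
}
```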

@kevinvandijk kevinvandijk marked this pull request as draft February 4, 2026 20:16
@kiloconnect
Contributor

kiloconnect bot commented Feb 4, 2026

Code Review Summary

Status: 8 Issues Found | Recommendation: Address before merge

Overview

Severity Count
CRITICAL 0
WARNING 4
SUGGESTION 4


Issue Details

WARNING

File Line Issue
src/api/transform/ai-sdk.ts 121 Assistant content ordering is not preserved when converting tool calls
src/api/transform/ai-sdk.ts 99 User content ordering can change when tool results are emitted
src/api/providers/moonshot.ts 1 Unused ModelInfo type import may fail lint
package.json 86 Mixed Zod majors via pnpm overrides can cause subtle runtime/type mismatches

SUGGESTION

File Line Issue
src/api/providers/openai-compatible.ts 207 completePrompt() ignores model-derived temperature
src/api/providers/openai-compatible.ts 7 Make the openai import type-only
src/api/transform/ai-sdk.ts 7 Make the openai import type-only
src/api/providers/openai-compatible.ts 39 useMaxTokens config is currently unused/misleading
Files Reviewed (12 files)

@kevinvandijk kevinvandijk marked this pull request as ready for review February 4, 2026 22:56
@@ -1,23 +1,29 @@
import OpenAI from "openai"
import { moonshotModels, moonshotDefaultModelId, type ModelInfo } from "@roo-code/types"

WARNING: Unused ModelInfo type import may fail lint

ModelInfo is imported but never referenced in this file. If ESLint/TS settings enforce no-unused-vars/imports, this can break CI.

Suggested change
import { moonshotModels, moonshotDefaultModelId, type ModelInfo } from "@roo-code/types"
import { moonshotModels, moonshotDefaultModelId } from "@roo-code/types"

*/

import { Anthropic } from "@anthropic-ai/sdk"
import OpenAI from "openai"

SUGGESTION: Make the openai import type-only

OpenAI appears to be used only for the tool_choice type in OpenAICompatibleHandler.mapToolChoice(). Importing it as a value can add an unnecessary runtime dependency in the bundled output.

Suggested change
import OpenAI from "openai"
import type OpenAI from "openai"

*/

import { Anthropic } from "@anthropic-ai/sdk"
import OpenAI from "openai"

SUGGESTION: Make the openai import type-only

OpenAI is only referenced in type positions (tool definitions). Using a type-only import avoids an unnecessary runtime require/bundle edge case.

Suggested change
import OpenAI from "openai"
import type OpenAI from "openai"

// UserContent only supports: string | Array<TextPart | ImagePart | FilePart>
// ToolContent (for role: "tool") supports: Array<ToolResultPart | ToolApprovalResponse>
if (toolResults.length > 0) {
modelMessages.push({

WARNING: User content ordering can change when tool results are emitted

In convertToAiSdkMessages(), user tool_result parts are collected separately and then emitted as a role: "tool" message before the role: "user" text/image parts. If the original message.content interleaves text/images with tool results, this reorders content and can change semantics.

Consider building output messages in a single pass that preserves the original order (or explicitly document/guarantee the expected ordering invariants from upstream).
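One way to sketch an order-preserving single pass: since the AI SDK requires tool results under role "tool", start a new output message whenever the role changes, instead of batching all tool results up front. Types below are hypothetical minimal stand-ins, not the project's real definitions:

```typescript
// Illustrative minimal shapes
type UserPart =
	| { type: "text"; text: string }
	| { type: "tool_result"; toolCallId: string; output: unknown }

type OutMessage =
	| { role: "user"; content: { type: "text"; text: string }[] }
	| { role: "tool"; content: { type: "tool-result"; toolCallId: string; output: unknown }[] }

// Single pass: a new message is opened only when the role flips,
// so interleaved text / tool_result blocks keep their original order
function convertUserContent(parts: UserPart[]): OutMessage[] {
	const messages: OutMessage[] = []
	for (const part of parts) {
		const role = part.type === "tool_result" ? "tool" : "user"
		let last = messages[messages.length - 1]
		if (!last || last.role !== role) {
			if (role === "tool") {
				last = { role: "tool", content: [] }
			} else {
				last = { role: "user", content: [] }
			}
			messages.push(last)
		}
		if (part.type === "text" && last.role === "user") {
			last.content.push({ type: "text", text: part.text })
		} else if (part.type === "tool_result" && last.role === "tool") {
			last.content.push({ type: "tool-result", toolCallId: part.toolCallId, output: part.output })
		}
	}
	return messages
}
```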

/** Optional custom headers */
headers?: Record<string, string>
/** Whether to include max_tokens in requests (default: false uses max_completion_tokens) */
useMaxTokens?: boolean

SUGGESTION: useMaxTokens config is currently unused/misleading

OpenAICompatibleConfig.useMaxTokens is documented as controlling max_tokens vs max_completion_tokens, but the handler always sends AI SDK maxOutputTokens and never reads useMaxTokens.

Either implement this flag (if needed for provider quirks) or remove/rename it to avoid confusion.
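If the flag is kept, honoring it could be as simple as the sketch below (hypothetical helper; the real handler builds its request differently):

```typescript
// Illustrative only; field names follow the doc comment on useMaxTokens
interface TokenConfig {
	useMaxTokens?: boolean
}

// Pick the token-limit parameter name based on the flag,
// defaulting to max_completion_tokens as the doc comment states
function tokenLimitParam(config: TokenConfig, limit: number): Record<string, number> {
	return config.useMaxTokens ? { max_tokens: limit } : { max_completion_tokens: limit }
}
```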

"@types/react": "^18.3.23",
"@types/react-dom": "^18.3.5"
"@types/react-dom": "^18.3.5",
"zod": "3.25.76",

WARNING: Mixed Zod majors via pnpm overrides can cause subtle runtime/type mismatches

This pins zod to 3.25.76 globally while also forcing @sap-ai-sdk/prompt-registry to use zod@^4. That will install two Zod majors in the repo, which can lead to issues like failed instanceof ZodError checks and schema/value incompatibilities across package boundaries.

If possible, consider aligning on a single Zod major (or ensure any cross-boundary error handling avoids instanceof/direct class identity checks).
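The instanceof hazard can be shown without installing two Zod majors: two distinct class objects with the same name behave exactly like two copies of ZodError loaded from different package instances.

```typescript
// Simulate two copies of "ZodError" coming from two installed Zod majors.
// These are stand-in classes, not the real Zod exports.
class ZodErrorV3 extends Error {
	name = "ZodError"
}
class ZodErrorV4 extends Error {
	name = "ZodError"
}

const err: unknown = new ZodErrorV4("invalid")

// instanceof checks class identity, so it fails against the "other" copy
const crossCopyMatch = err instanceof ZodErrorV3 // false

// A duck-typed check survives package boundaries
const duckTypedMatch = err instanceof Error && err.name === "ZodError" // true
```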

@kevinvandijk kevinvandijk merged commit 58dcd61 into main Feb 5, 2026
12 checks passed
@kevinvandijk kevinvandijk deleted the cherry-pick-kimi-sdk branch February 5, 2026 12:20
@github-actions github-actions bot mentioned this pull request Feb 5, 2026
@kevinvandijk kevinvandijk mentioned this pull request Feb 5, 2026
@smetanokr

Not working. I can help by sharing a Moonshot Kimi Code API key for debugging, @kevinvandijk.

@CoinAnole
Contributor

Confirming: using Moonshot as the API provider with their Kimi Code Plan and this fixes error 400 but completely breaks tool calls.

Extension version: 5.4.0
Provider: moonshot
Model: kimi-k2.5

MODEL_NO_TOOLS_USED

[ERROR] You did not use a tool in your previous response!...

@stingh711

Still doesn't work for kimi-for-coding:
Extension version: 5.4.0
Provider: moonshot
Model: kimi-for-coding
Entrypoint: api.kimi.com/coding/v1

[screenshot attached]

@smetanokr

Created a new issue for this, #5719, as this one seems to be closed.
