feat(js/plugins/compat-oai): Add reasoning_content and json_schema support to OpenAI Compatible API plugin #3679
Conversation
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up to date status, view the checks section at the bottom of the pull request.
Hi @pavelgj, just wanted to kindly check if you’ve had a chance to look at this PR.
@ssbushi could you please take a look?
This commit adds comprehensive support for handling reasoning_content in OpenAI Compatible API Chat Completions responses, covering both streaming and non-streaming scenarios. The implementation ensures that models supporting reasoning_content can properly return their reasoning traces, with optimized content assembly and enhanced response_format handling.

**CHANGELOG:**

- [x] Add reasoning_content handling in fromOpenAIChoice and fromOpenAIChunkChoice
- [x] Implement direct content assembly by pushing reasoning and content parts to the content array
- [x] Update response_format handling in toOpenAIRequestBody to support schema-based responses
- [x] Add comprehensive test cases for reasoning_content in both streaming and non-streaming functions
- [x] Add a test for the json_schema response_format in toOpenAIRequestBody

**Key features:**

- Enables proper reasoning trace returns from supporting models
- Enhanced compatibility with schema-based response formats
- Complete test coverage for the new reasoning_content functionality
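As a rough illustration of the content-assembly step described above (the names and shapes here are simplified, not the plugin's actual code):

```typescript
// Hypothetical sketch of the content assembly this PR describes: push a
// reasoning part (if present) and then the text part onto the content array.
// `partsFromChoice` and the `Part` shape are illustrative, not the plugin's
// actual exports.
interface Part {
  reasoning?: string;
  text?: string;
}

interface ChoiceMessage {
  reasoning_content?: string;
  content?: string;
}

function partsFromChoice(message: ChoiceMessage): Part[] {
  const parts: Part[] = [];
  if (message.reasoning_content) {
    // Reasoning trace returned by reasoning models on
    // OpenAI-compatible APIs (e.g. DeepSeek).
    parts.push({ reasoning: message.reasoning_content });
  }
  if (message.content) {
    parts.push({ text: message.content });
  }
  return parts;
}
```

The same pattern applies to streaming: each chunk choice's delta is mapped to reasoning and text parts in the same order, so accumulated chunks preserve the trace-before-answer ordering.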
Thanks for the PR. Looks good overall, I just need a bit of clarification on the json schema change. :)
if (request.output?.schema) {
  body.response_format = {
    type: 'json_schema',
    json_schema: {
      name: 'output',
      schema: request.output!.schema,
    },
  };
}
I could not find an API reference for json_schema for DeepSeek. Are you sure this works?
http://api-docs.deepseek.com/api/create-chat-completion#request
Hi @ssbushi, thanks for your review. You're right that DeepSeek doesn't support json_schema, so there is still a fallback to json_object. When integrating with DeepSeek, the user can set output.format without a JSON schema, e.g. ai.generate({output: {format: 'json'}});, and the request will fall back to
body.response_format = {
type: 'json_object',
};
I followed these documents when implementing the json_schema part:
LiteLLM
OpenAI
xAI
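Putting both branches of this thread together, the response_format selection can be sketched roughly as follows (a simplified, hypothetical version of the logic in toOpenAIRequestBody, not the plugin's exact code):

```typescript
// Sketch of the response_format selection discussed above: prefer strict
// json_schema when an output schema is provided, otherwise fall back to
// plain json_object for providers like DeepSeek that lack json_schema.
// `selectResponseFormat` is illustrative, not a real plugin export.
type ResponseFormat =
  | { type: 'json_object' }
  | { type: 'json_schema'; json_schema: { name: string; schema: unknown } };

function selectResponseFormat(output?: {
  format?: string;
  schema?: unknown;
}): ResponseFormat | undefined {
  if (output?.schema) {
    // A schema was provided: request strict structured output.
    return {
      type: 'json_schema',
      json_schema: { name: 'output', schema: output.schema },
    };
  }
  if (output?.format === 'json') {
    // JSON requested without a schema: plain JSON mode.
    return { type: 'json_object' };
  }
  // No structured output requested.
  return undefined;
}
```

With this shape, ai.generate({output: {format: 'json'}}) maps to json_object, while passing an output schema maps to json_schema.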
Note: OpenAI itself does not return reasoning_content, but OpenAI-compatible providers do return reasoning_content when using a reasoning model.
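For context, a reasoning-model message from an OpenAI-compatible provider looks roughly like this (illustrative field values, not a captured response):

```typescript
// Illustrative shape of a chat completion choice from an OpenAI-compatible
// reasoning model (e.g. DeepSeek). The values are made up; only the field
// layout matters here.
const choice = {
  index: 0,
  message: {
    role: 'assistant',
    // Present only on reasoning models; standard OpenAI chat
    // completion responses omit this field entirely.
    reasoning_content: 'Compare the two fractions by a common denominator...',
    content: 'The answer is 7/12.',
  },
  finish_reason: 'stop',
};
```

The PR's fromOpenAIChoice change reads both fields, emitting the reasoning trace as its own part ahead of the text part.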