Update to OpenAI 2.6.0 #6996
Conversation
Pull Request Overview
This pull request adds comprehensive support for function result content types in the OpenAI Responses API client and enables per-request model ID overrides across all OpenAI clients. The implementation includes extensive test coverage for various content types in tool call results.
Key changes:
- Added support for AIContent types (TextContent, DataContent, UriContent, HostedFileContent) as function results in the Responses API
- Implemented per-request ModelId override capability in ChatOptions for the Chat Completion, Responses, and Embedding clients (see the sketch below)
- Updated to OpenAI SDK 2.6.0 and added a HasTopLevelMediaType method to HostedFileContent
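As a rough illustration of the per-request model override, here is a minimal sketch using the AsIChatClient adapter from Microsoft.Extensions.AI.OpenAI; the model names and API-key lookup are placeholders, not taken from this PR:

```csharp
using System;
using Microsoft.Extensions.AI;
using OpenAI;

// The client is created with a default model ("gpt-4o-mini" here is a placeholder),
// but ChatOptions.ModelId overrides it for an individual request.
IChatClient client = new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient();

ChatResponse response = await client.GetResponseAsync(
    "Summarize the release notes.",
    new ChatOptions { ModelId = "gpt-4o" }); // used for this call instead of the client's default

Console.WriteLine(response.Text);
```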
Reviewed Changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| OpenAIResponsesChatClient.cs | Adds serialization logic for AIContent types in tool call results, implements model override via JsonPatch, and reorders annotation handling |
| OpenAIClientExtensions.cs | Adds PatchModelIfNotSet helper method for setting model in JsonPatch when not already set |
| OpenAIChatClient.cs | Adds model override support using PatchModelIfNotSet |
| OpenAIEmbeddingGenerator.cs | Adds model override support using PatchModelIfNotSet |
| OpenAIJsonContext.cs | Adds JSON serialization support for FunctionToolCallOutputElement list |
| HostedFileContent.cs | Adds HasTopLevelMediaType method to check the media type prefix (see the sketch after this table) |
| OpenAIResponseClientTests.cs | Adds 33 new unit tests covering model overrides, tool result content types, and edge cases |
| OpenAIResponseClientIntegrationTests.cs | Adds 5 integration tests for tool call results with various content types |
| OpenAIChatClientTests.cs | Adds 2 unit tests for model override in streaming and non-streaming scenarios |
| Microsoft.Extensions.AI.OpenAI.csproj | Adds SCME0001 warning suppression and removes unused injector properties |
| General.props | Updates OpenAI package version to 2.6.0 |
| CHANGELOG.md files | Documents new features and fixes |
| API baseline file | Adds new HasTopLevelMediaType API to manifest |
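For context on the HostedFileContent change called out above, a minimal usage sketch follows. It assumes the Name/MediaType setters added in the earlier PR and that the new method matches the top-level media type prefix the same way its DataContent/UriContent counterparts do:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical file id and media type; HasTopLevelMediaType checks only the
// "type/" prefix of the media type, mirroring DataContent and UriContent.
var file = new HostedFileContent("file-abc123")
{
    Name = "chart.png",
    MediaType = "image/png",
};

bool isImage = file.HasTopLevelMediaType("image"); // true for "image/png"
bool isAudio = file.HasTopLevelMediaType("audio"); // false
```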
In addition to updating the dependency, the new version enables several additional improvements:
- Propagate ChatOptions/EmbeddingGenerationOptions.ModelId to allow overriding the model per request.
- Propagate AIContent results of tool invocations, so tools can return non-string results (see the sketch below).
- Add support for container file annotations.
- Fix handling of GetResponse{Streaming}Async to use RequestOptions.
- Remove hacky serialization code around ResponseCreationOptions.
A previous PR added HostedFileContent.Name/MediaType, but we missed adding HasTopLevelMediaType (which both DataContent and UriContent have). I had a need for it here, so it is included as well.
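A minimal sketch of the non-string tool result scenario (the call id and content below are made up for illustration): a tool's result is sent back as AIContent items, which the Responses client now serializes into the corresponding function-result content parts instead of requiring a plain string.

```csharp
using Microsoft.Extensions.AI;

// Tool-role message carrying the result of a previously requested call
// ("call_123" is a placeholder). The result is a list of AIContent items
// rather than a plain string.
ChatMessage toolResult = new(ChatRole.Tool,
[
    new FunctionResultContent("call_123", new AIContent[]
    {
        new TextContent("Here is the requested chart:"),
        new DataContent(new byte[] { 0x89, 0x50, 0x4E, 0x47 }, "image/png"),
    }),
]);
```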
Force-pushed from 4da8c05 to 080abae:
* Update to OpenAI 2.6.0
* Add more tests based on code coverage gaps
* Fix handling of role in AsChatMessages
Also added some more unit tests to improve code coverage for the Responses chat client. This also fixes #6997.