Azure.RequestFailedException: 'response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later #6130
Comments
Needs investigation to find the root cause.
@RussKie Does this repro when not using MEAI, but using Azure.AI.Inference directly?
I tried with Azure.AI.OpenAI (2.0.0-2.2.0-beta.2), and I get no issues.

`.csproj`:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.1" />
    <PackageReference Include="Azure.AI.OpenAI" Version="2.2.0-beta.2" />
  </ItemGroup>

</Project>
```

`Program.cs`:

```csharp
// See https://aka.ms/new-console-template for more information
using System.Text.Json;
using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.Hosting;
using OpenAI.Chat;

Console.WriteLine("Hello, World!");

HostApplicationBuilder builder = Host.CreateApplicationBuilder();

AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!);
var azureOptions = new AzureOpenAIClientOptions();
var openAiClient = new AzureOpenAIClient(new Uri("https://models.inference.ai.azure.com"), credential, azureOptions);
var chatClient = openAiClient.GetChatClient("gpt-4o");

IHost app = builder.Build();

const string AgentPrompt = """
    Objective:
    You are an AI assistant helping to triage GitHub issues for the dotnet/extensions repository.
    Given a string which contains a GitHub issue:
    - summarise the content of the issue,
    - determine whether it is a bug, feature request, or question,
    - determine what actions are required from the team.

    Outputs:
    - In the Summary field, make sure you call out not just the summary of the issue body,
      but also a quick summary of the conversation in the issue if any.
    - In the Actions field, call out what actions are required for the issue. For example, if the issue is a bug,
      call out that the issue needs to be triaged and assigned to a team member.
      If the issue is a feature request, call out that the issue needs to be reviewed and prioritized.
    - In the Type field, call out whether the issue is a bug, feature request, or question.

    Output Format:
    {{
        "summary": "A summary of the issue body and conversation",
        "actions": "Actions required for the issue",
        "type": "bug" | "feature" | "question"
    }}
    """;

ChatMessage s_agentPrompt = ChatMessage.CreateSystemMessage(AgentPrompt);

string issueBody = "This is a bug report. The issue is that the code is not working as expected.";
ChatMessage request = ChatMessage.CreateUserMessage($"Here's a GitHub issue to summarise (serialized as json): \r\n\r\n{JsonSerializer.Serialize(new { Content = issueBody })}");

var response = await chatClient.CompleteChatAsync([s_agentPrompt, request]);
Console.Write(response.Value.Content[0].Text);
```
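One difference worth noting between the two client libraries: Azure.AI.OpenAI lets you pin the service API version explicitly on the client options, so the request does not depend on the library's default. The sketch below is an assumption for the beta package, in particular the exact `ServiceVersion` enum value name; verify it against the package version you reference.

```csharp
using Azure;
using Azure.AI.OpenAI;

// Assumption: the 2.x beta packages expose a ServiceVersion enum on
// AzureOpenAIClientOptions; pinning it makes the api-version explicit
// instead of relying on the library default.
var azureOptions = new AzureOpenAIClientOptions(
    AzureOpenAIClientOptions.ServiceVersion.V2024_08_01_Preview);

var openAiClient = new AzureOpenAIClient(
    new Uri("https://models.inference.ai.azure.com"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!),
    azureOptions);
```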
You're instantiating it in your original repro.

This is just using Azure.AI.Inference to repro the error:

```csharp
using Azure;
using Azure.AI.Inference;

var client = new ChatCompletionsClient(
    new Uri("https://models.inference.ai.azure.com"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AI:GitHub:ApiKey")!));

await client.CompleteAsync(new ChatCompletionsOptions()
{
    Model = "gpt-4o",
    Messages = [new ChatRequestUserMessage("Anything")],
    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat("MySchema",
        new Dictionary<string, BinaryData>
        {
            { "type", BinaryData.FromString("\"object\"") },
            { "properties", BinaryData.FromString("""{ "result": { "type": "string" } }""") },
            { "required", BinaryData.FromString("""["result"]""") },
            { "additionalProperties", BinaryData.FromString("false") }
        }),
});
```
Ugh, I couldn't see it. Thank you, @stephentoub! Here's a sample using Azure.AI.Inference/1.0.0-beta.3, and it is working without issues...

`.csproj`:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.1" />
    <PackageReference Include="Azure.AI.Inference" Version="1.0.0-beta.3" />
  </ItemGroup>

</Project>
```

`Program.cs`:

```csharp
// See https://aka.ms/new-console-template for more information
using System.ClientModel.Primitives;
using System.Text.Json;
using Azure;
using Azure.AI.Inference;
using Microsoft.Extensions.Hosting;

Console.WriteLine("Hello, World!");

HostApplicationBuilder builder = Host.CreateApplicationBuilder();

AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!);
var azureOptions = new AzureAIInferenceClientOptions();
var chatClient = new ChatCompletionsClient(new Uri("https://models.inference.ai.azure.com"), credential, azureOptions);

IHost app = builder.Build();

const string AgentPrompt = """
    Objective:
    You are an AI assistant helping to triage GitHub issues for the dotnet/extensions repository.
    Given a string which contains a GitHub issue:
    - summarise the content of the issue,
    - determine whether it is a bug, feature request, or question,
    - determine what actions are required from the team.

    Outputs:
    - In the Summary field, make sure you call out not just the summary of the issue body,
      but also a quick summary of the conversation in the issue if any.
    - In the Actions field, call out what actions are required for the issue. For example, if the issue is a bug,
      call out that the issue needs to be triaged and assigned to a team member.
      If the issue is a feature request, call out that the issue needs to be reviewed and prioritized.
    - In the Type field, call out whether the issue is a bug, feature request, or question.

    Output Format:
    {{
        "summary": "A summary of the issue body and conversation",
        "actions": "Actions required for the issue",
        "type": "bug" | "feature" | "question"
    }}
    """;

ChatRequestSystemMessage s_agentPrompt = new(AgentPrompt);

string issueBody = "This is a bug report. The issue is that the code is not working as expected.";
ChatRequestUserMessage request = new($"Here's a GitHub issue to summarise (serialized as json): \r\n\r\n{JsonSerializer.Serialize(new { Content = issueBody })}");

var requestOptions = new ChatCompletionsOptions()
{
    Model = "gpt-4o",
    Messages =
    {
        s_agentPrompt,
        request,
    },
};

Response<ChatCompletions> response = chatClient.Complete(requestOptions);
Console.WriteLine(response.Value.Content);
```

[UPDATE]: The following sample, however, shows the same error as described in the OT:

```diff
@@ -60,7 +60,15 @@ var requestOptions = new ChatCompletionsOptions()
     {
         s_agentPrompt,
         request,
     },
+    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat("MySchema",
+        new Dictionary<string, BinaryData>
+        {
+            { "type", BinaryData.FromString("\"object\"") },
+            { "properties", BinaryData.FromString("""{ "result": { "type": "string" } }""") },
+            { "required", BinaryData.FromString("""["result"]""") },
+            { "additionalProperties", BinaryData.FromString("false") }
+        }),
 };
 Response<ChatCompletions> response = chatClient.Complete(requestOptions);
 Console.WriteLine(response.Value.Content);
```

Kudos to @stephentoub for the corrections.
Right, because you didn't set the ResponseFormat property. If you set that, like I did in my example, then it fails, because the API version it's talking to doesn't support structured outputs.
@RussKie -- To confirm my understanding: can this be closed, with the cause being an issue in your consuming code?
From a layman's point of view, this feels like a problem with our libraries, but @stephentoub doesn't think it's a bug... I don't know :)
We think this should be addressed in Azure AI Inference. Nothing we can do about it in MEAI. I'll file an external issue with them. |
Description
I am unable to write a simple agent using https://models.inference.ai.azure.com as LLM.
This issue looks related to Azure/azure-sdk-for-net#46579.
Reproduction Steps
Expected behavior
It should just work (c)
Actual behavior
The sample fails with the following error:
Regression?
No response
Known Workarounds
The workaround suggested in Azure/azure-sdk-for-net#46579 (comment) by @V0v1kkk seems to overcome the issue:
That said, the error isn't clear, nor is finding the workaround straightforward. An average Igor developer may not be so lucky...
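For readers who land here: the workaround in that linked thread boils down to forcing a newer `api-version` query parameter onto every request via a custom pipeline policy, since the Azure.AI.Inference client options (as of 1.0.0-beta.3) don't expose a service version new enough for json_schema. The sketch below is a reconstruction under assumptions, not the authoritative snippet from the linked comment; the policy class name is illustrative, and the "2024-08-01-preview" string is taken from the error message.

```csharp
using Azure;
using Azure.AI.Inference;
using Azure.Core;
using Azure.Core.Pipeline;

// Register the policy per-call so every request carries the newer api-version.
var options = new AzureAIInferenceClientOptions();
options.AddPolicy(new OverrideApiVersionPolicy("2024-08-01-preview"), HttpPipelinePosition.PerCall);

var client = new ChatCompletionsClient(
    new Uri("https://models.inference.ai.azure.com"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!),
    options);

// Rewrites the api-version query parameter on each outgoing request.
// Note this replaces the whole query string, which is fine here because
// the client sends api-version as its only query parameter.
internal sealed class OverrideApiVersionPolicy : HttpPipelineSynchronousPolicy
{
    private readonly string _apiVersion;

    public OverrideApiVersionPolicy(string apiVersion) => _apiVersion = apiVersion;

    public override void OnSendingRequest(HttpMessage message) =>
        message.Request.Uri.Query = $"?api-version={_apiVersion}";
}
```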
Configuration
No response
Other information
No response