
Azure.RequestFailedException: 'response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later #6130

Open
RussKie opened this issue Mar 17, 2025 · 12 comments
Assignees
Labels
area-ai Microsoft.Extensions.AI libraries untriaged

Comments

@RussKie
Member

RussKie commented Mar 17, 2025

Description

I am unable to write a simple agent using https://models.inference.ai.azure.com as the LLM endpoint.
This issue looks related to Azure/azure-sdk-for-net#46579.

Reproduction Steps

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.1" />
    <PackageReference Include="Microsoft.Extensions.AI" Version="9.3.0-preview.1.25161.3" />
    <PackageReference Include="Microsoft.Extensions.AI.AzureAIInference" Version="9.3.0-preview.1.25161.3" />
  </ItemGroup>

</Project>
using System.Text.Json;
using Azure;
using Azure.AI.Inference;
using Azure.Core;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using ChatRole = Microsoft.Extensions.AI.ChatRole;

HostApplicationBuilder builder = Host.CreateApplicationBuilder();


AzureAIInferenceClientOptions options = new()
{
    Diagnostics =
    {
        IsLoggingContentEnabled = true
    },
    Retry =
    {
        MaxRetries = 3,
        Delay = TimeSpan.FromSeconds(2),
        MaxDelay = TimeSpan.FromSeconds(10),
        Mode = RetryMode.Exponential
    }
};
AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!);
builder.Services.AddSingleton(new ChatCompletionsClient(new Uri("https://models.inference.ai.azure.com"), credential, options));

builder.Services.AddChatClient(services => services.GetRequiredService<ChatCompletionsClient>().AsChatClient("gpt-4o"))
    .UseLogging();

IHost app = builder.Build();

IChatClient chatClient = app.Services.GetRequiredService<IChatClient>();

const string AgentPrompt = """
            Objective: 
            
            You are an AI assistant helping to triage GitHub issues for the dotnet/extensions repository.
            Given a string which contains a GitHub issue:
            
            - summarise the content of the issue, 
            - determine whether it is a bug, feature request, or question,
            - determine what actions are required from the team.
            

            Outputs:

            - In the Summary field, make sure you call out not just the summary of the issue body, 
              but also a quick summary of the conversation in the issue if any.

            - In the Actions field, call out what actions are required for the issue. For example, if the issue is a bug, 
              call out that the issue needs to be triaged and assigned to a team member.
              If the issue is a feature request, call out that the issue needs to be reviewed and prioritized.

            - In the Type field, call out whether the issue is a bug, feature request, or question.

            Output Format:

            {{
                "summary": "A summary of the issue body and conversation",
                "actions": "Actions required for the issue",
                "type": "bug" | "feature" | "question"
            }}

            """;
ChatMessage s_agentPrompt = new ChatMessage(ChatRole.System, AgentPrompt);

string issueBody = "This is a bug report. The issue is that the code is not working as expected.";

ChatMessage request = new(ChatRole.User, $"Here's GitHub issue to summarise (serialized as json): \r\n\r\n{JsonSerializer.Serialize(new { Content = issueBody })}");
ChatResponse<string> response = await chatClient.GetResponseAsync<string>([s_agentPrompt, request], useNativeJsonSchema: true, cancellationToken: CancellationToken.None);

Expected behavior

It should just work (c)

Actual behavior

The sample fails with the following error:

Azure.RequestFailedException
  HResult=0x80131500
  Message=response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later
Status: 400 (Bad Request)
ErrorCode: BadRequest

Content:
{"error":{"code":"BadRequest","message":"response_format value as json_schema is enabled only for api versions 2024-08-01-preview and later"}}

  Source=Azure.AI.Inference
  StackTrace:
   at Azure.Core.HttpPipelineExtensions.<ProcessMessageAsync>d__0.MoveNext()
   at System.Runtime.CompilerServices.ConfiguredValueTaskAwaitable`1.ConfiguredValueTaskAwaiter.GetResult()
   at Azure.AI.Inference.ChatCompletionsClient.<CompleteAsync>d__5.MoveNext()

Regression?

No response

Known Workarounds

The workaround suggested in Azure/azure-sdk-for-net#46579 (comment) by @V0v1kkk seems to overcome the issue:

    AzureAIInferenceClientOptions options = new()
    {
        Diagnostics =
        {
            IsLoggingContentEnabled = true
        },
        Retry =
        {
            MaxRetries = 3,
            Delay = TimeSpan.FromSeconds(2),
            MaxDelay = TimeSpan.FromSeconds(10),
            Mode = RetryMode.Exponential
        }
    };
+   SetCustomVersion(options, "2024-08-01-preview");


+   // Requires: using System.Reflection;
+   void SetCustomVersion(AzureAIInferenceClientOptions options, string customVersion)
+   {
+       Type optionsType = typeof(AzureAIInferenceClientOptions);
+       FieldInfo? versionField = optionsType.GetField("<Version>k__BackingField", BindingFlags.Instance | BindingFlags.NonPublic);
+       if (versionField != null)
+       {
+           versionField.SetValue(options, customVersion);
+       }
+       else
+       {
+           throw new InvalidOperationException("Unable to find the Version backing field.");
+       }
+   }
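A reflection-free alternative worth considering is rewriting the `api-version` query parameter with an Azure.Core pipeline policy. This is only a sketch: the policy class name and the regex-based rewrite are my own, but `HttpPipelineSynchronousPolicy`, `HttpMessage.Request.Uri.Query`, and `ClientOptions.AddPolicy` are the real Azure.Core extension points.

```csharp
using System.Text.RegularExpressions;
using Azure.Core;
using Azure.Core.Pipeline;

// Hypothetical policy that overrides whatever api-version the client defaulted to.
internal sealed class ApiVersionOverridePolicy : HttpPipelineSynchronousPolicy
{
    private readonly string _apiVersion;

    public ApiVersionOverridePolicy(string apiVersion) => _apiVersion = apiVersion;

    public override void OnSendingRequest(HttpMessage message)
    {
        string? query = message.Request.Uri.Query;
        if (!string.IsNullOrEmpty(query) && query.Contains("api-version="))
        {
            // Rewrite the api-version query parameter in place.
            message.Request.Uri.Query = Regex.Replace(
                query, "api-version=[^&]*", $"api-version={_apiVersion}");
        }
    }
}
```

Usage would be `options.AddPolicy(new ApiVersionOverridePolicy("2024-08-01-preview"), HttpPipelinePosition.PerCall);` before constructing the `ChatCompletionsClient`.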

That said, the error message isn't clear, and the workaround isn't straightforward to find. An average developer may not be so lucky...
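Another possible mitigation at the M.E.AI layer, assuming `useNativeJsonSchema` behaves as its name suggests: passing `false` should stop `GetResponseAsync<T>` from sending `response_format: json_schema` (which the older API version rejects) and instead steer the model toward JSON via the prompt. A sketch against the call site in the repro above:

```csharp
// Sketch only: same call as in the repro, but opting out of the native
// json_schema response format. Assumption: M.E.AI then falls back to
// prompt-based JSON steering, which pre-2024-08-01-preview endpoints accept.
ChatResponse<string> response = await chatClient.GetResponseAsync<string>(
    [s_agentPrompt, request],
    useNativeJsonSchema: false,
    cancellationToken: CancellationToken.None);
```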

Configuration

No response

Other information

No response

@RussKie RussKie added area-ai Microsoft.Extensions.AI libraries bug This issue describes a behavior which is not expected - a bug. untriaged labels Mar 17, 2025
@jeffhandley
Member

Needs investigation to find root cause

@jeffhandley
Member

@RussKie Does this repro when not using MEAI, but using A.AI.Inference directly?

@jeffhandley jeffhandley assigned jeffhandley and jozkee and unassigned jeffhandley Mar 18, 2025
@jeffhandley
Member

Assigning to @jozkee for investigation. Let's see if @RussKie is able to indicate if this same problem occurs using Azure.AI.Inference directly.

@RussKie
Member Author

RussKie commented Mar 19, 2025

I tried with Azure.AI.OpenAI (2.0.0-2.2.0-beta.2) and got no issues:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.1" />
    <PackageReference Include="Azure.AI.OpenAI" Version="2.2.0-beta.2" />
  </ItemGroup>

</Project>
// See https://aka.ms/new-console-template for more information
using System.Text.Json;
using Azure;
using Azure.AI.OpenAI;
using Microsoft.Extensions.Hosting;
using OpenAI.Chat;

Console.WriteLine("Hello, World!");

HostApplicationBuilder builder = Host.CreateApplicationBuilder();

AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!);


var azureOptions = new AzureOpenAIClientOptions();
var openAiClient = new AzureOpenAIClient(new Uri("https://models.inference.ai.azure.com"), credential, azureOptions);
var chatClient = openAiClient.GetChatClient("gpt-4o");

IHost app = builder.Build();

const string AgentPrompt = """
            Objective: 
            
            You are an AI assistant helping to triage GitHub issues for the dotnet/extensions repository.
            Given a string which contains a GitHub issue:
            
            - summarise the content of the issue, 
            - determine whether it is a bug, feature request, or question,
            - determine what actions are required from the team.
            

            Outputs:

            - In the Summary field, make sure you call out not just the summary of the issue body, 
              but also a quick summary of the conversation in the issue if any.

            - In the Actions field, call out what actions are required for the issue. For example, if the issue is a bug, 
              call out that the issue needs to be triaged and assigned to a team member.
              If the issue is a feature request, call out that the issue needs to be reviewed and prioritized.

            - In the Type field, call out whether the issue is a bug, feature request, or question.

            Output Format:

            {{
                "summary": "A summary of the issue body and conversation",
                "actions": "Actions required for the issue",
                "type": "bug" | "feature" | "question"
            }}

            """;
ChatMessage s_agentPrompt = ChatMessage.CreateSystemMessage(AgentPrompt);

string issueBody = "This is a bug report. The issue is that the code is not working as expected.";

ChatMessage request = ChatMessage.CreateUserMessage($"Here's GitHub issue to summarise (serialized as json): \r\n\r\n{JsonSerializer.Serialize(new { Content = issueBody })}");
var response = await chatClient.CompleteChatAsync([s_agentPrompt, request]);
Console.Write(response.Value.Content[0].Text);


@stephentoub
Member

I can't figure out how to instantiate ChatCompletionsClient

You're instantiating it in your original repro.

builder.Services.AddSingleton(new ChatCompletionsClient(new Uri("https://models.inference.ai.azure.com"), credential, options));

I can't provide a sample

This is just using Azure.AI.Inference to repro the error:

using Azure;
using Azure.AI.Inference;

var client = new ChatCompletionsClient(
    new Uri("https://models.inference.ai.azure.com"), 
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AI:GitHub:ApiKey")!));

await client.CompleteAsync(new ChatCompletionsOptions()
{
    Model = "gpt-4o",
    Messages = [new ChatRequestUserMessage("Anything")],
    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat("MySchema",
        new Dictionary<string, BinaryData>
        {
            { "type", BinaryData.FromString("\"object\"") },
            { "properties", BinaryData.FromString("""{ "result": { "type": "string" } }""") },
            { "required", BinaryData.FromString("""["result"]""") },
            { "additionalProperties", BinaryData.FromString("false") }
        }),
});

@stephentoub stephentoub removed the bug This issue describes a behavior which is not expected - a bug. label Mar 19, 2025
@RussKie
Member Author

RussKie commented Mar 19, 2025

Ugh, I couldn't see the Model property on ChatCompletionsOptions. Hitting F12, the property wasn't showing up...

Thank you, @stephentoub !

Here's a sample using Azure.AI.Inference/1.0.0-beta.3, and it is working without issues...

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.1" />
    <PackageReference Include="Azure.AI.Inference" Version="1.0.0-beta.3" />
  </ItemGroup>

</Project>
// See https://aka.ms/new-console-template for more information
using System.ClientModel.Primitives;
using System.Text.Json;
using Azure;
using Azure.AI.Inference;
using Microsoft.Extensions.Hosting;

Console.WriteLine("Hello, World!");

HostApplicationBuilder builder = Host.CreateApplicationBuilder();

AzureKeyCredential credential = new(Environment.GetEnvironmentVariable("GITHUB_TOKEN")!);


var azureOptions = new AzureAIInferenceClientOptions();
var chatClient = new ChatCompletionsClient(new Uri("https://models.inference.ai.azure.com"), credential, azureOptions);

IHost app = builder.Build();

const string AgentPrompt = """
            Objective: 
            
            You are an AI assistant helping to triage GitHub issues for the dotnet/extensions repository.
            Given a string which contains a GitHub issue:
            
            - summarise the content of the issue, 
            - determine whether it is a bug, feature request, or question,
            - determine what actions are required from the team.
            

            Outputs:

            - In the Summary field, make sure you call out not just the summary of the issue body, 
              but also a quick summary of the conversation in the issue if any.

            - In the Actions field, call out what actions are required for the issue. For example, if the issue is a bug, 
              call out that the issue needs to be triaged and assigned to a team member.
              If the issue is a feature request, call out that the issue needs to be reviewed and prioritized.

            - In the Type field, call out whether the issue is a bug, feature request, or question.

            Output Format:

            {{
                "summary": "A summary of the issue body and conversation",
                "actions": "Actions required for the issue",
                "type": "bug" | "feature" | "question"
            }}

            """;
ChatRequestSystemMessage s_agentPrompt = new(AgentPrompt);

string issueBody = "This is a bug report. The issue is that the code is not working as expected.";
ChatRequestUserMessage request = new($"Here's GitHub issue to summarise (serialized as json): \r\n\r\n{JsonSerializer.Serialize(new { Content = issueBody })}");

var requestOptions = new ChatCompletionsOptions()
{
    Model = "gpt-4o",
    Messages =
    {
        s_agentPrompt,
        request,
    }, 
};
Response<ChatCompletions> response = chatClient.Complete(requestOptions);
Console.WriteLine(response.Value.Content);

[UPDATE]:

The following sample, however, fails with the same error as described in the original post:

@@ -60,7 +60,15 @@ var requestOptions = new ChatCompletionsOptions()
     {
         s_agentPrompt,
         request,
     }, 
+    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat("MySchema",
+        new Dictionary<string, BinaryData>
+        {
+            { "type", BinaryData.FromString("\"object\"") },
+            { "properties", BinaryData.FromString("""{ "result": { "type": "string" } }""") },
+            { "required", BinaryData.FromString("""["result"]""") },
+            { "additionalProperties", BinaryData.FromString("false") }
+        }),
 };
 Response<ChatCompletions> response = chatClient.Complete(requestOptions);
 Console.WriteLine(response.Value.Content);

Kudos to @stephentoub for corrections.

@stephentoub
Member

and it is working without issues..

Right, because you didn't set the ResponseFormat property. If you set that, like I did in my example, then it fails, because the API version it's talking to doesn't support structured outputs.

@RussKie
Member Author

RussKie commented Mar 19, 2025

Yes, you're absolutely right:

(screenshot omitted)

I updated the sample above.

@jeffhandley
Member

@RussKie -- To confirm my understanding: can this be closed, with the cause being an issue in your consuming code?

@RussKie
Member Author

RussKie commented Mar 24, 2025

From a layman's point of view, this feels like a problem with our libraries, but @stephentoub doesn't think it's a bug... I don't know :)
If you don't think it's a bug or is a bug worth fixing, then feel free to close.

@jeffhandley
Member

We think this should be addressed in Azure AI Inference. Nothing we can do about it in MEAI. I'll file an external issue with them.
