@@ -11,6 +11,9 @@ public static partial class AIOpenAIModelFactory
}
public static partial class AzureOpenAIModelFactory
{
public static Azure.AI.OpenAI.ChatChoice ChatChoice(Azure.AI.OpenAI.ChatMessage message = null, int index = 0, Azure.AI.OpenAI.CompletionsFinishReason reason = default(Azure.AI.OpenAI.CompletionsFinishReason)) { throw null; }
Member:
Do we know the reason why these are not being auto-generated?

Contributor Author:
It would appear that having some custom code for these types is suppressing the auto-generation.

Member:
That does seem to be the case; there are at least some interesting inconsistencies that crop up. For example, in the big PR I just published for all the new capabilities, I don't think we got any auto-generated coverage for Chat Functions and only limited coverage for Image Generation: https://github.com/Azure/azure-sdk-for-net/pull/37539/files#diff-32e4c78ddf8afea7d023329cfde10e80cc9c1151343b23db73a8a4c9baa41e11

Member:
I see. I suspect that the root cause of this is the properties on these classes that have been customized to be internal. That makes sense, because a user wouldn't know what these values are or how to use them, but the generator cannot simply ignore them either.
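For context, the kind of customization in question looks roughly like this (a sketch that assumes the property names discussed below, not the literal contents of the customization file):

public partial class ChatCompletions
{
    // Hand-written customization: the generated int32 "created" value is hidden behind an
    // internal property and surfaced to users as a DateTimeOffset instead.
    internal int InternalCreatedSecondsAfterUnixEpoch { get; }

    public DateTimeOffset Created
        => DateTimeOffset.FromUnixTimeSeconds(InternalCreatedSecondsAfterUnixEpoch);
}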

Let's try a few things:
1️⃣ For the ChatCompletions class, we can remove the customization entirely now that this bug in the generator has been fixed:
🔗 Azure/autorest.csharp#3492

The fix must be done in the definition of the ChatCompletions model in TypeSpec, where we have the "created" property defined as an int32 that we manually convert to a DateTimeOffset in .NET:

@projectedName("json", "created")
@projectedName("csharp", "InternalCreatedSecondsAfterUnixEpoch")
created: int32;

We should re-define the above as a utcDateTime and specify that it is encoded as a Unix timestamp using the @encode decorator like this:

@projectedName("json", "created")
@encode(DateTimeKnownEncoding.unixTimestamp, int32)
created: utcDateTime;

We can then delete the custom ChatCompletions.cs and re-generate the library. This will correctly auto-generate the Created property of the ChatCompletions class as a DateTimeOffset, and it will also auto-generate the model factory method that we're trying to add manually as part of this PR.
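For reference, assuming the regeneration goes as expected, the result should look roughly like this (a sketch based on the API listing in this PR rather than actual generator output; the factory class placement is an assumption):

public partial class ChatCompletions
{
    // Generated directly as a DateTimeOffset once the TypeSpec property is a utcDateTime
    // encoded as a Unix timestamp; no hand-written conversion is needed anymore.
    public System.DateTimeOffset Created { get { throw null; } }
}
public static partial class AIOpenAIModelFactory
{
    // Generated model factory method, making the manual one in this PR unnecessary.
    public static Azure.AI.OpenAI.ChatCompletions ChatCompletions(string id = null, System.DateTimeOffset created = default(System.DateTimeOffset), System.Collections.Generic.IEnumerable<Azure.AI.OpenAI.ChatChoice> choices = null, Azure.AI.OpenAI.CompletionsUsage usage = null) { throw null; }
}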

2️⃣ The "created" property of the Completions model in the TypeSpec has the exact same problem, and the fix is the same: change the property to a utcDateTime with @encode(DateTimeKnownEncoding.unixTimestamp, int32) and re-generate.

3️⃣ The case with the ChatChoice class is a little trickier because customizing the property as internal is by design. This means we actually have to write this model factory method manually, but I have a few suggestions on how to do it:
First, we only need one model factory method per class, and it should have parameters for all of the properties. In other words, we can remove the second method for ChatChoice (see line 15), which does not have the ChatMessage parameter.
Second, I believe we also need model factory methods for the streaming classes; in particular, one for StreamingChatChoice, which is where that internal InternalStreamingDeltaMessage property is used (a rough sketch follows below). For more info on how to write model factories, check out our guidelines:
🔗 https://azure.github.io/azure-sdk/dotnet_introduction.html#dotnet-mocking-factory-builder
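As a rough illustration of the streaming case (only a sketch: the StreamingChatChoice constructor shape is an assumption here and would need to match the generated and internal members):

/// <summary> Initializes a new instance of StreamingChatChoice. </summary>
/// <returns> A new <see cref="OpenAI.StreamingChatChoice"/> instance for mocking. </returns>
public static StreamingChatChoice StreamingChatChoice(ChatMessage message = null, int index = default, CompletionsFinishReason finishReason = default)
{
    // Hypothetical constructor call: the real overload would need to populate the internal
    // InternalStreamingDeltaMessage property (or equivalent) from the supplied message.
    return new StreamingChatChoice(message, index, finishReason);
}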

public static Azure.AI.OpenAI.ChatChoice ChatChoice(int index = 0, Azure.AI.OpenAI.CompletionsFinishReason finishReason = default(Azure.AI.OpenAI.CompletionsFinishReason)) { throw null; }
public static Azure.AI.OpenAI.ChatCompletions ChatCompletions(string id = null, System.DateTimeOffset created = default(System.DateTimeOffset), System.Collections.Generic.IEnumerable<Azure.AI.OpenAI.ChatChoice> choices = null, Azure.AI.OpenAI.CompletionsUsage usage = null) { throw null; }
public static Azure.AI.OpenAI.Choice Choice(string text = null, int index = 0, Azure.AI.OpenAI.CompletionsLogProbabilityModel logProbabilityModel = null, Azure.AI.OpenAI.CompletionsFinishReason finishReason = default(Azure.AI.OpenAI.CompletionsFinishReason)) { throw null; }
}
public partial class ChatChoice
3 changes: 3 additions & 0 deletions sdk/openai/Azure.AI.OpenAI/src/Azure.AI.OpenAI.csproj
@@ -24,5 +24,8 @@
<Compile Include="$(AzureCoreSharedSources)AzureKeyCredentialPolicy.cs" LinkBase="Shared" />
<Compile Include="$(AzureCoreSharedSources)AzureResourceProviderNamespaceAttribute.cs" LinkBase="Shared" />
</ItemGroup>
<ItemGroup>
<None Include="..\tsp-location.yaml" Link="tsp-location.yaml" />
</ItemGroup>

</Project>
@@ -1,13 +1,10 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

// <auto-generated/>

#nullable disable

using System;
using System.Collections.Generic;
using System.Linq;

namespace Azure.AI.OpenAI
{
@@ -31,5 +28,46 @@ public static Choice Choice(string text = null, int index = default, Completions

return new Choice(text, index, logProbabilityModel, finishReason);
}

/// <summary> Initializes a new instance of ChatChoice. </summary>
/// <param name="index"> The ordered index associated with this chat completions choice. </param>
/// <param name="finishReason"> The reason that this chat completions choice completed its generation. </param>
/// <returns> A new <see cref="OpenAI.ChatChoice"/> instance for mocking. </returns>
public static ChatChoice ChatChoice(int index = default, CompletionsFinishReason finishReason = default)
{
return new ChatChoice(index, finishReason);
}

/// <summary> Initializes a new instance of ChatChoice. </summary>
/// <param name="message"> The chat message for a given chat completions prompt. </param>
/// <param name="index"> The ordered index associated with this chat completions choice. </param>
/// <param name="reason"> The reason that this chat completions choice completed its generation. </param>
/// <returns> A new <see cref="OpenAI.ChatChoice"/> instance for mocking. </returns>
public static ChatChoice ChatChoice(ChatMessage message = null, int index = default, CompletionsFinishReason reason = default)
{
return new ChatChoice(message, index, reason, null);
}

/// <summary> Initializes a new instance of ChatCompletions. </summary>
/// <param name="id"> A unique identifier associated with this chat completions response. </param>
/// <param name="created"> The first timestamp associated with generation activity for this completions response. </param>
/// <param name="choices"> The collection of completions choices associated with this completions response. </param>
/// <param name="usage"> Usage information for tokens processed and generated as part of this completions operation. </param>
/// <returns> A new <see cref="OpenAI.ChatCompletions"/> instance for mocking. </returns>
public static ChatCompletions ChatCompletions(string id = null, DateTimeOffset created = default(DateTimeOffset), IEnumerable<ChatChoice> choices = null, CompletionsUsage usage = null)
{
choices ??= new List<ChatChoice>();
usage ??= AIOpenAIModelFactory.CompletionsUsage();

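// The Created value is stored internally as whole seconds since the Unix epoch in an int field,
// so clamp the converted value to the int range before the narrowing cast.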
long constrainedUnixTimeInSec = Math.Max(
Math.Min(created.ToUnixTimeSeconds(), int.MaxValue),
int.MinValue);

return new ChatCompletions(
id: id ?? string.Empty,
internalCreatedSecondsAfterUnixEpoch: (int)constrainedUnixTimeInSec,
choices: choices,
usage: usage);
}
}
}
@@ -3,7 +3,7 @@
<TargetFrameworks>$(RequiredTargetFrameworks)</TargetFrameworks>

<!-- We don't care about XML doc comments on test types and members -->
<NoWarn>$(NoWarn);CS1591</NoWarn>
<NoWarn>$(NoWarn);CS1591;SA1508;SA1507;SA1505</NoWarn>
</PropertyGroup>

<!-- Reference the Client Library -->
@@ -0,0 +1,97 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

using System;
using System.Linq;
using NUnit.Framework;

namespace Azure.AI.OpenAI.Tests
{
[TestFixture]
public class OpenAIInferenceModelFactoryTests
{
[Test]
public void TestCompletionsLogProbabilityModel()
{
var logProbabilityModel = AIOpenAIModelFactory.CompletionsLogProbabilityModel(
new[] { "one", "two" },
new float?[] { 0.9f, 0.72f });
Assert.That(logProbabilityModel, Is.Not.Null);
Assert.That(logProbabilityModel.Tokens.Count, Is.EqualTo(2));
Assert.That(logProbabilityModel.Tokens[0], Is.EqualTo("one"));
Assert.That(logProbabilityModel.Tokens[1], Is.EqualTo("two"));
Assert.That(logProbabilityModel.TokenLogProbabilities.Count, Is.EqualTo(2));
Assert.That(logProbabilityModel.TokenLogProbabilities[0], Is.EqualTo(0.9F).Within(2).Percent);
Assert.That(logProbabilityModel.TokenLogProbabilities[1], Is.EqualTo(0.72F).Within(2).Percent);
Assert.That(logProbabilityModel.TopLogProbabilities, Is.Empty);
Assert.That(logProbabilityModel.TextOffsets, Is.Empty);
}

[Test]
public void TestChatChoices()
{
var expectedChoices = new[]
{
new { role = ChatRole.Assistant, text = "First one", index = 0, reason = CompletionsFinishReason.ContentFiltered },
new { role = ChatRole.System, text = "Second one", index = -1, reason = CompletionsFinishReason.Stopped },
new { role = ChatRole.User, text = "Final one", index = 3, reason = CompletionsFinishReason.TokenLimitReached },
};

var chatChoices = expectedChoices
.Select(e => AzureOpenAIModelFactory.ChatChoice(
new ChatMessage(e.role, e.text),
e.index,
e.reason))
.ToArray();
Assert.That(chatChoices, Is.All.Not.Null);

for (int i = 0; i < chatChoices.Length; i++)
{
var actual = chatChoices[i];
var expected = expectedChoices[i];

Assert.That(actual.Message, Is.Not.Null);
Assert.That(actual.Message.Role, Is.EqualTo(expected.role));
Assert.That(actual.Message.Content, Is.EqualTo(expected.text));
Assert.That(actual.Index, Is.EqualTo(expected.index));
Assert.That(actual.FinishReason, Is.EqualTo(expected.reason));
}
}

[Test]
public void TestChatCompletions()
{
string expectedId = Guid.NewGuid().ToString();
DateTimeOffset expectedCreationTime = DateTimeOffset.Now;

var expectedChoices = new[]
{
new { role = ChatRole.Assistant, text = "First one", index = 0, reason = CompletionsFinishReason.ContentFiltered },
new { role = ChatRole.System, text = "Second one", index = -1, reason = CompletionsFinishReason.Stopped },
new { role = ChatRole.User, text = "Final one", index = 3, reason = CompletionsFinishReason.TokenLimitReached },
};

var chatChoices = expectedChoices
.Select(e => AzureOpenAIModelFactory.ChatChoice(
new ChatMessage(e.role, e.text),
e.index,
e.reason))
.ToArray();

var chatCompletions = AzureOpenAIModelFactory.ChatCompletions(
expectedId,
expectedCreationTime,
chatChoices,
AIOpenAIModelFactory.CompletionsUsage(2, 5, 7));

Assert.That(chatCompletions, Is.Not.Null);
Assert.That(chatCompletions.Id, Is.EqualTo(expectedId));
Assert.That(chatCompletions.Created, Is.EqualTo(expectedCreationTime).Within(TimeSpan.FromSeconds(1))); // Internally we use Unix time with second precision
Assert.That(chatCompletions.Choices, Is.EquivalentTo(chatChoices));
Assert.That(chatCompletions.Usage, Is.Not.Null);
Assert.That(chatCompletions.Usage.CompletionTokens, Is.EqualTo(2));
Assert.That(chatCompletions.Usage.PromptTokens, Is.EqualTo(5));
Assert.That(chatCompletions.Usage.TotalTokens, Is.EqualTo(7));
}
}
}