
Commit f9169e7

Authored by DavidLuong98, LittleLittleCloud, mhensen, ekzhu, and Krzysztof318
[.NET] Return ChatCompletions instead of ChatResponseMessage for token usage. (microsoft#2545)
* update
* update
* update
* update
* update
* add sample project
* revert notebook change back
* update
* update interactive version
* add nuget package
* refactor Message
* update example
* add azure nightly build pipeline
* Set up CI with Azure Pipelines [skip ci]
* Update nightly-build.yml for Azure Pipelines
* add dotnet interactive package
* add dotnet interactive package
* update pipeline
* add nuget feed back
* remove dotnet-tool feed
* remove dotnet-tool feed comment
* update pipeline
* update build name
* Update nightly-build.yml
* Delete .github/workflows/dotnet-ci.yml
* update
* add working_dir to use step
* add initiateChat api
* update oai package
* Update dotnet-build.yml
* Update dotnet-run-openai-test-and-notebooks.yml
* update build workflow
* update build workflow
* update nuget feed
* update nuget feed
* update aoai and sk version
* Update InteractiveService.cs
* add support for GPT 4V
* add DalleAndGPT4V example
* update example
* add user proxy agent
* add readme
* bump version
* update example
* add dotnet interactive hook
* update
* update tests
* add website
* update index.md
* add docs
* update doc
* move sk dependency out of core package
* update doc
* Update Use-function-call.md
* add type safe function call document
* update doc
* update doc
* add doc
* Update Use-function-call.md
* add GenerateReplyOptions
* remove IChatLLM
* update version
* update doc
* update website
* add sample
* fix link
* add middleware agent
* clean up doc
* bump version
* update doc
* update
* add Other Language
* remove warnings
* add sign.props
* add sign step
* fix pipeline
* auth
* real sign
* disable PR trigger
* update
* disable PR trigger
* use microbuild machine
* update build pipeline to add publish to internal feed
* add internal feed
* fix build pipeline
* add dotnet prefix
* update ci
* add build number
* update run number
* update source
* update token
* update
* remove adding source
* add publish to github package
* try again
* try again
* ask for write package
* disable package when branch is not main
* update
* implement streaming agent
* add test for streaming function call
* update
* fix microsoft#1588
* enable PR check for dotnet branch
* add website readme
* only publish to dotnet feed when pushing to dotnet branch
* remove openai-test-and-notebooks workflow
* update readme
* update readme
* update workflow
* update getting-start
* upgrade test and sample project to use .net 8
* fix global.json format && make loadFromConfig API internal only before implementing
* update
* add support for LM studio
* add doc
* Update README.md
* add push and workflow_dispatch trigger
* disable PR for main
* add dotnet env
* Update Installation.md
* add nuget
* refer to newtonsoft 13
* update branch to dotnet in docfx
* Update Installation.md
* pull out HumanInputMiddleware and FunctionCallMiddleware
* fix tests
* add link to sample folder
* refactor message
* refactor over IMessage
* add more tests
* add more test
* fix build error
* rename header
* add semantic kernel project
* update sk example
* update dotnet version
* add LMStudio function call example
* rename LLaMAFunctin
* remove dotnet run openai test and notebook workflow
* add FunctionContract and test
* update doc
* add documents
* add workflow
* update
* update sample
* fix warning in test
* result length can be less than maximumOutputToKeep (microsoft#1804)
* merge with main
* add option to retrieve inner agent and middlewares from MiddlewareAgent
* update doc
* adjust namespace
* update readme
* fix test
* use IMessage
* more updates
* update
* fix test
* add comments
* use FunctionContract to replace FunctionDefinition
* move AutoGen contract to AutoGen.Core
* update installation
* refactor streamingAgent by adding StreamingMessage type
* update sample
* update samples
* update
* update
* add test
* fix test
* bump version
* add openaichat test
* update
* Update Example03_Agent_FunctionCall.cs
* [.Net] improve docs (microsoft#1862)
* add doc
* add doc
* add doc
* add doc
* add doc
* add doc
* update
* fix test error
* fix some error
* fix test
* fix test
* add more tests
* edits

---------

Co-authored-by: ekzhu <[email protected]>

* [.Net] Add fill form example (microsoft#1911)
* add form filler example
* update
* fix ci error
* [.Net] Add using AutoGen.Core in source generator (microsoft#1983)
* fix using namespace bug in source generator
* remove using in sourcegenerator test
* disable PR test
* Add .idea to .gitignore (microsoft#1988)
* [.Net] publish to nuget.org feed (microsoft#1987)
* publish to nuget
* update ci
* update dotnet-release
* update release pipeline
* add source
* remove empty symbol package
* update pipeline
* remove tag
* update installation guide
* [.Net] Rename some classes && APIs based on doc review (microsoft#1980)
* rename sequential group chat to round robin group chat
* rename to sendInstruction
* rename workflow to graph
* rename some api
* bump version
* move Graph to GroupChat folder
* rename fill application example
* [.Net] Improve package description (microsoft#2161)
* add discord link and update package description
* Update getting-start.md
* [.Net] Fix document comment from the most recent AutoGen.Net engineer sync (microsoft#2231)
* update
* rename RegisterPrintMessageHook to RegisterPrintMessage
* update website
* update update.md
* fix link error
* [.Net] Enable JsonMode and deterministic output in AutoGen.OpenAI OpenAIChatAgent (microsoft#2347)
* update openai version && add sample for json output
* add example in web
* update update.md
* update image url
* [.Net] Add AutoGen.Mistral package (microsoft#2330)
* add mistral client
* enable streaming support
* add mistralClientAgent
* add test for function call
* add extension
* add support for toolcall and toolcall result message
* add support for aggregate message
* implement streaming function call
* track (microsoft#2471)
* [.Net] add mistral example (microsoft#2482)
* update existing examples to use messageConnector
* add overview
* add function call document
* add example 14
* add mistral token count usage example
* update version
* Update dotnet-release.yml (microsoft#2488)
* update
* revert gitattributes
* Return ChatCompletions instead of ChatResponseMessage for token usage.

---------

Co-authored-by: XiaoYun Zhang <[email protected]>
Co-authored-by: Xiaoyun Zhang <[email protected]>
Co-authored-by: mhensen <[email protected]>
Co-authored-by: ekzhu <[email protected]>
Co-authored-by: Krzysztof Kasprowicz <[email protected]>
Co-authored-by: luongdavid <[email protected]>
1 parent 1cd13bb commit f9169e7

3 files changed: +20 −9 lines changed


dotnet/src/AutoGen.OpenAI/Agent/OpenAIChatAgent.cs

+2 −2
@@ -84,7 +84,7 @@ public async Task<IMessage> GenerateReplyAsync(
         var settings = this.CreateChatCompletionsOptions(options, messages);
         var reply = await this.openAIClient.GetChatCompletionsAsync(settings, cancellationToken);

-        return new MessageEnvelope<ChatResponseMessage>(reply.Value.Choices.First().Message, from: this.Name);
+        return new MessageEnvelope<ChatCompletions>(reply, from: this.Name);
     }

     public Task<IAsyncEnumerable<IStreamingMessage>> GenerateStreamingReplyAsync(
@@ -101,7 +101,7 @@ private async IAsyncEnumerable<IStreamingMessage> StreamingReplyAsync(
         [EnumeratorCancellation] CancellationToken cancellationToken = default)
     {
         var settings = this.CreateChatCompletionsOptions(options, messages);
-        var response = await this.openAIClient.GetChatCompletionsStreamingAsync(settings);
+        var response = await this.openAIClient.GetChatCompletionsStreamingAsync(settings, cancellationToken);
         await foreach (var update in response.WithCancellation(cancellationToken))
         {
             if (update.ChoiceIndex > 0)
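
The net effect of this hunk is that callers now receive the full `ChatCompletions` payload, including usage metadata, instead of only the chosen message. A minimal consumption sketch, assuming an `OpenAIChatAgent` named `agent` configured as in the tests and the Azure.AI.OpenAI `ChatCompletions`/`CompletionsUsage` types (variable names here are illustrative, not from this commit):

```csharp
// Sketch: unwrapping token usage from the new reply envelope.
// Assumes `agent` is an OpenAIChatAgent constructed elsewhere.
var question = MessageEnvelope.Create(new ChatRequestUserMessage("Hello"));
var reply = await agent.SendAsync(question);

if (reply is IMessage<ChatCompletions> envelope)
{
    var message = envelope.Content.Choices.First().Message; // the assistant message, as before
    var usage = envelope.Content.Usage;                     // the metadata this commit exposes
    Console.WriteLine($"assistant: {message.Content}");
    Console.WriteLine($"tokens: prompt={usage.PromptTokens}, completion={usage.CompletionTokens}, total={usage.TotalTokens}");
}
```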

dotnet/src/AutoGen.OpenAI/Middleware/OpenAIChatRequestMessageConnector.cs

+14 −4
@@ -98,6 +98,7 @@ public IMessage PostProcessMessage(IMessage message)
             Message => message,
             AggregateMessage<ToolCallMessage, ToolCallResultMessage> => message,
             IMessage<ChatResponseMessage> m => PostProcessMessage(m),
+            IMessage<ChatCompletions> m => PostProcessMessage(m),
             _ => throw new InvalidOperationException("The type of message is not supported. Must be one of TextMessage, ImageMessage, MultiModalMessage, ToolCallMessage, ToolCallResultMessage, Message, IMessage<ChatRequestMessage>, AggregateMessage<ToolCallMessage, ToolCallResultMessage>"),
         };
     }
@@ -129,15 +130,24 @@ public IMessage PostProcessMessage(IMessage message)

     private IMessage PostProcessMessage(IMessage<ChatResponseMessage> message)
     {
-        var chatResponseMessage = message.Content;
+        return PostProcessMessage(message.Content, message.From);
+    }
+
+    private IMessage PostProcessMessage(IMessage<ChatCompletions> message)
+    {
+        return PostProcessMessage(message.Content.Choices[0].Message, message.From);
+    }
+
+    private IMessage PostProcessMessage(ChatResponseMessage chatResponseMessage, string? from)
+    {
         if (chatResponseMessage.Content is string content)
         {
-            return new TextMessage(Role.Assistant, content, message.From);
+            return new TextMessage(Role.Assistant, content, from);
         }

         if (chatResponseMessage.FunctionCall is FunctionCall functionCall)
         {
-            return new ToolCallMessage(functionCall.Name, functionCall.Arguments, message.From);
+            return new ToolCallMessage(functionCall.Name, functionCall.Arguments, from);
         }

         if (chatResponseMessage.ToolCalls.Where(tc => tc is ChatCompletionsFunctionToolCall).Any())
@@ -148,7 +158,7 @@ private IMessage PostProcessMessage(IMessage<ChatResponseMessage> message)

         var toolCalls = functionToolCalls.Select(tc => new ToolCall(tc.Name, tc.Arguments));

-        return new ToolCallMessage(toolCalls, message.From);
+        return new ToolCallMessage(toolCalls, from);
     }

     throw new InvalidOperationException("Invalid ChatResponseMessage");
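
Most downstream code should not have to pattern-match on the envelope itself: when this connector is registered as middleware, `PostProcessMessage` collapses the `ChatCompletions` reply back into a `TextMessage` or `ToolCallMessage`. A hedged wiring sketch, assuming the generic `RegisterMiddleware` extension from AutoGen.Core (names are illustrative):

```csharp
// Sketch: registering the connector so replies surface as native AutoGen messages.
// Assumes `openAIChatAgent` is constructed elsewhere; RegisterMiddleware returns a wrapped agent.
var agent = openAIChatAgent
    .RegisterMiddleware(new OpenAIChatRequestMessageConnector());

var reply = await agent.SendAsync(new TextMessage(Role.User, "Hello"));
// With the connector in place, `reply` is a TextMessage (or a ToolCallMessage when the
// model calls a function) rather than a raw MessageEnvelope<ChatCompletions>.
```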

dotnet/test/AutoGen.Tests/OpenAIChatAgentTest.cs

+4 −3
@@ -41,9 +41,10 @@ public async Task BasicConversationTestAsync()
         var chatMessageContent = MessageEnvelope.Create(new ChatRequestUserMessage("Hello"));
         var reply = await openAIChatAgent.SendAsync(chatMessageContent);

-        reply.Should().BeOfType<MessageEnvelope<ChatResponseMessage>>();
-        reply.As<MessageEnvelope<ChatResponseMessage>>().From.Should().Be("assistant");
-        reply.As<MessageEnvelope<ChatResponseMessage>>().Content.Role.Should().Be(ChatRole.Assistant);
+        reply.Should().BeOfType<MessageEnvelope<ChatCompletions>>();
+        reply.As<MessageEnvelope<ChatCompletions>>().From.Should().Be("assistant");
+        reply.As<MessageEnvelope<ChatCompletions>>().Content.Choices.First().Message.Role.Should().Be(ChatRole.Assistant);
+        reply.As<MessageEnvelope<ChatCompletions>>().Content.Usage.TotalTokens.Should().BeGreaterThan(0);

         // test streaming
         var streamingReply = await openAIChatAgent.GenerateStreamingReplyAsync(new[] { chatMessageContent });
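
The streaming fix in `OpenAIChatAgent` above also forwards the `CancellationToken` into `GetChatCompletionsStreamingAsync`, so cancelling now aborts the underlying request rather than only the local enumeration. A sketch of consuming the stream under a deadline, assuming the parameter layout implied by the signature shown in the diff:

```csharp
// Sketch: streaming with a deadline; cancellation now reaches the OpenAI client call too.
// `openAIChatAgent` and `chatMessageContent` are assumed to be set up as in the test above.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));

var stream = await openAIChatAgent.GenerateStreamingReplyAsync(
    new[] { chatMessageContent },
    cancellationToken: cts.Token);

await foreach (var update in stream.WithCancellation(cts.Token))
{
    Console.WriteLine(update); // each update is an IStreamingMessage chunk
}
```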
