[.Net] Fix 2652 && 2650 #2655

Merged: 1 commit, May 10, 2024
6 changes: 6 additions & 0 deletions dotnet/website/articles/Installation.md
@@ -4,6 +4,12 @@

AutoGen.Net provides the following packages; you can install one or more of them based on your needs:

> [!Note]
> The `AutoGen.DotnetInteractive` package depends on `Microsoft.DotNet.Interactive.VisualStudio`, which is not available on nuget.org. To restore this dependency, add the following package source to your project:
> ```bash
> https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json
> ```
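The package source above can be registered in a `NuGet.config` file next to your project or solution; this is a minimal sketch, and the `key` names are arbitrary labels, not required values:

```xml
<!-- NuGet.config: keeps nuget.org and adds the dotnet-tools feed -->
<configuration>
  <packageSources>
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <add key="dotnet-tools" value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json" />
  </packageSources>
</configuration>
```

Alternatively, `dotnet nuget add source <url> --name dotnet-tools` registers the same feed from the command line.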

- `AutoGen`: The all-in-one package. It depends on `AutoGen.Core`, `AutoGen.OpenAI`, `AutoGen.LMStudio`, `AutoGen.SemanticKernel` and `AutoGen.SourceGenerator`.
- `AutoGen.Core`: The core package; it provides the abstractions for message types, agents and group chat.
- `AutoGen.OpenAI`: Provides integration agents for OpenAI models.
@@ -4,7 +4,7 @@ The following example shows how to create a `MistralAITokenCounterMiddleware` @A
To collect the token usage for the entire chat session, one easy solution is to collect all the responses from the agent and sum the token usage of each response. To collect the responses, we can create a middleware that saves every response to a list and register it with the agent. Because this example uses @AutoGen.Mistral.MistralClientAgent, the token usage for each response can be read directly from the response object.

> [!NOTE]
- > You can find the complete example in the [Example13_OpenAIAgent_JsonMode](https://github.com/microsoft/autogen/tree/dotnet/dotnet/sample/AutoGen.BasicSamples/Example14_MistralClientAgent_TokenCount.cs).
+ > You can find the complete example in [Example14_MistralClientAgent_TokenCount](https://github.com/microsoft/autogen/tree/main/dotnet/sample/AutoGen.BasicSamples/Example14_MistralClientAgent_TokenCount.cs).

- Step 1: Adding using statement
[!code-csharp[](../../sample/AutoGen.BasicSamples/Example14_MistralClientAgent_TokenCount.cs?name=using_statements)]
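The collect-and-sum idea described above can be sketched in isolation. This is a hypothetical stand-in, not the AutoGen.Mistral API: `Response` below is an invented record that plays the role of the real response type, which carries the usage information.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical stand-in for the real response type; in the actual
// example the token count comes from the AutoGen.Mistral response object.
public record Response(string Text, int TotalTokens);

public class TokenCounter
{
    private readonly List<Response> _responses = new();

    // The middleware's "save every response to a list" step.
    public void Record(Response response) => _responses.Add(response);

    // Sum the token usage over the whole chat session.
    public int TotalTokenCount => _responses.Sum(r => r.TotalTokens);
}

public static class Demo
{
    public static void Main()
    {
        var counter = new TokenCounter();
        counter.Record(new Response("hello", 12));
        counter.Record(new Response("world", 30));
        Console.WriteLine(counter.TotalTokenCount); // prints 42
    }
}
```

In the real example this bookkeeping lives inside a middleware registered with the agent, so every reply is recorded without changing the calling code.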
@@ -1,7 +1,7 @@
The following example shows how to connect to a third-party OpenAI-compatible API using @AutoGen.OpenAI.OpenAIChatAgent.

> [!NOTE]
- > You can find the complete code of this example in [Example16_OpenAIChatAgent_ConnectToThirdPartyBackend](https://github.com/microsoft/autogen/tree/dotnet/dotnet/sample/AutoGen.BasicSamples/Example16_OpenAIChatAgent_ConnectToThirdPartyBackend.cs).
+ > You can find the complete code of this example in [Example16_OpenAIChatAgent_ConnectToThirdPartyBackend](https://github.com/microsoft/autogen/tree/main/dotnet/sample/AutoGen.BasicSamples/Example16_OpenAIChatAgent_ConnectToThirdPartyBackend.cs).

## Overview
Many LLM applications and platforms, such as LM Studio, Ollama, and Mistral, can spin up a chat server that is compatible with the OpenAI API. This means you can connect to these servers using @AutoGen.OpenAI.OpenAIChatAgent.
2 changes: 1 addition & 1 deletion dotnet/website/articles/OpenAIChatAgent-use-json-mode.md
@@ -9,7 +9,7 @@ JSON mode is a new feature in OpenAI which allows you to instruct model to alway
## How to enable JSON mode in OpenAIChatAgent.

> [!NOTE]
- > You can find the complete example in the [Example13_OpenAIAgent_JsonMode](https://github.com/microsoft/autogen/tree/dotnet/dotnet/sample/AutoGen.BasicSamples/Example13_OpenAIAgent_JsonMode.cs).
+ > You can find the complete example in the [Example13_OpenAIAgent_JsonMode](https://github.com/microsoft/autogen/tree/main/dotnet/sample/AutoGen.BasicSamples/Example13_OpenAIAgent_JsonMode.cs).

To enable JSON mode for @AutoGen.OpenAI.OpenAIChatAgent, set `responseFormat` to `ChatCompletionsResponseFormat.JsonObject` when creating the agent. Note that when enabling JSON mode, you also need to instruct the agent to output JSON format in its system message.

6 changes: 6 additions & 0 deletions dotnet/website/articles/Run-dotnet-code.md
@@ -19,6 +19,12 @@ For example, in a data analysis scenario, an agent can resolve tasks like "What is th
## How to run dotnet code snippets?
The built-in support for running dotnet code snippets is provided by [dotnet-interactive](https://github.com/dotnet/interactive). To run a dotnet code snippet, install the following package in your project, which provides the integration with dotnet-interactive:

> [!Note]
> The `AutoGen.DotnetInteractive` package depends on `Microsoft.DotNet.Interactive.VisualStudio`, which is not available on nuget.org. To restore this dependency, add the following package source to your project:
> ```bash
> https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-tools/nuget/v3/index.json
> ```

```xml
<PackageReference Include="AutoGen.DotnetInteractive" />
```